
HTTP Async Example

This example shows how to use the support for non-blocking asynchronous producers via ToAsync, introduced in Camel 2.1.

Currently camel-jetty implements this most completely, as its JettyHttpProducer supports non-blocking request/reply natively in Jetty.

This example shows a client and a server in action. The client sends 100 messages over HTTP to the server, which processes each message and returns a reply.

The client uses a single thread to route the messages up to the point where they are sent to the HTTP server. Because we use non-blocking asynchronous request/reply, this single thread finishes its current task and is immediately ready to route the next message. This gives us higher throughput: the single thread can go as fast as it can, since it does not have to wait for the HTTP server to reply (i.e. it is not blocking).
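To make the non-blocking idea concrete outside of Camel, here is a minimal plain-Java sketch (this is not Camel code; the class and method names are invented for illustration): one caller thread hands all requests to a small thread pool without waiting, and only afterwards collects the replies.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncReplyDemo {

    // One caller thread submits 'count' requests; a pool of 'poolSize'
    // threads produces the replies, mimicking the async request/reply above.
    static List<String> sendAll(int count, int poolSize) {
        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        List<CompletableFuture<String>> pending = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            final int n = i;
            // supplyAsync returns immediately, so the caller thread is free
            // to submit the next request without waiting for this reply
            pending.add(CompletableFuture.supplyAsync(() -> "Bye Message " + n, pool));
        }
        // Only now do we wait; joining in submission order keeps the result
        // list ordered even though replies may complete out of order
        List<String> replies = new ArrayList<>();
        for (CompletableFuture<String> f : pending) {
            replies.add(f.join());
        }
        pool.shutdown();
        return replies;
    }

    public static void main(String[] args) {
        List<String> replies = sendAll(100, 10);
        System.out.println("Got " + replies.size() + " replies");
    }
}
```

The key point is that submitting a request and receiving its reply happen on different threads, which is exactly what lets the single client thread keep routing.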

You can see the difference if you change the async=true option to async=false in the src/main/resources/META-INF/spring/camel-client.xml file.
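The exact contents of camel-client.xml are not reproduced here, but going by the options the text mentions (async and poolSize) and the jetty endpoint visible in the log output, the relevant part of the client route plausibly looks something like this sketch (the attribute names and surrounding elements are assumptions, not a verbatim copy of the file):

```xml
<!-- sketch only: based on the async/poolSize options described above -->
<route>
  <from uri="direct:start"/>
  <!-- async="true" enables the non-blocking producer;
       poolSize="10" sets the number of reply-handling threads -->
  <to uri="jetty://http://0.0.0.0:9123/myapp" async="true" poolSize="10"/>
</route>
```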

Running

You will need to compile this example first:

Code Block
  mvn compile

The example should run if you type:

Code Block
  mvn exec:java -PCamelServer

  mvn exec:java -PCamelClient

To stop the server, press Ctrl+C.

Sample output

When the client is running it outputs all requests and responses on the screen.

As the client is single threaded, it sends the messages in order, i.e. from 0 to 99.

Code Block
[          org.apache.camel.example.client.CamelClient.main()] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 0]
[          org.apache.camel.example.client.CamelClient.main()] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 1]
[          org.apache.camel.example.client.CamelClient.main()] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 2]
[          org.apache.camel.example.client.CamelClient.main()] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 3]
[          org.apache.camel.example.client.CamelClient.main()] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 4]
[          org.apache.camel.example.client.CamelClient.main()] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 5]
[          org.apache.camel.example.client.CamelClient.main()] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 6]
...

As the HTTP server simulates some processing time for each message, its replies will most likely arrive only after the client has sent all 100 messages. When they arrive, they come back out of order:

Code Block
[  Camel thread 2: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 7]
[  Camel thread 3: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 27]
[  Camel thread 1: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 2]
[  Camel thread 6: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 5]
[  Camel thread 4: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 3]
[  Camel thread 7: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 28]
[  Camel thread 5: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 24]
[  Camel thread 8: ToAsync[jetty://http://0.0.0.0:9123/myapp]] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 9]

And as you can see, the replies are handled by different threads, as configured with the poolSize=10 option.

Running synchronous

If, on the other hand, we change to synchronous mode, the single thread is used for the entire routing and is blocked while waiting for the reply from the HTTP server. To see this in action, change async="true" to async="false".

The output is then, as expected, a request followed by its reply, and so forth. And of course the throughput is much lower, as we handle only a single message at a time and are blocked while waiting for the HTTP server's reply.

Code Block
[                                                        main] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 4]
[                                                        main] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 4]
[                                                        main] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 5]
[                                                        main] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 5]
[                                                        main] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 6]
[                                                        main] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 6]
[                                                        main] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 7]
[                                                        main] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 7]
[                                                        main] +++ request +++      INFO  Exchange[BodyType:String, Body:Message 8]
[                                                        main] +++  reply  +++      INFO  Exchange[BodyType:String, Body:Bye Message 8]
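The blocking behaviour can be sketched in plain Java as well (again not Camel code; the names are invented for illustration): a single loop in which each call must return before the next request goes out, so requests and replies strictly alternate and stay in order.

```java
import java.util.ArrayList;
import java.util.List;

public class SyncReplyDemo {

    // Stand-in for one blocking HTTP request/reply round trip
    static String call(int n) {
        return "Bye Message " + n;
    }

    public static void main(String[] args) {
        List<String> replies = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            // The single thread blocks here until the reply is back,
            // so replies arrive strictly in request order
            replies.add(call(i));
        }
        System.out.println("Got " + replies.size() + " replies, in order");
    }
}
```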

See Also