
HTTP Async Example

This example shows how to use the support for non-blocking asynchronous producers introduced in Camel 2.1 via ToAsync.

Currently camel-jetty implements this most completely, as its JettyHttpProducer supports non-blocking request/reply natively in Jetty.

This example shows a client and a server in action. The client sends 100 messages to the server over HTTP, which the server processes before returning a reply.

The client uses a single thread to route the messages up to the point where they are sent to the HTTP server. Because we use non-blocking asynchronous request/reply, this single thread finishes its current task and is immediately ready to route the next message. This gives higher throughput: the single thread can go as fast as it can, as it does not have to wait for the HTTP server to reply (i.e. it is not blocking).

You can see the difference if you change async="true" to async="false" in the src/main/resources/META-INF/spring/camel-client.xml file.
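The relevant part of camel-client.xml might look roughly like this. Only the async and poolSize options are taken from this page; the endpoint URI and the surrounding route are illustrative, and the real file ships with the example:

```xml
<camelContext xmlns="http://camel.apache.org/schema/spring">
  <route>
    <!-- hypothetical client route; see the example's own camel-client.xml -->
    <from uri="direct:start"/>
    <!-- async="true" enables the non-blocking producer;
         poolSize sets the number of reply-handler threads -->
    <to uri="jetty://http://localhost:9080/myserver" async="true" poolSize="10"/>
  </route>
</camelContext>
```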


You will need to compile this example first:
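From the example's directory, assuming the standard Maven layout used by the Camel examples:

```shell
mvn compile
```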

The example should run if you type:
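Assuming the example uses the camel-maven-plugin like most Camel examples (check the example's README for the exact goal):

```shell
mvn camel:run
```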

To stop the server, press Ctrl+C.

Sample output

When the client is running, it outputs all requests and responses on the screen.

As the client is single threaded, it sends the messages in order, e.g. from 0 to 99.

As the HTTP server simulates some processing time for each message, its replies will likely arrive after the client has sent all 100 messages. When they arrive, they come back out of order.

As you can see, the replies are handled by different threads, as configured with the poolSize=10 option.
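The behaviour described above can be sketched with plain JDK concurrency. This is an illustration of the pattern, not the Camel implementation; the class name AsyncReplyDemo and the 10-50 ms delays are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadLocalRandom;

public class AsyncReplyDemo {
    // Collected reply ids, and the time the client thread spent submitting.
    static final ConcurrentLinkedQueue<Integer> replies = new ConcurrentLinkedQueue<>();
    static long submitMillis;

    public static void main(String[] args) {
        // A pool of 10 reply-handler threads, mirroring poolSize=10.
        ExecutorService pool = Executors.newFixedThreadPool(10);
        List<CompletableFuture<Void>> pending = new ArrayList<>();

        long start = System.nanoTime();
        for (int i = 0; i < 100; i++) {
            final int id = i;
            // runAsync returns immediately: the single client thread never
            // waits for the simulated server, just like the async producer.
            pending.add(CompletableFuture.runAsync(() -> {
                try {
                    // Simulated server processing time (10-50 ms).
                    Thread.sleep(ThreadLocalRandom.current().nextInt(10, 50));
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                replies.add(id);
            }, pool));
        }
        submitMillis = (System.nanoTime() - start) / 1_000_000;

        // Wait for all replies, then shut the pool down.
        CompletableFuture.allOf(pending.toArray(new CompletableFuture[0])).join();
        pool.shutdown();

        System.out.println("Submitting 100 requests took " + submitMillis + " ms");
        System.out.println("Replies arrived in this order: " + replies);
    }
}
```

Submission finishes in a few milliseconds even though each "request" takes 10-50 ms to process, and the reply order printed at the end is typically scrambled.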

Running synchronous

If, on the other hand, we change to synchronous mode, the single thread is used for the entire routing and is blocked while waiting for the reply from the HTTP server. To see this in action, change async="true" to async="false".

The output is then, as expected, a request followed by its reply, and so forth. And of course the throughput is much lower, as we only handle a single message at a time and block while waiting for the HTTP server reply.
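For contrast, a minimal sketch of the blocking behaviour (again plain JDK, with a hypothetical class name and a simulated 10 ms processing time): the same 100 messages now cost at least 100 times the per-message processing time, and the replies come back strictly in order:

```java
import java.util.ArrayList;
import java.util.List;

public class SyncReplyDemo {
    static final List<Integer> replies = new ArrayList<>();
    static long totalMillis;

    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();
        for (int i = 0; i < 100; i++) {
            // The single client thread waits in-line for each simulated
            // 10 ms of server processing before sending the next message.
            Thread.sleep(10);
            replies.add(i); // replies therefore arrive strictly in order
        }
        totalMillis = (System.nanoTime() - start) / 1_000_000;
        System.out.println("100 blocking request/replies took " + totalMillis + " ms");
    }
}
```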

See Also
