CXF Async Example
This example shows how to use support for non-blocking asynchronous producers via toAsync, a feature new in Camel 2.1.
Currently camel-jetty implements this most completely, as its JettyHttpProducer supports non-blocking request/reply natively in Jetty. In cases where the producer does not support it natively, Camel core falls back to simulating non-blocking request/reply by handing the send over to another thread pool. This ensures the original thread does not block, so the exchange still behaves as an asynchronous request/reply.
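The fallback behaviour can be sketched in plain Java. This is only an illustration of the idea, not Camel's actual implementation (the class and method names here are made up): a blocking send is handed off to a separate thread pool, so the calling thread returns immediately while the request is in flight.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SimulatedAsyncProducer {

    // Stand-in for a producer that only supports blocking request/reply,
    // e.g. a synchronous HTTP call.
    static String blockingSend(String request) {
        try {
            Thread.sleep(50); // simulate network latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "reply-to-" + request;
    }

    public static void main(String[] args) throws Exception {
        // The pool that takes over the blocking send, keeping the caller free.
        ExecutorService pool = Executors.newFixedThreadPool(10);

        // Hand the request to the pool and return immediately; from the
        // caller's point of view the producer now behaves asynchronously.
        CompletableFuture<String> future =
                CompletableFuture.supplyAsync(() -> blockingSend("message-0"), pool);

        System.out.println("caller thread is free while the request is in flight");
        System.out.println(future.get()); // prints: reply-to-message-0

        pool.shutdown();
    }
}
```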
This example shows a client and a server in action. The client sends 100 web service messages to the server over CXF, which the server processes and replies to.
The client uses a single thread to route the messages up to the point where they are sent to the web server. Because we use non-blocking asynchronous request/reply, this single thread completes its current task and is immediately ready to route the next message. This gives us higher throughput: the single thread can go as fast as it can, as it does not have to wait for the web server to reply (i.e. it does not block).
You can see the difference if you change the async=true option to async=false in the route configuration.
You will need to compile this example first:
The example should run if you type:
To stop the server press Ctrl+C.
When the client is running it outputs all requests and responses on the screen.
As the client is single threaded it will send the messages in order, e.g. from 0 to 99.
As the HTTP server simulates some processing time per message, its replies will likely arrive only after the client has sent all 100 messages. When they arrive, they come back out of order.
And as you can see, the replies are handled by different threads, according to the thread pool we have configured.
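The out-of-order, multi-threaded reply handling can be reproduced with a small plain-Java sketch (illustrative names, not the example's actual code): a single loop hands each message to a thread pool, and replies complete in whatever order the simulated processing delays allow, each on a pool thread.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadLocalRandom;

public class OutOfOrderReplies {

    // Send `messages` requests from one loop; collect reply ids in arrival order.
    static List<Integer> run(int messages, int threads) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Integer> replyOrder = Collections.synchronizedList(new ArrayList<>());
        CountDownLatch done = new CountDownLatch(messages);

        for (int i = 0; i < messages; i++) { // requests go out in order 0..n-1
            final int id = i;
            pool.submit(() -> {
                try {
                    // Simulate the server taking a varying time per message.
                    Thread.sleep(ThreadLocalRandom.current().nextInt(5, 50));
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                // Replies are recorded in completion order, on pool threads.
                replyOrder.add(id);
                done.countDown();
            });
        }
        done.await();
        pool.shutdown();
        return replyOrder;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("reply order: " + run(10, 5)); // usually not 0..9
    }
}
```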
If we on the other hand change to synchronous mode, the single thread is used for the entire routing and is blocked while waiting for the reply from the web server. To see this in action, change the async option to false.
The output then alternates as expected: request, reply, and so forth. And of course the throughput is much lower, as we handle only a single message at a time and block while waiting for the web server's reply.
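For contrast, the synchronous mode can be sketched in the same style (again illustrative, not the example's actual code): one thread and a blocking send, so the output strictly alternates request/reply and the total time is the sum of all round trips.

```java
public class SynchronousClient {

    // Blocking send: the caller waits out the full round trip.
    static String send(String request) throws InterruptedException {
        Thread.sleep(20); // simulate the web server's processing time
        return "reply-to-" + request;
    }

    public static void main(String[] args) throws Exception {
        // A single thread handles one message at a time: request i+1 cannot
        // be sent before the reply to request i has arrived.
        for (int i = 0; i < 3; i++) {
            System.out.println("request " + i);
            System.out.println(send("message-" + i));
        }
    }
}
```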