I'm building an HTTP proxy in Node.js. When an incoming request meets certain conditions, a long-running job is executed. While that job runs, every subsequent request has to wait for it to finish, because of Node's single-threaded event loop:
function proxy(request, response) {
  if (isSpecial(request)) {
    // Long running job
  }
  // Proxy request
}
This is not good.
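To make the problem concrete, here's a minimal sketch of the blocking behavior, where heavyCompute() is a hypothetical stand-in for the long-running job: while it runs, nothing else on the event loop can execute, not even a zero-delay timer.

```javascript
// Hypothetical stand-in for the long-running job: a CPU-bound loop.
function heavyCompute() {
  var sum = 0;
  for (var i = 0; i < 1e8; i++) sum += i;
  return sum;
}

// This timer cannot fire until heavyCompute() has returned, because the
// event loop is busy running the synchronous loop above.
setTimeout(function () {
  console.log('I only run after heavyCompute() returns');
}, 0);

var start = Date.now();
heavyCompute();
console.log('blocked for ' + (Date.now() - start) + ' ms');
```

The same stall hits every pending proxy request, which is exactly the issue described above.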
So let's say the long-running job can be implemented in Java. For this purpose I build a Java server application that executes the long-running job in a separate thread every time the Node application makes a request.
So, when the conditions are met, Node.js opens a connection (TCP, HTTP, whatever) to the Java server. The Java server spawns a new thread for the request, executes the long-running job in that thread, and sends back, say, a JSON response (could be binary, whatever) that Node can handle easily and asynchronously:
var javaServer = initJavaServer(); // pseudo-code

function proxy(request, response) {
  if (isSpecial(request)) {
    var jobResponse = '';
    javaServer.request( ... );
    javaServer.addListener("data", function(chunk) {
      // Accumulate the response
      jobResponse += chunk;
    });
    javaServer.addListener("end", function() {
      doProxy(jobResponse, request, response);
    });
  } else {
    doProxy(null, request, response);
  }
}
This way, I can execute long-running jobs for the requests that meet the conditions without blocking the whole Node application.
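The data/end handling above can be factored into a small helper. A sketch, assuming the Java service replies with a single JSON document (collectJson is a name I'm making up here, not an existing API):

```javascript
// Buffer every "data" chunk, then parse the whole body once "end" fires.
// Works with any stream or emitter that follows the data/end/error
// convention, such as an http.IncomingMessage.
function collectJson(stream, callback) {
  var body = '';
  stream.on('data', function (chunk) { body += chunk; });
  stream.on('end', function () {
    try {
      callback(null, JSON.parse(body));
    } catch (err) {
      callback(err); // malformed JSON from the job server
    }
  });
  stream.on('error', callback);
}
```

Inside proxy() you would then write collectJson(javaServer, function (err, jobResponse) { ... doProxy(jobResponse, request, response); }), which keeps the error path in one place.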
So here the requirements are:
- Speed
- Scalability of both apps (the node proxy runs on a cluster, and the Java app on another one)
Maybe a message broker like RabbitMQ could help (Node pushes messages, Java subscribes to them and pushes responses back).
Thoughts?