
I am new to node.js and am using it for a single-page app. I occasionally have to do some very complex data manipulation (AI and statistical functions), and these services are written in Python. I am trying to figure out the best way to hand these computations off to a child process, so that my main thread is not blocked and I can use the abundance of Python libraries available.

I need to be able to pass complex data (a JSON object is fine) to the child process and receive complex output (again, a JSON object is ideal). Referring to "Node.js math computation - child process and complex data?", this is simple enough with another node.js process using child_process.fork() and .send().

Is there a similar method in child_process.spawn()? Or is it possible to use .fork() to execute a Python script? All I can figure out is how to execute system commands, which only allow the simplest of arguments. Even if I can send the data, I am unclear how to receive data using .spawn().

I couldn't find this answer on SO, but I admit node.js is new to me.

Thank you.

Apollo

2 Answers


The spawn method returns a ChildProcess object, which gives you access to the stdin and stdout of the process.

It would look something like this:

var child = require('child_process').spawn('python', ['script.py']);
child.stdout.on('data', function (data) {
    var response = JSON.parse(data);
    console.log(response);
});
child.stdin.write(JSON.stringify(data));
child.stdin.end(); // close stdin so the script knows the input is complete

Note that a single 'data' event may contain only part of the output if the response is large; for anything non-trivial, buffer the chunks and parse once the process exits.
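The Python side is not shown above. A minimal sketch of what script.py could look like, assuming the JSON-over-stdin/stdout protocol described here (the analyze function and its field names are placeholders, not part of the original answer):

```python
# script.py - hypothetical companion to the spawn() example above
import json
import sys

def analyze(payload):
    # Placeholder for the real AI/statistical work.
    values = payload.get("values", [])
    mean = sum(values) / len(values) if values else None
    return {"count": len(values), "mean": mean}

if __name__ == "__main__":
    raw = sys.stdin.read()  # blocks until node calls child.stdin.end()
    if raw:
        sys.stdout.write(json.dumps(analyze(json.loads(raw))))
        sys.stdout.flush()
```

From node, child.stdin.write(JSON.stringify(data)) followed by child.stdin.end() delivers the payload, and the 'data' handler above receives the JSON reply.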
Sean Fujiwara

As reptilicus noted, ZeroMQ is an option for integrating node.js and Python.

My SO answer contains an example of a simple client/server application communicating over ZeroMQ. The node.js client side would be very similar: a zeromq binding for node.js exists, and the node.js code is comparable in complexity (or simplicity) to the Python code. (Note that the link does not provide the exact solution as in Python; it just illustrates how simple such a task is.)
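To give a sense of scale, a ZeroMQ reply server in Python can be this small. This is a sketch assuming pyzmq is installed; the compute function and the endpoint are illustrative, not taken from the linked answer:

```python
import json

def compute(request):
    # Stand-in for the heavy statistical routine.
    return {"sum": sum(request["values"])}

def serve(endpoint="tcp://127.0.0.1:5555"):
    import zmq  # requires pyzmq (pip install pyzmq)
    sock = zmq.Context().socket(zmq.REP)
    sock.bind(endpoint)
    while True:
        request = json.loads(sock.recv())                # JSON request from node
        sock.send_string(json.dumps(compute(request)))   # JSON reply back
```

On the node side, a REQ socket from the zeromq npm package would send JSON.stringify(data) and parse the reply, so the heavy computation never blocks the node event loop.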

Jan Vlcinsky
  • I've never heard of this. It seems interesting, similar to Apache Thrift if I get it at first glance. Is it beneficial to use when both processes are on the same machine? – Apollo May 07 '14 at 20:18
  • @Apollo Yes, it integrates very well. On the same machine you can even use unix sockets; for different machines, TCP. If you have ever tried the various IPython clients (console, GUI, web), you may have wondered how they are integrated. The answer is zeromq. – Jan Vlcinsky May 07 '14 at 20:19
  • I appreciate this, but I feel like it may add more complexity to my project than I want to deal with right now. It's one of those "build for the future or deadline" paradoxes. – Apollo May 07 '14 at 20:33
  • @Apollo I understand your view. But I can assure you, there are only two groups of programmers: those who have not tried zeromq yet, and those who use it often, because it makes interconnecting so easy. You will see once you try. Personally, I would be concerned about integrating by spawning processes, which is much trickier and has an order of magnitude worse performance. – Jan Vlcinsky May 07 '14 at 20:46
  • I'll definitely give it a try. – Apollo May 07 '14 at 20:49