
The issue is:

  • Let's assume we have two Node.js processes running: example1.js and example2.js.

  • In example1.js there is a function func1(input) which returns result1.

  • Is there a way, from within example2.js, to call func1(input) and obtain result1?

From what I've learned about Node.js, I have found only one solution, which uses sockets for communication. That is less than ideal, however, because it requires one process to listen on a port, which I would like to avoid if possible.


EDIT: To answer some of the questions: in the process hierarchy, example1.js cannot be a child process of example2.js; if anything, it would be the other way around. Also, if it helps, there can be only one example1.js processing its own data, and many example2.js processes, each processing its own data plus data from the first process.

Alexey Kamenskiy

3 Answers


The use case you describe makes me think of dnode, which lets you easily expose functions to be called by other processes. It coordinates the calls over network sockets (and socket.io, so you can use the same mechanism in the browser).
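
As a rough sketch of that idea, in the style of dnode's older callback API (the function name func1 and port 5004 are just placeholders, not anything from your code):

// example1.js -- exposes func1 over dnode
var dnode = require('dnode');

var server = dnode({
  func1: function (input, cb) {
    cb('result1 for ' + input); // hand result1 back to the caller
  }
});
server.listen(5004);

// example2.js -- calls func1 remotely
var dnode = require('dnode');

var d = dnode.connect(5004);
d.on('remote', function (remote) {
  remote.func1('some input', function (result1) {
    console.log(result1);
    d.end();
  });
});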

Another approach would be to use a message queue; there are good Node.js bindings for many different message queues.

The simplest way, to my knowledge, is to use child_process.fork():

This is a special case of the spawn() functionality for spawning Node processes. In addition to having all the methods in a normal ChildProcess instance, the returned object has a communication channel built-in. The channel is written to with child.send(message, [sendHandle]) and messages are received by a 'message' event on the child.

So, for your example, you could have example2.js:

var fork = require('child_process').fork;

// Fork example1.js; the returned ChildProcess has a built-in IPC channel.
var example1 = fork(__dirname + '/example1.js');

// Receive whatever example1.js sends back over that channel.
example1.on('message', function(response) {
  console.log(response); // => 'Hello World'
});

// Send the input for func() over to example1.js.
example1.send('World');

And example1.js:

function func(input) {
  // process.send() exists because this script was started via fork().
  process.send('Hello ' + input);
}

// Handle messages sent from the parent (example2.js).
process.on('message', function(m) {
  func(m);
});
Linus Thiel
  • I'm afraid that isn't the case here. Both processes should be completely independent from one another, and in terms of hierarchy I'd say that the process which calls a function in the other would rather be the child process. Or am I mistaken here? – Alexey Kamenskiy Apr 18 '12 at 16:33
  • If you need independent processes talking to each other, I recommend a message queue. I like to use redis myself, but there are [good bindings for many message queues](https://github.com/joyent/node/wiki/modules#wiki-message-queue). – Linus Thiel Apr 18 '12 at 16:35
  • The thing is that I want to avoid any intermediaries if possible (due to performance), whereas with a message queue I would have to use one. In the original task there will be only one 'example1.js' (which will process its own data) and many 'example2.js' processes which will process their own data and use data from 'example1.js'. – Alexey Kamenskiy Apr 18 '12 at 16:43
  • In that case, I'm afraid I can't help you out. [IPC](http://en.wikipedia.org/wiki/Inter-process_communication) always comes with some overhead, since data must be copied between processes somehow. Perhaps network sockets are worse for performance than other approaches -- I honestly don't know. – Linus Thiel Apr 18 '12 at 16:55
  • That is exactly why I asked this question. I am sure it is possible with sockets and intermediaries, but I am looking for the best solution from a performance perspective, since both the data volume and the frequency of calls can be huge here. – Alexey Kamenskiy Apr 18 '12 at 17:06
  • In that case, if I were you, I would do a test with Unix sockets and if that didn't work, I would either not use Node.js or write a C++ module which uses shared memory and locking (which I really wouldn't, since I'm afraid of threading/locking ;-P ). It sounds like your performance requirements are such that Node.js is not a good fit, as it's very focused on parallel, mostly independent, processes. – Linus Thiel Apr 18 '12 at 17:17
  • I see your edit to the question. In that case, why not make example1.js `fork()` multiple example2.js, and use the same pattern I suggested initially? – Linus Thiel Apr 18 '12 at 17:22
  • I am not sure -- can a child process initiate a message to the parent? – Alexey Kamenskiy Apr 18 '12 at 17:26
  • This is exactly what IPC exists for. You're asking how to do inter-process communication without an IPC mechanism or an intermediary. If processes were allowed to communicate with the data in other non-child processes without a middle-man, you would have huge security issues. I agree you should either use a message queue or redis. – Joseph Yaduvanshi Apr 18 '12 at 17:41
  • @AlexKey: I'm pretty sure I linked to the documentation ;) -- but, yeah. Both processes can send messages. – Linus Thiel Apr 18 '12 at 17:59
  • That's definitely the way to go if processes are nested! – Joseph Yaduvanshi Apr 18 '12 at 19:53
  • Hm. I get a weird encoding error when I try to use fork, but not with spawn. Amazing. – LJD Jul 01 '19 at 00:35
  • !DNODE IS DEPRECATED! Do not use this library, as it is no longer maintained and does not work, throwing an error `Error: Could not locate the bindings file` from the weak module. – VityaSchel Sep 17 '21 at 06:38

Maybe you should try Messenger.js. It can do IPC in a handy way.

You don't have to handle the communication between the two processes yourself.

Kaicui

Use Redis as a message bus/broker.

https://redis.io/topics/pubsub

You can also use socket messaging such as ZeroMQ, which is point-to-point / peer-to-peer, instead of using a message broker like Redis.

How does this work?

With Redis, each of your Node.js applications has two Redis clients doing pub/sub: a publisher client and a subscriber client (yes, you need two clients per Node process for Redis pub/sub, since a client in subscriber mode cannot issue other commands).
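
A minimal sketch of that setup, using the classic callback-style node_redis API (newer versions of the redis package use a promise-based API with a different subscribe signature); the channel name 'example1:results' is just a made-up placeholder:

var redis = require('redis');

// Each process keeps two clients: one dedicated to subscribing, one for publishing.
var sub = redis.createClient();
var pub = redis.createClient();

// e.g. in example2.js: listen for results coming from example1.js
sub.on('message', function (channel, message) {
  console.log('got from ' + channel + ':', JSON.parse(message));
});
sub.subscribe('example1:results');

// e.g. in example1.js: publish result1 whenever func1() produces it
pub.publish('example1:results', JSON.stringify({ result1: 42 }));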

With ZeroMQ, you can send messages over IPC channels directly between Node.js processes (no broker involved, except perhaps the OS itself).
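
A rough sketch of that, assuming the older callback-style zeromq (v5) bindings (v6+ has a different, promise-based API); the socket path ipc:///tmp/example1.ipc and the messages are placeholders:

var zmq = require('zeromq');

// In example1.js: answer requests over a Unix-domain IPC socket.
var rep = zmq.socket('rep');
rep.bindSync('ipc:///tmp/example1.ipc');
rep.on('message', function (input) {
  rep.send('Hello ' + input.toString()); // i.e. result1 = func1(input)
});

// In example2.js: call into example1.js and wait for the reply.
var req = zmq.socket('req');
req.connect('ipc:///tmp/example1.ipc');
req.on('message', function (result1) {
  console.log(result1.toString());
});
req.send('World');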

Alexander Mills
  • The only problem with pub/sub for me is that if server A wants to send something to server B, all N servers get the message. What is the solution there? – Muhammad Umer Oct 21 '18 at 19:35
  • The only problem is that I have to do JSON.stringify and JSON.parse. I know Redis has no concept of objects, but I have data coming in literally every millisecond; wouldn't JSON.parse and JSON.stringify slow things down big time? – PirateApp Nov 28 '18 at 15:20
  • @Muhammad Umer no, just the Redis server is hit, and Redis forwards the message to A or B or C, but not all three. Redis only forwards the message to those that have subscribed to a certain channel. – Alexander Mills Nov 28 '18 at 20:38