
I have a root server running with several node.js projects on it. They are supposed to run separately in their own processes and directories. Consider this file structure:

/home
+-- /node
    +-- /someProject      | www.some-project.com
    |   +-- index.js
    |   +-- anotherFile.img
    |   +-- ...
    +-- /anotherProject   | www.another-project.com
    |   +-- /stuff
    |   +-- index.js
    |   +-- ...
    +-- /myWebsite        | www.my-website.com
    |   +-- /static
    |   +-- index.js
    |   +-- ...
    +-- ...               | ...

Each index.js should be started as an individual process with its cwd set to its parent folder (someProject, anotherProject, etc.).

Think of vHosts: each project starts a webserver which listens on its own domain. And there's the problem. Only one script can start, since they all try to bind to port 80. I dug into the node.js API and looked for a possible solution: child_process.fork().

Sadly, this doesn't work very well. When I try to send a server instance to the master process (to emit a request on it later) or an object consisting of request and response from the master to the child, I get errors. This is because node.js internally converts these advanced objects to a JSON string and then reconverts them to their original form. This makes all the objects lose their references and functionality.

Second approach: child.js

var http = require("http");

var server = http.createServer(function(req, res) {
    // stuff...
});
server.listen(80);

process.send(server); // Nope

First approach: master.js

var http = require("http"),
    cp = require("child_process");

var child = cp.fork("/home/node/someProject/index.js", [], { cwd: "/home/node/someProject" });

var router = http.createServer(function(req, res) {
    // domaincheck, etc...
    child.send({ request: req, response: res }); // Nope
});
router.listen(80);

So this is a dead end. But, hey! Node.js offers handles, which can be sent to other processes. Here's an example from the documentation:

master.js

var server = require('net').createServer();
var child = require('child_process').fork(__dirname + '/child.js');
// Open up the server object and send the handle.
server.listen(1337, function() {
  child.send({ server: true }, server._handle);
});

child.js

process.on('message', function(m, serverHandle) {
  if (serverHandle) {
    var server = require('net').createServer();
    server.listen(serverHandle);
  }
});

Here the child listens directly on the master's server, so there is no domain check in between. So here's a dead end, too.

I also thought about the cluster module, but it uses the same technology as the handles and therefore has the same limitations.

So... are there any good ideas?

What I currently do is rather hack-ish. I've made a package called distroy. It binds to port 80 and internally proxies all requests to Unix domain socket paths like /tmp/distroy/http/www.example.com, on which the separate apps listen. This also (kind of) works for HTTPS (see my question on SNI). The remaining problem is that the original IP address is lost, as it's now always 127.0.0.1. I think I can circumvent this by monkey-patching net.Server so that I can transmit the IP address before opening the connection.

buschtoens
  • If somebody comes up with a "better fitting" solution for this problem, I'll mark his or her answer as the correct one. – buschtoens May 28 '12 at 23:25
  • I'm currently working on something which should solve this problem... – buschtoens Mar 19 '13 at 15:30
  • Anybody know how to do this while running node within IIS? – Umar Farooq Khawaja Mar 26 '13 at 10:48
  • 1
    That would depend on the implementation. Open an new question and link it here. Maybe we can tackle that out. ;) – buschtoens Mar 26 '13 at 15:14
  • Turns out, I was wrong. IIS supports that out of the box. The only thing one has to take care of is to use `process.env.PORT` to start listening for HTTP connections. If you use a specific port, like port 80, then any other site not using a hostname binding will also respond to that website. – Umar Farooq Khawaja Mar 28 '13 at 03:11
  • `cluster` works very well for what you're trying to do. – awhie29urh2 Jul 24 '13 at 23:16
  • Maybe I haven't made myself clear enough: I want to run *different* apps on the same port. – buschtoens Jul 25 '13 at 12:51
  • It sounds like what you're really looking for is how to build a multi-tenant site with vhost support using node.js. This is one of the basic use cases that popular web frameworks for node.js make easy. There are [many pre-built modules](https://github.com/joyent/node/wiki/modules#wiki-web-frameworks-routers) that do this for you. If you prefer to write it yourself simply observe the incoming HTTP `host` header and write the logic to route requests. `cluster` lets multiple processes listen to the same port. `express` lets you do vhost routing. – awhie29urh2 Jul 25 '13 at 15:24
  • You may also want to rename the question. **"Sharing one port among multiple node.js HTTP processes"** suggests that you want multiple processes listening for HTTP requests on the same port. This is the wrong terminology to use if you're actually asking about how to use vhosts in node.js. – awhie29urh2 Jul 25 '13 at 15:40

3 Answers


If you are interested in a node.js solution, check out bouncy, a websocket- and https-capable http router/proxy/load balancer in node.js.

Define your routes.json like

 {
     "beep.example.com" : 8000,
     "boop.example.com" : 8001
 }

and then run bouncy using

 bouncy routes.json 80

almypal
  • bouncy is almost the same as [http-proxy](https://github.com/nodejitsu/node-http-proxy). What annoys me about this kind of solution is the fact that the project server binds to another port, which it doesn't actually use, since it should bind to port 80. – buschtoens May 29 '12 at 01:55

Personally, I'd just have them all listen on dedicated ports or preferably sockets and then stick everything behind either a dedicated router script or nginx. It's the simplest approach IMO.

Chuck
  • 234,037
  • 30
  • 302
  • 389
  • I just like the idea of having a lightweight node.js-script that internally rewrites the request. But sadly this is impossible at the moment and I have to stick with the unix sockets. – buschtoens May 28 '12 at 23:23
  • 1
    I'm really confused why my answer goes 0/2 when almypal's recommendation is exactly the same (just leave everything on different ports and use a separate router) and goes 6/0. I don't really care on a deep level, but it's curious. – Chuck Oct 10 '13 at 20:14

For Connect middleware there is the vhost extension. Maybe you could copy some of its concepts.

TheHippo
  • 61,720
  • 15
  • 75
  • 100
  • Connect's `vhost` basically does this: `server.emit('request', req, res);`. I already tried that. You'll need the server instance of the child process to be able to emit events on it. Since node.js internally stringifies this instance, all references get lost and I can't emit events there. – buschtoens May 29 '12 at 11:24