
My end goal is to make Node.js more cost-effective per server instance.

I'm not running a game or chat room, but rather simple websites for customers. I would like to house multiple clients on a single server, with multiple websites running off of port 80 using host-header mapping. I would still like to use Express as I do now, but have something route requests from port 80 to the other Node apps, if that is even possible. Node can be cheaper done this way, but currently it's more expensive for my purposes, since each customer would need their own box to run on port 80. Also, my motivation is to focus on Node development, but there must be a reason to do so in terms of cost.

I do this quite a lot with ASP.NET on Windows, as IIS supports it out of the box, and I know this is normal for Apache as well.

Feel free to move this to another Stack Exchange site if this is not the right place for the question, or give constructive criticism rather than a random downvote. Thanks.

Update

The approach I took was to use static hosting (via Gatsby and S3) plus an API that registers domains through POST messages from the client and API keys from the server, and regenerates the static sites periodically as they change. Thanks for all the suggestions!

King Friday
    From your question it sounds like you already know the basic answer - use `req.headers.host` to decide between different routing chains for different vhosts - so are you asking for a built-in way to do it? Or a library to abstract the details? – jimw Apr 18 '12 at 19:00
  • Built-in would be preferable; Node is great in that you can roll your own, but I would prefer something actively developed in its own right – King Friday Apr 18 '12 at 19:05
  • 1
    There's [this](https://github.com/coolaj86/connect-vhoster), but it's not a very active project. Otherwise [node-http-proxy](https://github.com/nodejitsu/node-http-proxy), which is much more active but does perhaps much more than you need. – jimw Apr 18 '12 at 19:12

2 Answers


In theory, you could build a pure-Node web server that emulated the functionality of Apache/Lighttpd/Nginx, but I wouldn't recommend it. In fact, for serious production services, I'd recommend ALWAYS fronting your service with Nginx or an equivalent (see this and this).

Here's how a simple Nginx config would work for two subservices exposed on port 80.

worker_processes  4;

events {
  worker_connections 1024;
}

http {
  include       mime.types;
  default_type  text/html;

  server {
    listen 80;
    server_name service1.mydomain.com;
    location / {
      proxy_pass         http://127.0.0.1:3000/;
    }
  }
  server {
    listen 80;
    server_name service2.mydomain.com;
    location / {
      proxy_pass         http://127.0.0.1:3001/;
    }
  }
}

I've seen production boxes kernel panic because Node doesn't throttle load by default and was prioritizing accepting new connections over handling existing requests. Granted, it "shouldn't" have crashed the kernel, but it did. Also, by running on port 3000, you can run your Node service as non-root with very few permissions (and still proxy it so that it appears to be on port 80). You can also spread load between multiple workers, serve static files, log requests, rewrite URLs, etc. Nginx is very fast (much lighter than Apache). The overhead of same-box proxy forwarding is minimal, and it buys you so much functionality and robustness that it's a slam dunk in my book. Even minor stuff matters, like: when I crash or overload my Node service, do users get a black hole, or a "pardon our dust, our servers are being maintained" splash page?

Dave Dopson
    I saw this answer as well. http://stackoverflow.com/questions/5009324/node-js-nginx-and-now I'll check yours off as I agree this is the best approach I've seen. Thanks. – King Friday May 14 '12 at 01:35
  • Just as a question: this isn't really multitenancy, as there is more than one piece of software running, but it is about having several websites share the same port on one server, right? – EDREP Apr 19 '17 at 14:00

What about using a proper reverse proxy, like HAProxy: have the proxy listen on port 80 and delegate to multiple Node instances on non-public ports (e.g. 10000, 10001, etc.), based on `headers.host`?
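For illustration, a host-based HAProxy frontend might look something like this; the domain names and backend ports are placeholders, not part of the original answer:

```
defaults
    mode http
    timeout connect 5s
    timeout client  30s
    timeout server  30s

frontend http-in
    bind *:80
    # Route on the Host header, case-insensitively.
    acl is_site1 hdr(host) -i site1.example.com
    acl is_site2 hdr(host) -i site2.example.com
    use_backend site1 if is_site1
    use_backend site2 if is_site2

backend site1
    server node1 127.0.0.1:10000

backend site2
    server node2 127.0.0.1:10001
```

Each backend points at a Node instance listening on its own non-public port; only HAProxy needs to bind port 80.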

UpTheCreek
  • The main problem with this is that I wouldn't have access at this level for HAProxy to work. I'm looking for a direct Node.js solution, but this is useful for sure. I'll keep it in mind for future ideas. – King Friday Apr 18 '12 at 20:46
  • 3
    Personally, I wouldn't trust a Node-only solution - especially since you are talking about different customer sites. Node is single-threaded, meaning the sites will easily interfere with each other's performance and/or take each other down when they die (if they are all running in the same instance). How are you currently deploying Node? – UpTheCreek Apr 18 '12 at 20:51
  • Nginx would do just as well, by the way ... also HAProxy has a few SSL limitations :) – errordeveloper Apr 18 '12 at 21:04
  • @kitgui.com, I don't get what you mean by `access at this level` - if you are running Node as a user who can listen on port 80, you should be able to run anything else :) – errordeveloper Apr 18 '12 at 21:05
  • @errordeveloper the link showed a hardware solution, which is cool for other things I could do. What I was interested in specifically is a pure Node.js solution, but I appreciate the heads up. Good stuff. – King Friday Apr 19 '12 at 18:42
  • 1
    @kitgui.com - The link isn't to a hardware solution - HAProxy is just a piece of software, same as node. If you can install node, then you can install HA. – UpTheCreek Apr 19 '12 at 19:20
  • @errordeveloper didn't see your other comment; just replying. Currently I deploy it standalone, but it's not cost-effective that way, so I'm not running anything in production, only experiments. My motivation is to move from Windows hosting to Linux with Node, as my dev team and I love programming in JavaScript. It's basically a way to save a lot of money, if possible. I would agree with you on the threading issue, but the clients I would put on shared hosting are not big enough to break anyone. Bigger clients would get their own box, as we normally do on Windows. – King Friday Apr 19 '12 at 23:39