
The situation is this: about 50,000 electronic devices are going to connect to a web service written in Node.js once per minute. Each one will send a POST request containing some JSON data, and all of this data should be secured. The web service will receive those requests and save the data to a database. Read requests are also possible, to fetch some data from the DB.

I am thinking of building a system based on the following infrastructure:

Node.js + memcached + (MySQL Cluster OR Couchbase)

So, how much memory do I need to assign to my web server to be able to handle all these connections? Suppose that in the pessimistic case I would have 50,000 concurrent requests.

And what if I use SSL to secure the connections? Does that add too much overhead per connection? Would I need to scale the system out to handle it?

What do you suggest?

Many thanks in advance!

skatz

2 Answers


Of course, it is impossible to give any exact numbers here, since they are always very workload-specific. I would recommend designing a scalable, expandable system architecture from the very beginning.
Use JMeter (https://jmeter.apache.org/) for load testing. Then you will be able to scale from thousands of connections to an effectively unlimited number.
Here is an article about handling 1,000,000 connections: http://www.slideshare.net/sh1mmer/a-million-connections-and-beyond-nodejs-at-scale

Sergey Yarotskiy

Remember that your Node.js application will be single-threaded, meaning its performance will degrade badly as you increase the number of concurrent requests.

What you can do to increase performance is create one Node process per core on your machine and put all of them behind a reverse proxy (say, nginx); you can also spread the app across multiple machines.
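As a sketch of that reverse-proxy setup (not a tested configuration), nginx could terminate SSL and balance across four local Node processes. The upstream name, ports, and certificate paths below are placeholders:

```nginx
# Hypothetical upstream: four Node processes on the same machine.
upstream node_app {
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
    server 127.0.0.1:3003;
}

server {
    listen 443 ssl;
    # Placeholder certificate paths; terminating SSL here keeps
    # the crypto overhead out of the Node processes.
    ssl_certificate     /etc/nginx/certs/example.crt;
    ssl_certificate_key /etc/nginx/certs/example.key;

    location / {
        proxy_pass http://node_app;
    }
}
```

Terminating SSL at the proxy also partly answers the overhead question in the original post: the handshake cost lands on nginx rather than on each Node worker.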

If you make requests only to memcached, your API won't degrade. But once you start querying MySQL, those queries will start throttling your other requests.

Edit: As suggested in the comments, you could also use the cluster module to fork worker processes and let them compete amongst each other for incoming requests. (Workers run as separate processes that can share a listening socket, thereby allowing you to use all cores.) Node.js on multi-core machines

Ali
  • Sorry, -1 because this is not the right way to deploy with Node. To use more processes you should be using the [cluster core module](http://nodejs.org/api/cluster.html), which allows you to launch worker processes that may all share a socket or port, i.e. no reverse proxy is necessary. – qubyte Feb 14 '14 at 21:41
  • Although Ali is right about Node being single-threaded, there is still hope. Please note: JXcore utilizes multithreading by increasing the [libuv threadpool](http://www.future-processing.pl/blog/on-problems-with-threads-in-node-js/). [Web workers](https://github.com/nodejs/node/pull/2133) are also on their way to the core. Also, MySQL will only start to throttle your event loop if you don't have it configured correctly with your server (bad conf settings) or are not using pooling. I recommend felixge's driver. :) – NiCk Newman Sep 18 '15 at 16:14