
I have an Android game that has 40,000 users online, and each user sends a request to the server every 5 seconds.

I wrote this code to test requests:

const express = require('express')
const app = express()
const pg = require('pg')
const conString = 'postgres://postgres:123456@localhost/dbtest'

app.get('/', function (req, res, next) {
  pg.connect(conString, function (err, client, done) {
    if (err) {
      return next(err)
    }
    client.query('SELECT name, age FROM users limit 1;', [], function (err, result) {
      done()
      if (err) {
        return next(err)
      }
      res.json(result.rows)
    })
  })
})
app.listen(3000)


And to test this code with 40,000 requests, I wrote this AJAX code:

var j = 1;
for (var i = 0; i < 40000; i++) {
    $.ajax({
        url: "http://85.185.161.139:3001/",
        success: function (response) {
            var d = new Date();
            console.log(j++, d.getHours() + ":" + d.getMinutes() + ":" + d.getSeconds());
        }
    });
}

Server details (I know this is poor): [screenshot of server specs]

Questions:

  1. This code (Node.js) only gets ~200 responses per second!

  2. How can I improve my code to increase the number of responses per second?

  3. Is this way (AJAX) of simulating 40,000 online users correct or not?

  4. Is it better if I use sockets or not?

  • You're starting 40000 _concurrent_ AJAX requests, in a browser. My guess is that your browser is going to be the bottleneck here... – robertklep Aug 27 '16 at 07:46
  • @robertklep To be more precise, the maximum number of connections opened by modern browsers to one server [is 6](http://stackoverflow.com/questions/985431/max-parallel-http-connections-in-a-browser), so basically there will only ever be 6 requests to the server at the same time. – G07cha Aug 27 '16 at 08:46

1 Answer


You should take a divide-and-conquer approach to problems like this: find the most resource-inefficient operation, then try to replace it or reduce the number of calls to it.

The main problem I see here is that the server opens a new connection to the database on each request, which likely consumes most of the time and resources.

I suggest opening the connection when the server boots up and reusing it across requests.

const express = require('express')
const app = express()
const pg = require('pg')
const conString = 'postgres://postgres:123456@localhost/dbtest'
let pgClient  // assigned once the connection is established

pg.connect(conString, function (err, client, done) {
  if (err) {
    throw err
  }
  pgClient = client
})

app.get('/', function (req, res, next) {
  pgClient.query('SELECT name, age FROM users limit 1;', [], function (err, result) {
    if (err) {
      return next(err)
    }
    res.json(result.rows)
  })
})
app.listen(3000)

For proper load testing, it is better to use a specialized utility such as ab from Apache. Finally, sockets are better for rapid, small data transfers, but keep in mind that they have scaling problems and in most cases become very inefficient at 10K+ simultaneous connections.
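For example, an `ab` invocation along these lines would measure throughput far more reliably than a browser loop (the host/port and counts are placeholders, adjust to your setup):

```shell
# 10,000 requests total, 200 concurrent, against the endpoint above
ab -n 10000 -c 200 http://127.0.0.1:3000/
```

`ab` reports requests per second and latency percentiles directly, which answers question 1 without guessing from `console.log` timestamps.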

EDIT: As @robertklep pointed out, it is better to use client pooling in this case and retrieve clients from the pool.
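The pooling idea itself can be sketched without a database: a pool creates a bounded set of clients once and hands them out and back, instead of creating one per request. This is a toy illustration with fake clients (`SimplePool` and its names are invented for the sketch, not the pg API; in real code you would use the pg library's built-in `pg.Pool`):

```javascript
// Toy sketch of client pooling: a fixed set of "clients" is created
// once and reused, rather than opened per request.
class SimplePool {
  constructor(createClient, size) {
    this.idle = [];     // clients ready to be handed out
    this.waiting = [];  // callbacks queued while the pool is empty
    for (let i = 0; i < size; i++) this.idle.push(createClient());
  }
  // Hand a client to the caller, or queue until one is free.
  acquire(cb) {
    if (this.idle.length > 0) cb(this.idle.pop());
    else this.waiting.push(cb);
  }
  // Return a client; wake the oldest waiter if there is one.
  release(client) {
    if (this.waiting.length > 0) this.waiting.shift()(client);
    else this.idle.push(client);
  }
}

// Fake clients stand in for database connections.
let created = 0;
const pool = new SimplePool(function () { return { id: ++created }; }, 2);

pool.acquire(function (a) {
  pool.acquire(function (b) {
    // Pool is now empty; this third request waits for a release.
    pool.acquire(function (c) {
      console.log('reused client', c.id);  // same object as `a`
    });
    pool.release(a);  // unblocks the waiter above
  });
});
console.log('clients created:', created);  // stays at the pool size, 2
```

A pool avoids both the per-request connection cost of the original code and the single-point-of-failure of one shared client (one slow query, or a dropped connection, stalls every request).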
