
I'm testing a NodeJS app that serves media files stored in memory. The media files are approximately 2-5 MB each. I'm trying to figure out the best way to max out the available Ethernet channel (1 Gbps or 10 Gbps).
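
For reference, here's a minimal sketch of what the server side looks like (this is a hypothetical reconstruction, not the actual app; the buffer size, port, and route are placeholders):

    // Hypothetical in-memory media server sketch; a 3 MB buffer stands in
    // for one of the real media files.
    const http = require('http');

    const media = Buffer.alloc(3 * 1024 * 1024, 0xab); // placeholder 3 MB "file"

    http.createServer((req, res) => {
      res.writeHead(200, {
        'Content-Type': 'application/octet-stream',
        'Content-Length': media.length
      });
      res.end(media); // served straight from memory, no disk I/O
    }).listen(8080);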

I'm testing in a VM (VirtualBox) running Ubuntu 16.04.1 LTS. For testing I'm using my own NodeJS script that simply makes multiple outgoing requests to the server and logs the average bitrate to the console. The test script runs for 1 minute and has a configurable parameter N that sets how many simultaneous downloads it keeps in flight at once.
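
The test client is roughly along these lines (a sketch under the assumptions above; the server address, port, and 60-second window are assumptions, not the exact script):

    // Hypothetical test client: keep N downloads in flight for 60 seconds,
    // then print the average bitrate.
    const http = require('http');

    const N = parseInt(process.argv[2] || '30', 10); // simultaneous downloads
    const URL = 'http://192.168.56.10:8080/media';   // assumed server address
    const DURATION_MS = 60 * 1000;

    let totalBytes = 0;
    let running = true;
    const start = Date.now();

    function download() {
      if (!running) return;
      http.get(URL, (res) => {
        res.on('data', (chunk) => { totalBytes += chunk.length; });
        res.on('end', download);               // immediately start the next download
      }).on('error', () => setTimeout(download, 100));
    }

    for (let i = 0; i < N; i++) download();

    setTimeout(() => {
      running = false;
      const seconds = (Date.now() - start) / 1000;
      console.log(`Average throughput: ${(totalBytes / seconds / 1e6).toFixed(1)} MB/s`);
      process.exit(0);
    }, DURATION_MS);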

What I noticed is that if I increase number of simultaneous downloads then average throughput goes down considerably. For example:

  • if I run my test with N = 30 (30 simultaneous downloads) then I get 125 MB/s overall; each request is served at 3-5 MB/s
  • if I run it with N = 300 then the overall bitrate drops to 90 MB/s, roughly 30% lower
  • if I run with N = 600 then the overall bitrate is 80 MB/s.

Any idea why it doesn't seem to scale well with a higher number of simultaneous downloads? A higher number of simultaneous connections has almost no CPU impact. If I run the same test script against nginx serving the same files from SSD, I get identical numbers. CPU load on the Ubuntu machine running the NodeJS server doesn't go above 20%. If I run my benchmark locally on Ubuntu, I get 1800 MB/s throughput.

I understand that benchmarking a virtual Ethernet card is of limited value, yet I can max out the download rate with 30 simultaneous connections, while with 300 simultaneous connections the overall throughput drops by 30%, whereas I'd expect it to drop by no more than 5%.

What can be done to increase overall bitrate with many simultaneous downloads?

P.S. There is another interesting post about the maximum number of TCP/IP connections.

