
I was trying to perform nearly 2000 simultaneous HTTP GET requests to a web API (abc.com/query?val=somekey). Below is my code.

async.each(keysArray, function (key, callback1) {
  sails.http.get({
    hostname: 'abc.com',
    path: '/query?val=' + key,
    headers: {
      "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
      "User-Agent": "MYBROWSER"
    }
  }, function (response) {
    var str = '';
    response.on('data', function (chunk) {
      str += chunk;
    });
    response.on('end', function () {
      console.log(new Buffer(str, 'utf8'));
      // some job on each str from each key
      callback1(); // signal async.each that this key is done
    });
  }).on('error', callback1); // without a handler, a socket error becomes an unhandled 'error' event
});

The length of keysArray is around 2000, so the above code performs nearly 2000 HTTP GET requests. But I was getting an error like:

events.js:141
      throw er; // Unhandled 'error' event
      ^

Error: read ECONNRESET
    at exports._errnoException (util.js:870:11)
    at TCP.onread (net.js:544:26)

Though I found a way to limit concurrency using async.eachLimit(), is it possible to perform that many requests at once, or is there some dependency on the machine or server?
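For reference, the throttling that async.eachLimit provides can be sketched in plain Node without the async library. This is a minimal illustration, not the library's actual implementation; eachLimit and task are illustrative names, and task stands in for a promise-wrapped version of the http.get call above:

```javascript
// Run at most `limit` tasks at once, starting a new one as each finishes.
// `task(item)` must return a promise.
function eachLimit(items, limit, task) {
  return new Promise(function (resolve, reject) {
    var next = 0, active = 0, done = 0;
    function launch() {
      // Fill the pool up to `limit` in-flight tasks.
      while (active < limit && next < items.length) {
        active++;
        task(items[next++]).then(function () {
          active--; done++;
          if (done === items.length) resolve();
          else launch(); // a slot freed up, start the next item
        }, reject);
      }
    }
    if (items.length === 0) resolve(); else launch();
  });
}
```

With limit set to 10, 25, or 50 (the values the comments below settled on), no more than that many sockets are ever open at once, which is the behavior that made the ECONNRESET errors go away.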

vkstack
  • See https://stackoverflow.com/questions/17245881/node-js-econnreset for an explanation of the error. My guess is you're definitely hitting the web server too fast...try throttling your http requests to a more reasonable number. – Tennyson H Feb 09 '16 at 20:14
  • Is it because you're getting rate limited by the website? 1000 simo seems pretty crazy. – Dave Chen Feb 09 '16 at 20:14
  • @TennysonH thank you for the resource. But pushing 1000 tasks into async.parallel() gives me the same error, because of the huge number of parallel http.get requests! – vkstack Feb 09 '16 at 20:20
  • @waza007 scale it down...20 or 50 a second? chances are ABC.com is pissed at you for hammering their server and are dropping your connections when you try to flood it with 2000 requests at once. – Tennyson H Feb 09 '16 at 20:22
  • @DaveChen is it possible to hit the API that many times if the website is capable of serving thousands of requests at a time, or will it still cause the same issue? – vkstack Feb 09 '16 at 20:23
  • Depends on the site/api. It very well could be throttling on a per-user basis. – Kevin B Feb 09 '16 at 20:24
  • @TennysonH Thanks, I tried limiting requests to 10, 25, 50, and 100 and it worked. I'm just curious whether it's possible if I don't limit it! – vkstack Feb 09 '16 at 20:26
  • @waza007 it is probably not possible on a single machine. you would need to stagger it over a few different machines on a professional grade network to hit 2000 requests at once. – Tennyson H Feb 09 '16 at 20:28

1 Answer


Well, the answer is really simple: a for loop.

Here's the logic: Node.js is based on JavaScript, so it is single-threaded (one function runs at a time). You cannot perform 1000 truly concurrent requests from one single machine, but you can issue 1000+ requests with a for loop:

for (let i = 0; i < 1000; i++) {
  // your request code here
}

Remember to wrap your logic inside an async function; then you can await each promise to resolve, since you are looking for async/await:

const request = require('request-promise')

async function foo () {
  for (let i = 0; i < 1000; i++) {
    try {
      let a = await request('http://localhost:8080/')
      // a contains your response data.
    } catch (e) {
      console.error(e)
    }
  }
}
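Note that awaiting inside the loop makes the requests fully sequential. A middle ground is to send them in fixed-size batches using the built-in Promise.all, with no request-promise dependency. This is a hedged sketch: runInBatches and fetchKey are hypothetical names, and fetchKey stands for whatever function wraps the actual HTTP call and returns a promise:

```javascript
// Issue requests in batches of `batchSize` instead of one unbounded burst.
// `fetchKey(key)` must return a promise for that key's response.
async function runInBatches(keys, batchSize, fetchKey) {
  const results = [];
  for (let i = 0; i < keys.length; i += batchSize) {
    const batch = keys.slice(i, i + batchSize);
    // Promise.all waits for the whole batch before the next one starts.
    results.push(...await Promise.all(batch.map(fetchKey)));
  }
  return results;
}
```

With batchSize around 25-50, this gives the same effect as async.eachLimit: the remote server sees a steady, bounded load rather than 2000 simultaneous connections.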
ADITYA AHLAWAT