
I would like to make 10,000 concurrent HTTP requests. I am currently doing it using Promise.all. However, I seem to be rate limited in some way: it takes around 15-30 minutes to complete all 10,000 requests. Is there something in axios or in Node's HTTP stack that is limiting me? How can I raise the limit if there is one?

const axios = require('axios');

function http_request(url) {
    return new Promise(async (resolve) => {
        await axios.get(url);
        // -- DO STUFF
        resolve();
    });
}

async function many_requests(num_requests) {
    let all_promises = [];
    for (let i = 0; i < num_requests; i++) {
        let url = 'https://someurl.com/' + i;
        let promise = http_request(url);
        all_promises.push(promise);
    }
    return Promise.all(all_promises);
}

async function run() {
    await many_requests(10000);
}

run();
Anters Bear
  • Well, first of all, JavaScript is single-threaded. So even though I don't know what's happening in `// -- DO STUFF`, that part is certainly executed one request after the other. And of course there is some overhead in handling 10,000 active promises in the event queue. Furthermore, the OS may have limits on open ports, and the backend may also have a request limit ... – derpirscher Jan 23 '22 at 13:43
  • Hi, thanks for your reply. "Do stuff" is just reformatting the data and saving it into a local object, so nothing computationally heavy. I get what you're saying about single-threaded JS, so is there no real solution for this apart from, for example, forking worker scripts as child processes that each handle around 100 requests (see the batching sketch after these comments)? – Anters Bear Jan 23 '22 at 13:46
  • Have you tried simplifying your code to help get to the root cause? Remove "do stuff" entirely. Then make sure you're making a very simple api call to an endpoint that allows you to hit it 10,000 times in quick succession. Do you have such an endpoint? – KayakinKoder Jan 23 '22 at 13:53
  • 10,000 is a lot. Multiply 10k by 10ms and you get 1m40s of latency alone for your HTTP calls, and 10ms is a good latency; depending on distance you may see 60-100ms. – Elias Soares Jan 23 '22 at 14:17
  • Does this answer your question? [Lots of parallel http requests in node.js](https://stackoverflow.com/questions/17372394/lots-of-parallel-http-requests-in-node-js) – Martin Zeitler Jan 24 '22 at 00:45
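Building on the batching idea raised in the comments, here is a minimal sketch (not the poster's code) that keeps only a fixed number of requests in flight at a time; the batch size of 100 and the reuse of the question's placeholder URL are assumptions:

const axios = require('axios');

// Fire the requests in fixed-size batches so that only `batchSize` requests
// are in flight at once, rather than all 10,000 simultaneously.
async function many_requests_batched(num_requests, batchSize = 100) {
    for (let start = 0; start < num_requests; start += batchSize) {
        const end = Math.min(start + batchSize, num_requests);
        const batch = [];
        for (let i = start; i < end; i++) {
            batch.push(axios.get('https://someurl.com/' + i));
        }
        // Wait for the whole batch before starting the next one.
        await Promise.all(batch);
    }
}

many_requests_batched(10000);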

1 Answer


In Node.js there are two types of threads: one Event Loop (aka the main loop, main thread, event thread, etc.), and a pool of k Workers in a Worker Pool (aka the threadpool).

...

The Worker Pool of Node.js is implemented in libuv (docs), which exposes a general task submission API.

The event loop runs in a single thread and pushes tasks to the pool of k Workers, which execute them in parallel. The default pool size is 4, but you can raise it. (Network sockets themselves are handled on the event loop; it is threadpool-backed work, such as the DNS lookups performed for each hostname, fs, and crypto, that is limited by the pool size.)

source: the Node.js guide "Don't Block the Event Loop"

libuv

The default UV_THREADPOOL_SIZE is 4. You can raise it by setting the UV_THREADPOOL_SIZE environment variable (see the link below); how high you can usefully go depends on your OS, so check what your system can handle:

set UV_THREADPOOL_SIZE
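For example, the variable can be set when launching Node; the value 64 and the entry file name index.js below are just placeholders, and the variable must be set before the thread pool is first used:

UV_THREADPOOL_SIZE=64 node index.js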

Viettel Solutions