
I'll explain my problem briefly. I am using Node.js to retrieve a response from a URL, and I have to make multiple requests to that URL.

There is only one limit: I can't make more than 10 requests per minute. How can I handle that?

I tried to follow this Stack Overflow answer too: Make several requests to an API that can only handle 20 request a minute

but that method is not working because it does not await all the promises and gives an undefined result.

This is the code I am currently using, but it does not await all the responses; it returns undefined immediately, and only later do all the requests complete:

async function rateLimitedRequests(array, chunkSize) {
    var delay = 3000 * chunkSize;
    var remaining = array.length;
    var promises = [];

    var addPromises = async function(newPromises) {
        Array.prototype.push.apply(promises, newPromises);
        if (remaining -= newPromises.length == 0) {
            await Promise.all(promises).then((data) => {
                console.log(data);
            });
        }
    };

    (async function request() {
        addPromises(array.splice(0, chunkSize).map(apiFetch));
        if (array.length) {
            setTimeout(request, delay);
        }
    })();
}

Michele

1 Answer


You can do this using the Promise.all method.

For example:

const [ response1, response2, response3 ] = await Promise.all([ axios.get(URL_HERE), axios.get(URL_HERE), axios.get(URL_HERE) ]);

console.log(response1);

console.log(response2);

console.log(response3);

For the second scenario, which we discuss in the comment section below:

function rateLimiter(array = [], chunkSize = 10) {

    const delay = 10 * 1000;

    return new Promise((resolve, reject) => {
        let results = [];
        const callback = (iteration = 0) => {
            // slice takes start and end indices, not a length
            const process = array.slice(iteration * chunkSize, (iteration + 1) * chunkSize);
            if (!process.length) return resolve(results);

            Promise.all(process).then(responses => {
                results = [ ...results, ...responses ];
                // check whether another chunk remains
                const processLength = array.slice((iteration + 1) * chunkSize, (iteration + 2) * chunkSize).length;

                if (!processLength) return resolve(results);
                // wait before processing the next chunk
                setTimeout(() => {
                    callback(iteration + 1);
                }, delay);
            });
        };

        callback();

    });
}

let results = await rateLimiter([ axios.get('URL_HERE'), axios.get('URL_HERE') ], 20);
Zain
  • Yes, I know that I can do it that way, but I am trying to find a clean algorithm where you pass the seconds and all the URLs, and it waits until the end of the operations – Michele May 30 '22 at 17:53
  • To make the requirement clearer: are you looking for cron-job functionality? Or do you have parameters coming from a database or some other source that you have to iterate over every minute once you exceed the calling limit, after which you want the algorithm to end? – Zain May 30 '22 at 18:07
  • Yes, the second part; the problem and the code are explained above. I've updated the thread – Michele May 30 '22 at 18:07
  • I have updated my answer, please check. I haven't compiled this algorithm on my side, but I am confident that it will resolve your problem. If you find any issue in the code or any confusion, please let me know. – Zain May 30 '22 at 18:31
  • It's near what I need, but not quite, because the focus of that algorithm is to make X requests at once, while I need something like 1 request every 3 seconds until the end of the array of requests (the axios one that you've done), or 20 requests directly in 1 minute. I need a timeout between all the requests because of the API limit, and the code you've posted makes all the requests within 1 second, without any delay from one request to the next. Thanks for your time <3 – Michele May 30 '22 at 18:36
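An editorial note on the last comment (a sketch under stated assumptions, not part of the original answer): the answer's `rateLimiter` receives already-started promises, so `axios.get(...)` fires every request immediately and only the *collection* of results is spaced out. One common fix is to pass request *factories* (functions that start the request when called), so nothing runs before its chunk is scheduled. `rateLimitDeferred` and the 3-second delay below are illustrative names and values, not from the original post:

```javascript
// Rate limiter over request factories: each element of `factories`
// is a function like `() => axios.get(url)` that starts its request
// only when invoked.
async function rateLimitDeferred(factories, chunkSize = 10, delayMs = 3000) {
    const results = [];
    for (let i = 0; i < factories.length; i += chunkSize) {
        // Start only this chunk's requests now, then await them.
        const chunk = factories.slice(i, i + chunkSize).map(fn => fn());
        results.push(...await Promise.all(chunk));
        // Pause before dispatching the next chunk, if one remains.
        if (i + chunkSize < factories.length) {
            await new Promise(res => setTimeout(res, delayMs));
        }
    }
    return results;
}
```

Usage would then look like `await rateLimitDeferred(urls.map(u => () => axios.get(u)), 10, 60 * 1000)` for 10 requests per minute; with `chunkSize = 1` and `delayMs = 3000` it becomes one request every 3 seconds, matching the comment's alternative.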