There are a variety of strategies for dealing with too many requests, and which strategy gets you the best throughput without running afoul of the target server's rate limiting depends entirely upon exactly how that server measures and enforces its limits. Unless that is documented, you will have to experiment. The safest (and potentially slowest) strategy is to run your requests sequentially with a delay between them and tune that delay as appropriate.
Run Sequentially With Delay Between Each Request
You can run your requests sequentially and use await on a delay promise to separate them in time.
const axios = require('axios');

// resolves after t milliseconds
function delay(t) {
    return new Promise(resolve => setTimeout(resolve, t));
}

async function getResults() {
    const results = [];
    const strings = ['a', 'b', 'c'];
    for (const str of strings) {
        // wait before each request so they are spaced out in time
        await delay(1000);
        const data = await axios.get(`https://www.apiexample.com/get/?cfg=json&value=${str}`);
        results.push(data);
    }
    return results;
}

getResults().then(results => {
    console.log(results);
}).catch(err => {
    console.log(err);
});
Run N Requests at a Time where N > 1 and N < all your Requests
If you want to run N requests at a time where N is more than 1 (often 3 or 4) but less than the total number of requests, then see mapConcurrent() in this answer. Whether a given N is feasible depends entirely upon the target server and what exactly it is measuring and enforcing.
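To give a sense of the idea, here is a simple sketch of a concurrency limiter along the same lines — this is an illustrative reimplementation, not the exact mapConcurrent() from the linked answer. It runs fn(item) over all items with at most maxConcurrent promises in flight at once, and each time a request finishes it starts the next one:

```javascript
// Run fn(item) over items with at most maxConcurrent requests in flight.
// Resolves with an array of results in the original item order.
function mapConcurrent(items, maxConcurrent, fn) {
    let index = 0;          // next item to start
    let inFlight = 0;       // requests currently running
    const results = new Array(items.length);

    return new Promise((resolve, reject) => {
        function runNext() {
            // when every item has been started and all have settled, we're done
            if (index >= items.length) {
                if (inFlight === 0) resolve(results);
                return;
            }
            const i = index++;
            inFlight++;
            fn(items[i]).then(result => {
                results[i] = result;   // keep results in original order
                inFlight--;
                runNext();             // start the next item as this one finishes
            }).catch(reject);
        }
        // prime the pump with up to maxConcurrent parallel requests
        for (let k = 0; k < maxConcurrent && k < items.length; k++) {
            runNext();
        }
    });
}
```

You would call it with something like `mapConcurrent(strings, 3, str => axios.get(...))` in place of the sequential loop above.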
Actual Rate Limiting where You Run N Requests per Second
For actual rate limiting, where you control the number of requests per second directly, see rateLimitMap() in this answer: Choose proper async method for batch processing for max requests/sec.
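As a rough illustration of the concept (the linked rateLimitMap() is more sophisticated), a minimal version can simply stagger when each request starts so that no more than requestsPerSecond of them begin in any one second:

```javascript
// Sketch of per-second rate limiting: start at most requestsPerSecond
// calls to fn per second by spacing the start times evenly.
function rateLimitMap(items, requestsPerSecond, fn) {
    const interval = 1000 / requestsPerSecond;   // ms between request starts
    const delay = t => new Promise(resolve => setTimeout(resolve, t));
    return Promise.all(items.map((item, i) =>
        // stagger each start by i * interval milliseconds
        delay(i * interval).then(() => fn(item))
    ));
}
```

Note that this only limits how fast requests start, not how many are in flight at once; if the server also caps concurrency, you would combine this with the concurrency-limiting approach above.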