I have a PHP script which runs a loop (several thousand iterations). Each iteration requests a paginated URL which returns some results in JSON. Each result requires at least 2 further API hits to request more information. Once all this is complete, I build it into a packet for indexing/storing.
In PHP this is quite easy as it's naturally synchronous.
I'm in the process of moving this to a NodeJS setup and finding it difficult to do this elegantly in NodeJS.
If I make a for-loop, the initial HTTP request call (which is often promise-based) returns almost instantly, leaving the actual resolution of the request to the promise handlers. This means all several thousand page requests would end up being fired pretty much in parallel. It would also mean I'd have to chain the sub-requests inside the promise resolution, which leads to further promise chains as I wait for those to be resolved.
I've tried using the async/await approach, but it doesn't seem to play nicely with for-loops (or forEach, at least).
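As I understand it, part of the problem is that forEach never waits for the promises an async callback returns. A toy example (just setTimeout standing in for the API calls) shows the difference:

(async () => {
  const items = [1, 2, 3];

  // forEach fires every async callback immediately and ignores the
  // returned promises, so this logs 'forEach finished' first:
  items.forEach(async (item) => {
    await new Promise((resolve) => setTimeout(resolve, 100)); // fake API call
    console.log(`forEach item ${item} done`);
  });
  console.log('forEach finished');

  // A plain for...of loop genuinely pauses at each await:
  for (const item of items) {
    await new Promise((resolve) => setTimeout(resolve, 100)); // fake API call
    console.log(`for...of item ${item} done`);
  }
  console.log('for...of finished');
})();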
This is roughly what I'm working with right now, but I'm starting to think it's entirely wrong and that I should be using callbacks to trigger the next iteration?!
async function processResult(result) {
  const packet = {
    id: result.id,
    title: result.title,
  };

  // Each result needs two further API hits before the packet is complete.
  const subResponse1 = await getSubThing1();
  packet.thing1 = subResponse1.body.thing1;

  const subResponse2 = await getSubThing2();
  packet.thing2 = subResponse2.body.thing2;

  return packet;
}
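// Side note: since the two sub-requests don't depend on each other, I
// suspect they could also be fired together and awaited as a pair, e.g.:
//
//   const [subResponse1, subResponse2] = await Promise.all([
//     getSubThing1(),
//     getSubThing2(),
//   ]);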
(async () => {
  for (let page = START_PAGE; page < MAX_PAGES; page += 1) {
    console.log(`Processing page ${page}...`);

    // getList() isn't awaited, so the loop never waits for it: every
    // page request fires almost immediately.
    getList(page)
      .then((response) => Promise.all(response.body.list.map(processResult)))
      .then((data) => console.log('DATA DONE', data));

    console.log(`Page ${page} done!`); // logs long before the page is actually done
  }
})();
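What I'm wondering is whether simply awaiting getList inside the loop is the idiomatic fix. Assuming getList and processResult behave as above, I think something like this would serialize the pages while still letting each page's sub-requests run in parallel via Promise.all:

(async () => {
  for (let page = START_PAGE; page < MAX_PAGES; page += 1) {
    console.log(`Processing page ${page}...`);

    // Awaiting here blocks the loop until this page's list and all of
    // its sub-requests have finished before moving to the next page.
    const response = await getList(page);
    const data = await Promise.all(response.body.list.map(processResult));

    console.log('DATA DONE', data);
    console.log(`Page ${page} done!`);
  }
})();

Is that the right direction, or is there a better pattern for this?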