There's a library called p-limit that's built for exactly this purpose, but it's ESM-only, which makes it a hassle to pull into a CommonJS project. I figured, how hard could it be to implement my own? So I came up with this implementation:
(async () => {
  // _ is lodash (loaded globally, e.g. from the cdnjs CDN).
  // Build 100 "posts" (stubbed out with Promise.resolve() for this repro);
  // each one logs its index when it starts.
  const promisedAxiosPosts = _.range(0, 100).map(async (item, index) => {
    console.log(`${index}: starting`);
    return Promise.resolve();
  });

  // Process the promises in chunks of 10, intending to wait for each
  // chunk to finish before starting the next one.
  let i = 0;
  for (const promisedAxiosPostGroup of _.chunk(promisedAxiosPosts, 10)) {
    console.log(`***********************
GROUP ${i}
SIZE ${promisedAxiosPostGroup.length}
***********************`);
    await Promise.all(promisedAxiosPostGroup);
    i++;
  }
})().catch((e) => {
  throw e;
});
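For comparison, here's roughly what the p-limit version would look like. This is a sketch I haven't actually run, since avoiding the ESM import is the whole point:

import pLimit from 'p-limit';

// Allow at most 10 tasks in flight at a time.
const limit = pLimit(10);

// Wrap each task in limit() so p-limit controls when it starts
// (assumes lodash's _ is available, as in the snippet above).
const tasks = _.range(0, 100).map((item, index) =>
  limit(async () => {
    console.log(`${index}: starting`);
    return Promise.resolve();
  })
);

await Promise.all(tasks);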
Why isn't this waiting for each chunk to complete before moving on to the next one?
I think that `map` could be the culprit, but I don't see how: it returns a `Promise<void>[]`; if it were `await`ing the callbacks, wouldn't it return a `void[]` (not sure if that's even a thing)?
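To sanity-check the types, I tried this minimal sketch (the names are just illustrative):

const promises = [1, 2, 3].map(async (n) => {
  console.log(`started ${n}`);
  return n * 2;
});

// The three "started ..." lines print immediately, before this log,
// and this logs an array of three Promises, not numbers; so map is
// returning Promise<number>[] rather than awaiting anything.
console.log(promises);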