
(My goal is to clarify my understanding of the problem, not the code.)

I want to execute an array of promises sequentially, but Node.js throws a strange error about too many promises executing in parallel. (I say this because when I limit the array to 20 promises it works, 50 promises works, but 9000 promises blows up.)

  • I know there are solutions like `array.reduce()`, loops, etc.
  • I know about promise states (my array initially contains pending promises)

My question: I can execute 20 promises, then another 20 promises, and so on, but... if I'm executing my promises sequentially, shouldn't Node.js handle 9k promises without a problem? Is my understanding wrong? Is my code wrong?

(I'm doubting this because Node.js waits some time before it begins resolving the promises.)

My case: I'm trying to download 9k+ images (with axios), then save each one and wait 5 seconds, sequentially. [Download 1 image, save that image, wait 5 seconds, then download the next image, save it, wait, etc.] Is that possible?
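The loop described above could be sketched like this (a minimal sketch; `processSequentially`, `items`, and `handler` are placeholder names, not from the question, and `handler` stands in for the download-and-save step):

```javascript
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))

// Process one item at a time: handle it, pause, then move to the next.
// `delayMs` would be 5000 in the scenario described above.
async function processSequentially(items, handler, delayMs = 5000) {
  const results = []
  for (const item of items) {
    results.push(await handler(item)) // nothing runs in parallel here
    await sleep(delayMs)              // the pause between items
  }
  return results
}
```

With this shape, only one promise is pending at any moment, no matter how many items the array holds.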

  • So, first question: why in the world would you want to chain 9000 promises? That is just poor design. Doesn't the API or tool you use offer bulk gets? – basic Jan 03 '19 at 14:45
  • I cannot find any reason for such an amount of promises – messerbill Jan 03 '19 at 14:49
  • If we're talking theoretically, did you try configuring Node to run with more memory? – Maayao Jan 03 '19 at 14:51
  • Haha, OK: I want to download the images and save them in a server directory so a webservice can consult them; I only run this process as a backup of the database – AndresSp Jan 03 '19 at 14:51
  • And is this your own webservice or an external one? – messerbill Jan 03 '19 at 14:52
  • @Maayao But do I need that if I execute one promise, then another promise, etc.? – AndresSp Jan 03 '19 at 14:53
  • @messerbill The consulted webservice is external and has a limit of 20 requests/second – AndresSp Jan 03 '19 at 14:54
  • Maybe this can help you: https://stackoverflow.com/questions/48663080/wait-for-all-different-promise-to-finish-nodejs-async-await/48663158#48663158 – messerbill Jan 03 '19 at 14:55
  • I think Promise.all only cares about whether all promises were resolved, and it allows parallel execution; I tried it. Thanks – AndresSp Jan 03 '19 at 15:00
  • That's true, but I guess this is what you are searching for. You need to do the first 20 using the `all()` method, once finished the next 20, and so on... I only don't know how you get your target URLs – messerbill Jan 03 '19 at 15:02
  • Oh @messerbill, I didn't read your comment, thanks. Hmm, I think I will do that. But does anyone know if Node.js first loads the promises or something before resolving them? – AndresSp Jan 03 '19 at 15:09
  • I'm curious about the sequential execution of many promises; I want to improve my grasp of that concept :( Maybe Node.js loads the promises into memory or something before executing them in series? – AndresSp Jan 03 '19 at 15:14
  • A promise is an event handler and Node will wait until the `resolve` event is fired. This will be the case once the async action (in your case the API call) is finished (in the case of an HTTP request, when the status code is returned) – messerbill Jan 03 '19 at 15:16
  • Take a look at this npm package: https://www.npmjs.com/package/promise-queue – Jaime Jan 03 '19 at 15:24
  • Oh, thanks everyone, I will read your comments and answers tonight :D – AndresSp Jan 03 '19 at 16:15
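The batch-of-20 idea suggested in the comments above could be sketched like this (a sketch only; `inBatches` and `handler` are placeholder names, with `handler` standing in for the download-and-save step):

```javascript
// Run items in chunks: each chunk executes in parallel via Promise.all,
// and we wait for the slowest item in a chunk before starting the next.
async function inBatches(items, handler, batchSize = 20) {
  const results = []
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize)
    results.push(...(await Promise.all(batch.map(handler))))
  }
  return results
}
```

This respects a 20-requests-at-a-time limit, at the cost of idling while the slowest request in each batch finishes.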

1 Answer


I would use something like a worker pool instead of executing things in batches of 20: with batches you always end up waiting for the last one to finish before you start the next 20. Instead, set a limit on how many concurrent downloads you want, so that you never have more than 20 promises in flight and no long chain of 9000.

The same thing can be accomplished with iterators: the same iterator can be passed to different workers, and when one worker takes an item, the next worker will always get the next one.
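The shared-iterator behavior can be seen in isolation with a toy example (no downloading involved; `consume` is a name made up for this demo):

```javascript
// Two consumers pull from the SAME array iterator, so every entry
// is handed out exactly once, never twice.
const shared = ['a', 'b', 'c', 'd'].entries()

async function consume(name, iterator, log) {
  for (const [index, value] of iterator) {
    log.push(`${name} got ${index}:${value}`)
    await Promise.resolve() // yield so the other consumer can take a turn
  }
}

const log = []
Promise.all([consume('w1', shared, log), consume('w2', shared, log)])
  .then(() => console.log(log.length)) // one log entry per item
```

Because both workers drain one iterator, the items are split between them automatically, with no bookkeeping about who owns which index.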

So, with zero dependencies, I would do something like this:

const fs = require('fs')
const path = require('path')
const axios = require('axios')

const sleep = ms => new Promise(rs => setTimeout(rs, ms))

async function sequentialDownload(iterator) {
  for (let [index, url] of iterator) {
    // figure out where to save the file
    const dest = path.resolve(__dirname, 'images', index + '.jpg')
    // download the image as a stream
    const res = await axios.get(url, { responseType: 'stream' })

    // pipe the stream to disk
    const writer = fs.createWriteStream(dest)
    res.data.pipe(writer)

    // wait for the download to complete
    await new Promise(resolve => writer.on('finish', resolve))
    // wait an extra 5 sec
    await sleep(5000)
  }
}

const arr = [url1, url2, url3] // to be downloaded
const workers = new Array(20) // create 20 "workers"
  .fill(arr.entries()) // fill it with same iterator
  .map(sequentialDownload) // start working

Promise.all(workers).then(() => {
  console.log('done downloading everything')
})
Endless