
I have a list of promises, and currently I am using Promise.all to resolve them.

Here is my code for now:

   const pageFutures = myQuery.pages.map(async (pageNumber: number) => {
      const urlObject: any = await this._service.getResultURL(searchRecord.details.id, authorization, pageNumber);
      if (!urlObject.url) {
        // throw error
      }
      const data = await rp.get({
        gzip: true,
        headers: {
          "Accept-Encoding": "gzip,deflate",
        },
        json: true,
        uri: `${urlObject.url}`,
      })

      const objects = data.objects.filter((object: any) => object.type === "observed-data" && object.created);
      return new Promise((resolve, reject) => {
        this._resultsDatastore.bulkInsert(
          databaseName,
          objects
        ).then(succ => {
          resolve(succ)
        }, err => {
          reject(err)
        })
      })
    })
    const all: any = await Promise.all(pageFutures).catch(e => {
       console.log(e)
     }) 

So as you see here, I use Promise.all and it works:

const all: any = await Promise.all(pageFutures).catch(e => {
   console.log(e)
 }) 

However, I noticed it affects the database performance-wise, so I decided to resolve them 3 at a time. For that I was thinking of different approaches like cwait, async-pool, or writing my own iterator, but I get confused about how to do that.

For example when I use cwait:

let promiseQueue = new TaskQueue(Promise,3);
const all=new Promise.map(pageFutures, promiseQueue.wrap(()=>{}));

I do not know what to pass inside wrap, so I pass ()=>{} for now. Plus, I get:

Property 'map' does not exist on type 'PromiseConstructor'. 
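
From what I can tell from the cwait README, the thing to pass into wrap is the async mapper function itself, and with native promises it would be combined with Array.prototype.map and Promise.all instead of Promise.map (which is a Bluebird method, hence the error above). I guess it would look roughly like this, although I have not verified it and _processPage is just a placeholder for the body of my map callback above:

import { TaskQueue } from "cwait";

const promiseQueue = new TaskQueue(Promise, 3); // at most 3 tasks run concurrently

// wrap() takes the async function itself; extra calls wait in the queue.
const pageFutures = myQuery.pages.map(
  promiseQueue.wrap(async (pageNumber: number) => this._processPage(pageNumber))
);

const all = await Promise.all(pageFutures);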

So whatever way gets it working (my own iterator or any library) I am OK with, as long as I have a good understanding of it. I would appreciate it if anyone could shed light on this and help me out of this confusion.

Learner
  • I am not sure what you are trying to achieve. Promise does not have a map method. I think you store all the values in the pageFutures array, so pageFutures.map would be correct... – kumaran Dec 24 '19 at 16:02
  • @kumaran I was trying to follow the cwait library instructions, but they are so confusing. All I need is to execute the promises in pageFutures step by step, so 3 promises in pageFutures should be resolved first, then the next 3... – Learner Dec 24 '19 at 16:06
  • @kumaran if I follow what you said with cwait I get this: const all=new pageFutures.map( promiseQueue.wrap(()=>{})); which is not a valid option – Learner Dec 24 '19 at 16:11
  • What is the actual question here? I can't seem to really tell. Can you please clarify what exactly you want help with? – jfriend00 Dec 24 '19 at 18:31
  • Avoid the [`Promise` constructor antipattern](https://stackoverflow.com/q/23803743/1048572?What-is-the-promise-construction-antipattern-and-how-to-avoid-it)! – Bergi Dec 24 '19 at 20:20

2 Answers


First, you asked a question about a failing solution attempt. That is called an X/Y problem.

So in fact, as I understand your question, you want to delay some DB requests.

You don't want to delay the resolving of a promise created by a DB request... Like, no! Don't try that! The promise will resolve when the DB returns a result. It's a bad idea to interfere with that process.

I banged my head for a while on the library you tried... But I could not solve your issue with it. So I came up with the idea of just looping over the data and setting some timeouts.

I made a runnable demo here: Delaying DB request in small batch

Here is the code. Notice that I simulated some data and a DB request. You will have to adapt it. You also will have to adjust the timeout delay. A full second certainly is too long.

// That part is to simulate some data you would like to save.

// Let's make it a random amount for fun.
let howMuch = Math.ceil(Math.random()*20)

// A fake data array...
let someData = []
for(let i=0; i<howMuch; i++){
  someData.push("Data #"+i)
}

console.log("Some fake data")
console.log(someData)
console.log("")

// So we have some data that look real. (lol)
// We want to save it by small group

// And that is to simulate your DB request.
let saveToDB = (data, dataIterator) => {
  console.log("Requesting DB...")
  return new Promise(function(resolve, reject) {
    resolve("Request #"+dataIterator+" complete.");
  })
}

// Ok, we have everything. Let's proceed!
let batchSize = 3 // The amount of request to do at once.
let delay = 1000  // The delay between each batch.

// Loop through all the data you have.
for(let i=0;i<someData.length;i++){

  if(i%batchSize == 0){
    console.log("Splitting in batch...")

    // Process a batch on one timeout.
    let timeout = setTimeout(() => {

      // An empty line to clarify the console.
      console.log("")

      // Grouping the request by the "batchSize" or less if we're almost done.
      for(let j=0;j<batchSize;j++){

        // If there still is data to process.
        if(i+j < someData.length){

          // Your real database request goes here.
          saveToDB(someData[i+j], i+j).then(result=>{
            console.log(result)
            // Do something with the result.
            // ...
          })

        } // END if there is still data.

      } // END sending requests for that batch.

    }, delay * (i / batchSize))  // Timeout delay: each batch starts "delay" ms after the previous one.

  } // END splitting in batch.

} // END for each data.
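
If you prefer to actually wait for each batch to finish before firing the next one (instead of spacing them out with timeouts), the same idea can be written with async/await. This is only a rough sketch reusing the fake someData, batchSize and saveToDB from above:

// Rough alternative: await each batch before starting the next one.
async function saveInBatches(allData, size){

  for(let i=0; i<allData.length; i+=size){

    // Build the requests for this batch only.
    let batch = []
    for(let j=0; j<size && i+j<allData.length; j++){
      batch.push(saveToDB(allData[i+j], i+j))
    }

    // Wait until the whole batch is done before looping again.
    let results = await Promise.all(batch)
    results.forEach(r => console.log(r))
  }
}

saveInBatches(someData, batchSize)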
Louys Patrice Bessette

First some remarks:

  • Indeed, in your current setup, the database may have to process several bulk inserts concurrently. But that concurrency is not caused by using Promise.all. Even if you had left out Promise.all from your code, it would still have that behaviour. That is because the promises were already created, and so the database requests will be executed anyway (see the short sketch after these remarks).

  • Not related to your issue, but don't use the promise constructor antipattern: there is no need to create a promise with new Promise when you already have a promise in your hands: bulkInsert() returns a promise, so return that one.
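
To illustrate that first remark with a stripped-down sketch (fetchPage here is just a stand-in for the getResultURL/rp.get work):

// All of these async callbacks start running right here, during .map():
const pageFutures = myQuery.pages.map(async (pageNumber) => fetchPage(pageNumber));
// Promise.all only waits for promises that are already in flight; it does not start them:
const results = await Promise.all(pageFutures);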

As your concern is about the database load, I would limit the work initiated by the pageFutures promises to the non-database aspects: they don't have to wait for each other's resolution, so that code can stay as it was.

Let those promises resolve with what you currently store in objects: the data you want to have inserted. Then concatenate all those arrays together to one big array, and feed that to one database bulkInsert() call.

Here is how that could look:

const pageFutures = myQuery.pages.map(async (pageNumber: number) => {
  const urlObject: any = await this._service.getResultURL(searchRecord.details.id, 
                                                         authorization, pageNumber);
  if (!urlObject.url) {
    // throw error
  }
  const data = await rp.get({
    gzip: true,
    headers: { "Accept-Encoding": "gzip,deflate" },
    json: true,
    uri: `${urlObject.url}`,
  });
  // Return here, don't access the database yet...
  return data.objects.filter((object: any) => object.type === "observed-data" 
                                           && object.created);
});
const all: any = (await Promise.all(pageFutures).catch(e => {
  console.log(e);
  return []; // in case of error, still return an array
})).flat(); // flatten it, so all data chunks are concatenated in one long array
// Don't create a new Promise with `new`, only to wrap another promise.
//   It is an antipattern. Use the promise returned by `bulkInsert`.
return this._resultsDatastore.bulkInsert(databaseName, all);

This uses .flat(), which is rather new. In case you have no support for it, look at the alternatives provided on MDN.
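
For instance, one level of flattening can be done without .flat() like this, where chunks stands for the array of page arrays that Promise.all resolves to:

// Same result as chunks.flat() for one level of nesting:
const all = [].concat(...chunks);
// or, without spread syntax:
// const all = chunks.reduce((acc, chunk) => acc.concat(chunk), []);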

trincot