
I have a bulk-create-participants function that uses Promise.allSettled to send 100 axios POST requests. The backend is Express and the frontend is React. Each request calls the single "add new participant" REST API. I have set the backend timeout to 15s using connect-timeout, and the frontend timeout is 10s.
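For reference, this is roughly how the timeouts are set up on both sides (simplified; the route path and file layout here are just illustrative, not my exact code):

// server.js (Express), 15s request timeout via connect-timeout
const express = require("express")
const timeout = require("connect-timeout")

const app = express()
app.use(timeout("15s"))
app.use(express.json())

// api.js (React side), shared axios instance with a 10s timeout
import axios from "axios"

const api = axios.create({ baseURL: "/api", timeout: 10000 })

export const createParticipant = (participantNewDetail) =>
    api.post("/participants", participantNewDetail)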

My issue is that when I click the bulk add button, the bulk create is triggered and the concurrent requests from Promise.allSettled start. However, I cannot send any new request until all of the concurrent requests are done, and because I have a timeout set up on the frontend, the new request gets cancelled.

Is there a way I can still make the concurrent requests, but without them blocking other new requests?

This is the frontend code; createParticipant is the API request.

const PromiseArr = []
for (let i = 0; i < totalNumber; i++) {
    const participant = participantList[i]
    const participantNewDetail = {
        firstName: participant.firstName,
        lastName: participant.lastName,
        email: participant.email,
    }
    PromiseArr.push(
        createParticipant(participantNewDetail)
            .then((createParticipantResult) => {
                // Advance the progress bar after each successful request
                processedTask++
                processMessage = `Processing adding participant`
                dispatch({ type: ACTIVATE_PROCESS_PROCESSING, payload: { processedTask, processMessage } })
            })
            .catch((error) => {
                // Count failures toward progress too, then re-throw so allSettled records a rejection
                processedTask++
                processMessage = `Processing adding participant`
                dispatch({ type: ACTIVATE_PROCESS_PROCESSING, payload: { processedTask, processMessage } })
                throw new Error(
                    JSON.stringify({
                        status: "failed",
                        value: error.data?.message ?? error,
                    })
                )
            })
    )
}
const addParticipantResults = await Promise.allSettled(PromiseArr)

PromiseArr is the promise array, with length 100.

Is it possible to split this big request into smaller promise arrays and send them to the backend in pieces, so that in the gap between each batch I can send another new request such as retriveUserDetail?


Moon

1 Answer


If you're sending 100 requests at a time to your server, that's just going to take a while for the server to process. It would be best to find a way to combine them all into one request, or into a very small number of requests. Some server APIs have efficient ways of doing multiple queries in one request.
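If the backend can be changed, a rough sketch of that idea might look like this. The /participants/bulk route and the saveParticipant helper are hypothetical, just to show the shape:

// Hypothetical Express bulk endpoint: accepts all participants in one request
// (assumes express.json() body parsing is enabled)
app.post("/participants/bulk", async (req, res) => {
    const participants = req.body.participants || []
    const results = []
    for (const p of participants) {
        try {
            const saved = await saveParticipant(p)   // your existing single-insert logic
            results.push({ status: "fulfilled", value: saved })
        } catch (err) {
            results.push({ status: "rejected", reason: err.message })
        }
    }
    // One response, but per-item statuses are preserved for the progress UI
    res.json(results)
})

// Frontend: one request instead of 100
const { data: results } = await axios.post("/participants/bulk", { participants: participantList })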

If you can't do that, then you should probably send them 5-10 at a time max, so the server isn't being asked to handle so many simultaneous requests. Otherwise your additional request goes to the end of the line and takes too long to process. Sending smaller batches lets you send other things and get them processed while you're chunking away on the 100, without waiting for all of them to finish.
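As a simplified illustration of that chunking idea (the helpers linked below are more robust; this just shows the shape, reusing the question's own createParticipant):

// Send the participants in batches of 5, awaiting each batch before starting
// the next one, so other requests can slot in between batches.
const BATCH_SIZE = 5
const allResults = []
for (let i = 0; i < participantList.length; i += BATCH_SIZE) {
    const batch = participantList.slice(i, i + BATCH_SIZE)
    const batchResults = await Promise.allSettled(
        batch.map((p) => createParticipant({
            firstName: p.firstName,
            lastName: p.lastName,
            email: p.email,
        }))
    )
    allResults.push(...batchResults)
    // a per-batch progress dispatch could go here
}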

If this is being done from a browser, you also have some browser safeguard limitations to deal with: the browser refuses to send more than N requests to the same host at a time. If you send more than that, it queues them up and holds onto them until some prior requests have completed. This keeps one client from massively overwhelming the server, but it also creates a long line of requests that any new request has to go to the end of. The way to deal with that is to never send more than a small number of requests to the same host at once; then that queue/line will be short when you want to send a new request.

You can look at these snippets of code that let you process an array of data N-at-a-time rather than all at once. Each of these has slightly different control options so you can decide which one fits your problem the best.

mapConcurrent() - Process an array with no more than N requests in flight at the same time

pMap() - Similar to mapConcurrent with more argument checking

rateLimitMap() - Process max of N requestsPerSecond

runN() - Allows you to continue processing upon error

These all replace both Promise.all() and whatever code you had for iterating your data, launching all the requests, and collecting the promises into an array. Each function takes an input array of data and a function that gets passed an item of the data and returns a promise resolving to the result of that request; it returns a promise that resolves to an array of results in the original array order (the same return value as Promise.all()).
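To make that calling contract concrete, here is a minimal sketch of a helper with that shape. It is not the exact code from the linked snippets, which have more argument checking and options:

// Minimal N-at-a-time mapper: resolves to results in the original order,
// like Promise.all(), while never having more than `limit` requests in flight.
function mapConcurrent(items, limit, fn) {
    return new Promise((resolve, reject) => {
        const results = new Array(items.length)
        let nextIndex = 0
        let active = 0
        let finished = 0

        function launchNext() {
            while (active < limit && nextIndex < items.length) {
                const index = nextIndex++
                active++
                Promise.resolve(fn(items[index], index))
                    .then((value) => {
                        results[index] = value
                        active--
                        finished++
                        if (finished === items.length) resolve(results)
                        else launchNext()
                    })
                    .catch(reject)   // reject on first error, like Promise.all()
            }
        }

        if (items.length === 0) resolve(results)
        else launchNext()
    })
}

// Usage with the question's data, e.g. 5 at a time:
// const addResults = await mapConcurrent(participantList, 5, createParticipant)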

jfriend00
  • Thanks for your explanation. The reason I don't combine them into one request is that I have a progress bar showing the status of each one when it finishes, so I need to keep them separate. I will check what you mentioned. – Moon Jun 03 '21 at 12:36