
I have to split a big file into hundreds of parts and upload those parts to the server.

How do I limit the upload requests to at most 5 running simultaneously?

this.uploadFileChunks.map(uploadPart => doUpload)...
newBike
  • What have you tried so far? [`async.mapLimit`](https://caolan.github.io/async/v3/docs.html#mapLimit) may be of help. – Victor Dec 03 '19 at 18:21
  • Does this answer your question? [How to run "x" promises in parallel Javascript](https://stackoverflow.com/questions/52022827/how-to-run-x-promises-in-parallel-javascript) – SouXin Dec 03 '19 at 18:24

3 Answers


This is similar to throttling. You should be able to find a few libraries on npm by searching for "throttle promise" or "promise limit".
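For example, one package that turns up in that search is p-limit. A minimal sketch of using it, assuming doUpload takes a part and returns a promise:

import pLimit from 'p-limit'

// at most 5 doUpload calls are in flight at any moment; further calls
// are queued by the limiter until a slot frees up
const limit = pLimit(5)

const uploads = this.uploadParts.map(part => limit(() => doUpload(part)))

Promise.all(uploads).then(results => {
    // results holds each part's response, in the original order
})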

Also, here's a simple implementation:

// Allows a function to be run at most n times at once. If that limit
// is reached, further calls are queued and wait until a spot opens up.
//
// n (int): number of promises to run concurrently
// f (function): must return a Promise.
const nConcurrent = (n, f) => {
    let numRunning = 0
    const queue = []

    // start the oldest queued call, if any (FIFO, so early calls
    // aren't starved)
    const runNext = () => {
        if(queue.length) {
            runOne(queue.shift())
        }
    }

    const runOne = ({args, resolve, reject}) => {
        numRunning++

        return f.apply(null, args)
            .then(result => {
                numRunning--
                runNext()
                resolve(result)
            })
            .catch(err => {
                // a failed call must free its slot too, or the pool
                // would shrink with every error
                numRunning--
                runNext()
                reject(err)
            })
    }

    return (...args) => {
        return new Promise((resolve, reject) => {
            if(numRunning >= n) {
                queue.push({args, resolve, reject})
            }
            else {
                runOne({args, resolve, reject})
            }
        })
    }
}

Your solution would then look like:

const doUploadLimited = nConcurrent(5, doUpload)
this.uploadParts.map(doUploadLimited)
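
If you also need to know when every part has finished (or whether any failed), collect the wrapped promises; a small sketch, assuming doUpload resolves with the server response:

Promise.all(this.uploadParts.map(doUploadLimited))
    .then(results => {
        // all parts uploaded; results are in the original order
    })
    .catch(err => {
        // at least one part failed
    })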
Cully
  • I'm kind of confused by the complex logic. Let me spend more time on your solution. Thanks a lot. – newBike Dec 04 '19 at 23:19
  • @newBike I just put that there as an example. Though I have unit tested the above, you're probably better off using one of the libraries I linked to or another one you find on npm. – Cully Dec 05 '19 at 04:04

If you want to limit the uploads to 5 simultaneous requests, you could perhaps use async.mapLimit; example:

async.mapLimit(this.uploadParts, 5, (part, onDone) => {
  axios.post(/* ... */) // or whatever API you want to use
      .then((res) => onDone(undefined, res))
      .catch(onDone)
}, (err, results) => {
  // if err is undefined, then results is an array of 
  // responses, or whatever you called your onDone callback
  // with.
});

Another (pretty reasonable) approach is to split this.uploadParts into batches of no more than 5 elements and run async.map, without a limit, on one batch at a time.
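
A dependency-free sketch of that batching idea, assuming doUpload returns a promise: upload the parts in slices of 5 and wait for a whole slice before starting the next one.

async function uploadInBatches(parts, batchSize = 5) {
    const results = []
    for(let i = 0; i < parts.length; i += batchSize) {
        const batch = parts.slice(i, i + batchSize)
        // start the whole batch at once, then wait for all of it
        const responses = await Promise.all(batch.map(doUpload))
        results.push(...responses)
    }
    return results
}

Note that a new batch only starts once the slowest request of the previous batch has finished, so this is slightly less efficient than a true concurrency limit.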

Hope this helps.

Victor
  • I'm so sorry. Third-party libs are not allowed in this project. – newBike Dec 04 '19 at 22:45
  • Then look at the source code and try to understand it. Understanding the code behind asyncjs is a very good thing on its own anyway. – Victor Dec 05 '19 at 07:10

How about distributing your data among a handful of bags, then uploading each bag's parts sequentially, with the bags running in parallel:

let N = 347 // total record count
let M = 5   // number of bags you want

// let me mock your data real quick...
let data = Array(N)
for(var i = 0; i < N; i++){
    data[i] = i
}

// create bags...
let bagSize = Math.ceil(N / M) // round up so every record lands in a bag
let bags = Array(M)
for(var i = 0; i < M; i++) {
    let bag = data.slice(i*bagSize, (i+1)*bagSize)
    bags[i] = bag
    // ...and use immediately: chain this bag's uploads one after
    // another, so with M bags in parallel at most M uploads run at once
    bag.reduce(
        (prev, uploadPart) => prev.then(() => doUpload(uploadPart)),
        Promise.resolve()
    )
}

// ... or later
A. Gldmn