I have to split a big file into hundreds of upload parts and upload those parts to the server.
How do I limit the number of upload requests so that at most 5 run simultaneously?
this.uploadFileChunks.map(uploadPart => doUpload)...
This is similar to throttling. You should be able to find a few libraries on npm by searching for "throttle promise" or "promise limit".
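For instance, here is a minimal sketch with the p-limit package (an assumption on my part that it fits your setup; depending on the version you may need an ES module import instead of require, and doUpload is assumed to return a Promise):

const pLimit = require('p-limit')

const limit = pLimit(5) // at most 5 uploads in flight at any time

// wrap every upload in the limiter; extra calls wait for a free slot
const uploads = this.uploadFileChunks.map(
  uploadPart => limit(() => doUpload(uploadPart))
)

Promise.all(uploads)
  .then(results => console.log('all parts uploaded'))
  .catch(err => console.error('an upload failed', err))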
Also, here's a simple implementation:
// only allows a function to be run a certain number of times at once. if
// that number is reached, will queue the other function calls and wait
// until a spot opens up.
//
// n (int): number of promises to run concurrently
// f (function): must return a Promise.
const nConcurrent = (n, f) => {
  let numRunning = 0
  let queue = []

  const runOne = ({args, resolve, reject}) => {
    numRunning++
    return f.apply(null, args)
      .then(result => {
        numRunning--
        if (queue.length) {
          runOne(queue.shift()) // start the oldest queued call
        }
        resolve(result)
      })
      .catch(err => {
        numRunning-- // free the slot even when the call fails
        if (queue.length) {
          runOne(queue.shift())
        }
        reject(err)
      })
  }

  return (...args) => {
    return new Promise((resolve, reject) => {
      if (numRunning >= n) {
        queue.push({args, resolve, reject})
      }
      else {
        runOne({args, resolve, reject})
      }
    })
  }
}
Your solution would then look like:
const doUploadLimited = nConcurrent(5, doUpload)
this.uploadParts.map(doUploadLimited)
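Since the wrapped function still returns a promise per call, you can wait for the whole batch the usual way; a minimal usage sketch:

const doUploadLimited = nConcurrent(5, doUpload)

Promise.all(this.uploadParts.map(doUploadLimited))
  .then(results => console.log('all parts uploaded', results.length))
  .catch(err => console.error('an upload failed', err))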
If you want to limit the uploads to 5 simultaneous requests, you could perhaps use async.mapLimit; for example:
async.mapLimit(this.uploadParts, 5, (part, onDone) => {
  axios.post(/* ... */) // or whatever API you want to use
    .then((res) => onDone(undefined, res))
    .catch(onDone)
}, (err, results) => {
  // if err is undefined, then results is an array of
  // responses, or whatever you called your onDone callback
  // with.
});
Another approach (pretty reasonable) is to have your this.uploadParts array contain no more than 5 elements at a time, and use async.map without a limit (a rough sketch follows).
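A minimal sketch of that batching idea, assuming async is installed and doUpload returns a Promise (uploadInBatches is just an illustrative helper name):

const async = require('async')

// upload the parts in groups of at most 5, one group after another
const uploadInBatches = (parts, done) => {
  if (parts.length === 0) return done(null, [])
  const batch = parts.slice(0, 5) // no more than 5 elements at a time
  async.map(batch, (part, cb) => {
    doUpload(part).then(res => cb(null, res)).catch(cb)
  }, (err, results) => {
    if (err) return done(err)
    // recurse on the remaining parts once this batch has finished
    uploadInBatches(parts.slice(5), (err2, rest) => {
      if (err2) return done(err2)
      done(null, results.concat(rest))
    })
  })
}

uploadInBatches(this.uploadParts, (err, results) => {
  // results holds all responses once every batch has completed
})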
Hope this helps.
How about distributing your data among a few bags and then uploading each bag:
let N = 347 // total record count
let M = 5   // number of bags you want

// let me mock your data real quick...
let data = Array(N)
for (let i = 0; i < N; i++) {
  data[i] = i
}

// create bags...
let bagSize = Math.ceil(N / M)
let bags = Array(M)
for (let i = 0; i < M; i++) {
  let bag = data.slice(i * bagSize, (i + 1) * bagSize)
  bags[i] = bag
  // ...and use immediately
  bag.map(uploadPart => doUpload(uploadPart))
}
// ... or later
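One caveat: bag.map(...) still fires every upload in a bag at once, so this alone does not cap concurrency at 5. If each bag instead uploads its parts one after another, you get at most M (here 5) requests in flight. A hedged sketch of the "... or later" path, where uploadBagSequentially is an illustrative helper and doUpload is assumed to return a Promise:

// upload one bag's parts strictly one at a time
const uploadBagSequentially = bag =>
  bag.reduce(
    (chain, uploadPart) => chain.then(results =>
      doUpload(uploadPart).then(res => results.concat(res))),
    Promise.resolve([])
  )

// start all M bags in parallel -> at most M (here 5) requests in flight
Promise.all(bags.map(uploadBagSequentially))
  .then(resultsPerBag => console.log('done', resultsPerBag))
  .catch(err => console.error('an upload failed', err))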