I have been playing around with the following code, trying to reduce my current AWS uploader to a few lines. It works well, but I really need a progress indicator: for larger files or folder uploads the user has no idea how long the upload is going to take.
I am new to promises and async/await; my current code uses callbacks.
I have seen this question: ES6 Promise.all progress
Is it possible to get a callback for each file that uploads successfully? That way I can take the total number of files to upload and output progress as a percentage, incrementing it as each upload completes.
Here is my current code.
const AWS = require("aws-sdk");
const fs = require('fs');
const path = require('path');

const s3 = new AWS.S3();
const params = [];

// Build an upload parameter object for every file in the folder
fs.readdirSync('/Users/dave/Desktop/uploadit').forEach(file => {
    const { ext, name } = path.parse(file);
    // Sanitise the file name for use as the S3 key
    const fl = name.replace(/[^a-z0-9/]/gi, '_').toLowerCase();
    params.push({
        Bucket: 's3b-south-amercia',
        Key: fl + ext,
        Body: fs.readFileSync('/Users/dave/Desktop/uploadit/' + file)
    });
});

const total_files = params.length;

// Start all uploads in parallel and wait for them to finish
const start = async function() {
    console.time('test');
    const responses = await Promise.all(
        params.map(param => s3.upload(param).promise())
    );
    console.timeEnd('test');
};

// Call start
start();
// time aws s3 cp /Users/dave/Desktop/uploadit s3://s3b-south-amercia/ --recursive
// AWS CLI  26.295s / 24.684s
// AWS NODE 27.859s / 20.483s
I have tested the code against the AWS CLI cp command and both have similar upload times.
Any help, or a pointer in the right direction on how to implement a per-file callback with Promise.all, would be greatly appreciated. Maybe Promise.all is not designed to do this; I am not sure.
Thanks
Working code, thanks to @nobody ;)
const AWS = require("aws-sdk");
const fs = require('fs');
const path = require('path');

const s3 = new AWS.S3();
const params = [];

// Build an upload parameter object for every file in the folder
fs.readdirSync('/Users/dave/Desktop/uploadit').forEach(file => {
    const { ext, name } = path.parse(file);
    // Sanitise the file name for use as the S3 key
    const fl = name.replace(/[^a-z0-9/]/gi, '_').toLowerCase();
    params.push({
        Bucket: 's3b-south-amercia',
        Key: fl + ext,
        Body: fs.readFileSync('/Users/dave/Desktop/uploadit/' + file)
    });
});
let progress = 1;
const total_files = params.length;
console.log('total_files', total_files);

// Wrap each upload promise so progress is logged as each upload completes
const onProgress = async promise => {
    const result = await promise;
    console.log(progress / total_files * 100 + '%');
    progress++;
    return result;
};
// Start all uploads in parallel, reporting progress as each one completes
const start = async function() {
    console.time('test');
    const responses = await Promise.all(
        params.map(param => onProgress(s3.upload(param).promise()))
    );
    console.timeEnd('test');
};

// Call start
start();
// time aws s3 cp /Users/dave/Desktop/uploadit s3://s3b-south-amercia/ --recursive
// AWS CLI  26.295s / 24.684s
// AWS NODE 27.859s / 20.483s
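If you also want byte-level progress for each individual file (rather than just a count of completed uploads), s3.upload() returns a ManagedUpload object that emits an httpUploadProgress event before you turn it into a promise. Below is a minimal sketch of that idea, assuming the same s3 client and params array as above; uploadWithByteProgress is just a name I made up for the wrapper.

// Sketch: per-file byte progress via the ManagedUpload httpUploadProgress event
const uploadWithByteProgress = param => {
    const managed = s3.upload(param); // AWS.S3.ManagedUpload
    managed.on('httpUploadProgress', evt => {
        // evt.total is known here because Body is a Buffer from readFileSync
        const pct = (evt.loaded / evt.total * 100).toFixed(1);
        console.log(param.Key + ': ' + pct + '%');
    });
    return managed.promise();
};

// Drop-in replacement for the map() call inside start():
// const responses = await Promise.all(params.map(uploadWithByteProgress));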
UPDATE:
Just to inform anyone using this code: when you run it on a folder with a large number of files, you will almost certainly come across this error:
RequestTimeTooSkewed: The difference between the request time and the current time is too large
I have created a ticket, with a workaround, here:
https://github.com/aws/aws-sdk-js/issues/3757
But hopefully a fix will be applied to the SDK soon.
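As far as I can tell, the error happens because every request is created (and signed) up front and then queues behind a limited number of connections, so the later requests go out long after their signatures were generated. One generic way to reduce the risk in the meantime (a sketch of the idea, not necessarily the exact workaround from the issue above) is to cap how many uploads are in flight at once. The batch size of 50 is an arbitrary assumption; the sketch reuses the params, s3 and onProgress from the code above.

// Sketch: upload in limited-size batches instead of firing every request at once
const BATCH_SIZE = 50; // arbitrary; tune for your connection

const startBatched = async () => {
    console.time('test');
    for (let i = 0; i < params.length; i += BATCH_SIZE) {
        const batch = params.slice(i, i + BATCH_SIZE);
        // Requests in this batch are only created shortly before they are sent,
        // so they spend less time queued after being signed
        await Promise.all(
            batch.map(param => onProgress(s3.upload(param).promise()))
        );
    }
    console.timeEnd('test');
};

// startBatched(); // call this instead of start()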