
I have been playing around with the following code, trying to reduce my current AWS uploader to a few lines of code. This code works great, but I really need a progress indicator to monitor upload progress, as for larger files or folder uploads the user has no idea how long it's going to take.

I am new to promises, async/await, etc.; my current code uses callbacks.

I have seen this question: ES6 Promise.all progress

Is it possible to get a callback for each file uploaded successfully? That way I can take the total number of files to upload and output a progress percentage, incrementing it as each upload completes.
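For reference, the pattern from the linked question boils down to wrapping each promise so it bumps a shared counter when it settles. A minimal sketch (the names `trackProgress` and `tasks` are illustrative, not from the AWS code below):

```javascript
// Wrap each promise so a shared counter is incremented as soon as
// that promise resolves, then report the completed fraction.
function trackProgress(promises, onEach) {
  let done = 0;
  return Promise.all(
    promises.map(p =>
      p.then(result => {
        done++;
        onEach(done / promises.length);
        return result;
      })
    )
  );
}

// Example with timers standing in for uploads:
const tasks = [10, 20, 30].map(ms => new Promise(r => setTimeout(() => r(ms), ms)));
trackProgress(tasks, fraction => console.log(`${Math.round(fraction * 100)}%`))
  .then(results => console.log(results)); // logs 33%, 67%, 100%, then [10, 20, 30]
```

Because `then` returns the original result, `Promise.all` still resolves with the results in their original order.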

Here is my current code.

let AWS = require("aws-sdk");
const fs = require('fs');
const path = require('path');

const s3 = new AWS.S3();
const params = [];

fs.readdirSync('/Users/dave/Desktop/uploadit').forEach(file => {
    const { ext, name, dir } = path.parse(file);
    let fl = name.replace(/[^a-z0-9/]/gi, '_').toLowerCase();
    params.push({ 
        Bucket: 's3b-south-amercia', 
        Key: fl + ext, 
        Body: fs.readFileSync('/Users/dave/Desktop/uploadit/' + file)
    });
});

var total_files = params.length;

// Start function
const start = async function(a, b) {
    
    console.time('test');

    const responses = await Promise.all(
        params.map(param => s3.upload(param).promise())
    );

    console.timeEnd('test');

}

// Call start
start();

// time aws s3 cp /Users/dave/Desktop/uploadit s3://s3b-south-amercia/ --recursive
// AWS CLI 26.295 / 24.684s
// AWS NODE 27.859s / 20.483

I have tested the code against the AWS CLI cp command and they both have similar upload times.

Any help, or a pointer in the right direction on how to implement a callback with Promise.all, would be greatly appreciated. Maybe Promise.all is not designed to do this, I am not sure.

Thanks

Working code, thanks to @Nobody ;)

let AWS = require("aws-sdk");
const fs = require('fs');
const path = require('path');

const s3 = new AWS.S3();
const params = [];

fs.readdirSync('/Users/dave/Desktop/uploadit').forEach(file => {
    const { ext, name, dir } = path.parse(file);
    let fl = name.replace(/[^a-z0-9/]/gi, '_').toLowerCase();
    params.push({ 
        Bucket: 's3b-south-amercia', 
        Key: fl + ext, 
        Body: fs.readFileSync('/Users/dave/Desktop/uploadit/' + file)
    });
});

var progress = 1;
var total_files = params.length;

console.log('total_files', total_files);

const onProgress = async promise => {
    const result = await promise;
    console.log(progress / total_files * 100 + '%');
    progress++;
    return result;
};

// Start function
const start = async function(a, b) {
    
    console.time('test');

    const responses = await Promise.all(
        params.map(param => onProgress(s3.upload(param).promise()))
    );
    /*const responses = await Promise.all(
        params.map(param => s3.upload(param).promise())
    );*/

    console.timeEnd('test');

}

// Call start
start();

// time aws s3 cp /Users/dave/Desktop/uploadit s3://s3b-south-amercia/ --recursive
// AWS CLI 26.295 / 24.684s
// AWS NODE 27.859s / 20.483

UPDATE:

Just to inform anyone using this code: when you run it on a folder with a large number of files, you will almost certainly come across this error.

RequestTimeTooSkewed: The difference between the request time and the current time is too large

I have created a ticket here, with a workaround:

https://github.com/aws/aws-sdk-js/issues/3757

But hopefully a fix will be applied to the SDK soon.
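In the meantime, one common mitigation (an assumption on my part; see the issue above for the actual workaround) is to cap how many uploads are in flight, so each request is signed close to when it is actually sent. A minimal batching sketch, with illustrative names (`uploadInBatches`, `chunkSize`):

```javascript
// Upload in fixed-size batches instead of firing every request at once.
// Only `chunkSize` uploads are in flight at a time, so requests are not
// signed long before the socket becomes available to send them.
async function uploadInBatches(params, uploadFn, chunkSize = 50) {
  const responses = [];
  for (let i = 0; i < params.length; i += chunkSize) {
    const batch = params.slice(i, i + chunkSize);
    responses.push(...await Promise.all(batch.map(uploadFn)));
  }
  return responses;
}

// Usage with the S3 client from above:
// const responses = await uploadInBatches(params, p => s3.upload(p).promise());
```

The v2 SDK also exposes a `correctClockSkew: true` client option that retries requests after compensating for detected clock skew, which may help here as well.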

user1503606
  • When you call an async function it returns a promise that will be resolved some time in the future. Using Promise.all() you can combine all the promises. – Rahul Kumar May 07 '21 at 10:14

2 Answers


You can wrap s3.upload with your own function; as long as it returns the original response, you'll have a hook into each of the promises inside the Promise.all.

const onDone = async promise => {
  const result = await promise;
  console.log("File upload complete");
  // Or add +1 to some variable to calculate percentage of completed uploads
  return result;
};

const responses = await Promise.all(
  params.map(param => onDone(s3.upload(param).promise()))
);
Nobody

Here's an example of using a custom promise library that supports advanced progress capturing out of the box (the concurrency limit was added just for the demo): Live sandbox

import { CPromise } from "c-promise2";

(async () => {
  const results = await CPromise.all(
    [
      "filename1.txt",
      "filename2.txt",
      "filename3.txt",
      "filename4.txt",
      "filename5.txt",
      "filename6.txt",
      "filename7.txt"
    ],
    {
      async mapper(filename) {
        console.log(`load and push file [${filename}]`);
        // your async code here to upload a single file
        return CPromise.delay(1000, `operation result for [${filename}]`);
      },
      concurrency: 2
    }
  ).progress((p) => {
    console.warn(`Uploading: ${(p * 100).toFixed(1)}%`);
  });

  console.log(results);
})();
Dmitriy Mozgovoy