
I am trying to upload a file to an S3 bucket. The following is the code I am using to accomplish this operation.

// `client` is created with the `s3` npm package
var s3 = require('s3');
var client = s3.createClient({
  s3Options: { /* accessKeyId, secretAccessKey, region, etc. */ }
});

var params = {
  localFile: "../Processor/1.wav",

  s3Params: {
    Bucket: "bucketname",
    Key: "1.wav",
  },
};
var uploader = client.uploadFile(params);
uploader.on('error', function(err) {
  console.error("unable to upload:", err.stack);
});
uploader.on('progress', function() {
  console.log("progress", uploader.progressMd5Amount,
            uploader.progressAmount, uploader.progressTotal);
});
uploader.on('end', function() {
  console.log("done uploading");
});

Everything works fine up to this point. Now let's say I want to upload 5 files from a local path to the S3 bucket. How can I achieve that? Is there any direct method Amazon provides for multiple file uploads, or do I need to use the async module?

Kishore Indraganti

1 Answer


The Node.js AWS SDK doesn't have a bulk S3 upload method, so I'd suggest using `Promise.all` with async/await to upload multiple files concurrently. Here's an example:

const AWS = require('aws-sdk')

const s3 = new AWS.S3()
const params = [
  { Bucket: 'bucket', Key: 'key', Body: 'body' },
  { Bucket: 'bucket', Key: 'key2', Body: 'body2' },
  { Bucket: 'bucket', Key: 'key3', Body: 'body3' }
]

async function uploadAll () {
  // Start all uploads at once and wait for every one to finish
  const responses = await Promise.all(
    params.map(param => s3.upload(param).promise())
  )
  console.log(responses)
}

uploadAll().catch(console.error)
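
Since the question is about local files, here's a minimal sketch of the same approach that streams the files from disk instead of passing string bodies; the file names and local directory are hypothetical, so adjust them to your paths:

const fs = require('fs')
const path = require('path')
const AWS = require('aws-sdk')

const s3 = new AWS.S3()

// Hypothetical local files to upload; replace with your own
const files = ['1.wav', '2.wav', '3.wav', '4.wav', '5.wav']

async function uploadLocalFiles () {
  const responses = await Promise.all(
    files.map(name =>
      s3.upload({
        Bucket: 'bucketname',
        Key: name,
        // Stream each file from disk instead of loading it into memory
        Body: fs.createReadStream(path.join('../Processor', name))
      }).promise()
    )
  )
  console.log(responses.map(r => r.Location))
}

uploadLocalFiles().catch(console.error)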
Scaccoman
    Do you know how to process progress with this implementation? – Marcelo Formentão Oct 19 '20 at 00:17
    @MarceloFormentão `Promise.all` doesn't make it very straightforward to process progress, but it is indeed possible; please check out the answers to this question: https://stackoverflow.com/questions/42341331/es6-promise-all-progress – Scaccoman Oct 20 '20 at 09:16
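
For the progress question in the comments, here is a minimal sketch of one way to do it with the SDK's `ManagedUpload`, which emits an `httpUploadProgress` event per upload; the `params` array is the same hypothetical one as in the answer:

const AWS = require('aws-sdk')

const s3 = new AWS.S3()
const params = [
  { Bucket: 'bucket', Key: 'key', Body: 'body' },
  { Bucket: 'bucket', Key: 'key2', Body: 'body2' }
]

async function uploadWithProgress () {
  const responses = await Promise.all(
    params.map(param => {
      const managed = s3.upload(param)
      // Log per-file progress as bytes are sent
      managed.on('httpUploadProgress', evt => {
        console.log(`${param.Key}: ${evt.loaded}/${evt.total || '?'} bytes`)
      })
      return managed.promise()
    })
  )
  console.log('all done', responses.length)
}

uploadWithProgress().catch(console.error)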