
I want to upload a file from my frontend to Amazon S3 (AWS).

I'm using Dropzone, so I convert my file and send it to my backend.

In my backend, the file looks like this:

{
  fieldname: 'file',
  originalname: 'test.torrent',
  encoding: '7bit',
  mimetype: 'application/octet-stream',
  buffer: { type: 'Buffer', data: [Array] },
  size: 7449
}

and when I try to upload my file with this function:

var file = data.patientfile.file.buffer;

var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: file };

s3.upload(params, function (err, data) {
    if (err) {
        console.log("******************", err);
    } else {
        console.log("Successfully uploaded data to myBucket/myKey");
    }
});

I get this error:

Unsupported body payload object

Do you know how I can send my file?

I have tried sending it with putObject and got a similar error.

– tetar
  • Possible duplicate of [Upload a file to Amazon S3 with NodeJS](https://stackoverflow.com/questions/28018855/upload-a-file-to-amazon-s3-with-nodejs) – PsyGik Jul 19 '19 at 11:39
  • I don't have a path for my file, so this example is not working. All my data is in a buffer. – tetar Jul 19 '19 at 11:41

3 Answers


I think you might need to convert the file content (which in this case is probably data.patientfile.file.buffer) to binary:

var base64data = Buffer.from(data, 'binary'); // Buffer.from replaces the deprecated new Buffer()

so the params would be like:

var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: base64data };

Or if I'm mistaken and the buffer is already in binary, then you can try:

var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: data.patientfile.file.buffer};
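
Note that if the buffer arrived JSON-serialized, as the { type: 'Buffer', data: [...] } shape in the question suggests, then Body is a plain object rather than a Buffer, which is exactly what triggers "Unsupported body payload object". A minimal sketch of rebuilding a real Buffer from that shape, reusing the question's variable names:

// Sketch: rebuild a real Buffer from the JSON-serialized
// { type: 'Buffer', data: [...] } shape before uploading.
var rawBuffer = Buffer.from(data.patientfile.file.buffer.data);

var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: rawBuffer };

s3.upload(params, function (err, result) {
    if (err) {
        console.log("Upload failed:", err);
    } else {
        console.log("Successfully uploaded to", result.Location);
    }
});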
– wakakak

This is my production code, and it is working.

Please note the issue can happen at data1111.

To give the full picture, all the key parts of the working code are included below.

client:

// html

<input
  type="file"
  onChange={this.onFileChange}
  multiple
/>


// javascript

onFileChange = event => {
    const files = event.target.files;
    var file = files[0];
    var reader = new FileReader();
    reader.onloadend = function(e) {

        // save this data1111 and send to server
        let data1111 = e.target.result // reader.result // ----------------- data1111

    };
    reader.readAsBinaryString(file);
}
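
To get data1111 to the server, something like the following should work; the /upload endpoint name and the JSON body shape are assumptions here, not fixed parts of this answer:

// Sketch: POST data1111 to the backend as JSON. This belongs inside
// the reader.onloadend callback above, where data1111 is in scope.
// The /upload endpoint is a placeholder.
fetch('/upload', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ data1111: data1111 })
})
    .then(res => res.json())
    .then(result => console.log('uploaded:', result))
    .catch(err => console.error('upload failed:', err));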

server:

// node.js/ javascript

const response = await s3
  .upload({
    Bucket: s3Bucket, // bucket
    Key: s3Path, // folder/file
    // receiving at the server - data1111 - via request body (or other)
    Body: Buffer.from(req.body.data1111, "binary") // ----------------- data1111
  })
  .promise();
return response;
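
For completeness, a sketch of the Express route this snippet could live in; the /upload path, the JSON body parser, and its size limit are assumptions:

// Sketch of a receiving route; assumes s3, s3Bucket and s3Path are
// defined as in the snippet above, and that the client sends
// { data1111: "..." } as JSON.
const express = require('express');
const app = express();
app.use(express.json({ limit: '50mb' })); // binary strings can be large

app.post('/upload', async (req, res) => {
    try {
        const response = await s3
            .upload({
                Bucket: s3Bucket,
                Key: s3Path,
                Body: Buffer.from(req.body.data1111, 'binary')
            })
            .promise();
        res.json({ location: response.Location });
    } catch (err) {
        res.status(500).json({ error: err.message });
    }
});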

Getting the above code working took a full two days.

Hope this helps someone in the future.

– Manohar Reddy Poreddy

I implemented Glen k's answer with Node.js, and it worked for me:

const AWS = require('aws-sdk');

const s3 = new AWS.S3({
    accessKeyId: process.env.AWSAccessKeyID,
    secretAccessKey: process.env.AWSSecretAccessKey,
});

let base64data = Buffer.from(file.productImg.data, 'binary');

const params = {
    Bucket: BUCKET_NAME,
    Key: KEY,
    Body: base64data
};

s3.upload(params, function (err, data) {
    if (err) {
        console.log(err);
        throw err;
    }
    console.log(data);
    console.log(`File uploaded successfully. ${data.Location}`);
});
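
In case it helps: file.productImg.data in the snippet above looks like the shape the express-fileupload middleware produces, where each uploaded file carries a raw .data Buffer. A sketch under that assumption (the middleware and route path are not confirmed by this answer):

// Sketch assuming express-fileupload: uploaded files land on
// req.files, keyed by form field name, each with a .data Buffer.
const express = require('express');
const fileUpload = require('express-fileupload');

const app = express();
app.use(fileUpload());

app.post('/product-image', (req, res) => {
    const file = req.files;  // file.productImg.data is a raw Buffer
    let base64data = Buffer.from(file.productImg.data, 'binary');
    // ...pass base64data to s3.upload as shown above...
    res.sendStatus(200);
});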
– pryme0