
Yesterday I did a late-night coding session and created a small Node.js app (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS).

what's the goal:

  1. client sends a canvas datauri (png) to server (via socket.io)
  2. server uploads image to amazon s3

step 1 is done.

The server now has a string like

data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...

My question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?

knox (https://github.com/LearnBoost/knox) seems like an awesome lib to PUT something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.

Any ideas, pointers and feedback welcome.

ROMANIA_engineer
Franz Enzenhofer
  • Check this answer: http://stackoverflow.com/questions/5867534/how-to-save-canvas-data-to-file/5971674#5971674 – akirk Sep 22 '11 at 12:28

5 Answers


For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:

var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3( { params: {Bucket: 'myBucket'} } );

Inside your router method (ContentType should be set to the content type of the image file):

  var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
  var data = {
    Key: req.body.userId,
    Body: buf,
    ContentEncoding: 'base64',
    ContentType: 'image/jpeg'
  };
  s3Bucket.putObject(data, function(err, data){
    if (err) {
      console.log('Error uploading data: ', err);
    } else {
      console.log('successfully uploaded the image!');
    }
  });

The s3_config.json file:

{
  "accessKeyId":"xxxxxxxxxxxxxxxx",
  "secretAccessKey":"xxxxxxxxxxxxxx",
  "region":"us-east-1"
}
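
Instead of hardcoding `ContentType: 'image/jpeg'`, the content type can be derived from the data URI itself. A minimal sketch; the `parseDataUri` helper name is illustrative, not part of the answer:

```javascript
// Parse a base64 image data URI into its content type and decoded bytes.
function parseDataUri(dataUri) {
  var matches = dataUri.match(/^data:(image\/\w+);base64,(.+)$/);
  if (!matches) throw new Error('not a base64 image data URI');
  return {
    contentType: matches[1],                  // e.g. 'image/png'
    buffer: Buffer.from(matches[2], 'base64') // decoded binary body
  };
}

// Example with the PNG signature bytes:
var parsed = parseDataUri('data:image/png;base64,iVBORw0KGgo=');
console.log(parsed.contentType); // image/png
```

The returned `contentType` and `buffer` can be dropped straight into the `ContentType` and `Body` params above.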
Debug Diva
Divyanshu Das
  • [MissingRequiredParameter: Missing required key 'Key' in params] – Nichole A. Miler Jan 10 '16 at 12:41
  • Key: req.body.userId I used userId as key in post data... it was long back... but you can declare any string as key. To make sure already present files are not overwritten keep the key unique. – Divyanshu Das Jan 11 '16 at 06:26
  • @Divyanshu Thanks for such useful example. I got two doubt: `How to make S3 generates a unique KEY to prevent from overriding files?` and `If I don't set the ContentType, when I download the files I won't be able to get the correct file?` I mean, I will get such a corrupted file? Thanks in advance! – alexventuraio Jun 14 '16 at 22:25
  • @Lexynux unique key is something you have to generate on your system. something like:- aws_key : req.user.username + uuid(Math.round((new Date()).getTime() / 1000)) you can add more randomness to it to make sure that it does not clash. There is no specific rule here. – Divyanshu Das Jun 15 '16 at 06:58
  • @Lexynux about content type, try to set content type beforehand, you can find code snippets in node to determine content-type of a file. not setting content-type might lead to corrupted files, I haven't verified this since I always set content-type before posting to s3 – Divyanshu Das Jun 15 '16 at 07:01
  • Alright @Divyanshu I will try to do so and if I need some more help I will write you back! Thanks a lot from México City! – alexventuraio Jun 15 '16 at 15:51
  • @Divyanshu Buffer object will throw an error "Buffer not define" can you give me solution for that – NaveenG Jul 05 '16 at 09:47
  • @NaveenG You should not be getting that since Buffer class is part of javascript language. You might wanna check your javascript and node installation. – Divyanshu Das Jul 07 '16 at 07:14
  • @NaveenG Did you solve your error? I also face he same error – Krishna Dec 03 '16 at 11:25
  • @Divyanshu Could you please help with the video upload as well? – Pushkar Kathuria Jan 27 '17 at 23:13
  • @PushkarKathuria, The above approach should work for video as well. You just need to set the correct mime_type and base64 for video to read in buffer. For video base64 should take format like this - data:video/mp4;base64 – Divyanshu Das Jan 28 '17 at 21:25
  • From the docs it appears that the `upload` method returns the `location` in `data` but the `putObject` method does not. Do you know how to get the new S3 location path after a successful `putObject`? – Marklar Jul 24 '17 at 23:54
  • @Marklar location path is basically key - e.g. if your bucket name is - bucketone and key name is xyz.png, then file path will be https://bucketone.s3.amazonaws.com/xyz.png – Divyanshu Das Jul 25 '17 at 06:15
  • Thanks a lot ...I was confused with file name here the key is the filename and contain path as well. For example "/img/profile/userid.jpg" is work like a charm for me. – Manish Aug 04 '17 at 07:51
  • @Divyanshu Thanks for this great answer! It helped me a lot. However, I think `ContentEncoding: 'base64'` is not correct because `new Buffer(..., 'base64')` decodes base64-encoded string into its binary representation. – Shuhei Kagawa Nov 21 '17 at 10:31
  • Hey @Divyanshu it's working, but I have a problem. Image is missing pixels in half of the portion. So image is not completely viewable. My decoded string is starting from - /9j/4QO... after stripping those "data:image" and all. – Meet Zaveri Feb 27 '18 at 09:05
  • @Divyanshu Also `replace(/^data:image\/\w+;base64,/, "")` is not correctly finding data:image – Meet Zaveri Feb 27 '18 at 09:11
  • @MeetZaveri, that's weird, this code still works for me. I used it in a new project recently. I am not sure how could I help more. Are you sure there is no issue with image or other settings ? – Divyanshu Das Mar 15 '18 at 06:26
  • @Divyanshu it was done days before. No issues now. Your sol. provided an assist – Meet Zaveri Mar 15 '18 at 08:16
  • Using the `Buffer` like that also works with the `upload()` function, too. – Pistos Apr 30 '18 at 19:36
  • Thank you thank you thank you!!!!!! I've been spending weeks on this and finally a straightforward solution. – Ka Tech May 05 '18 at 05:47
  • Yes to what @ShuheiKagawa said; `ContentEncoding` is not necessary. Also, according to the latest Node.js docs, that form of the `Buffer` constructor is deprecated, replaced by `Buffer.from(..., 'base64')`. – Adam Florin Oct 29 '18 at 17:54
  • Yes, @ShuheiKagawa is correct, ContentEncoding is unnecessary and actually caused my image urls served from S3 to be invalid for the Facebook API. You can test your urls here.. https://developers.facebook.com/tools/debug/sharing – delux247 Feb 16 '19 at 20:20
  • Just needed the ContentType, not sure why this too so long to find. AWS docs aren't the easiest to navigate. – thehme May 14 '19 at 12:39
  • @DivyanshuDas can i use .split(/base64,/)[1] insted of .replace(/^data:image\/\w+;base64,/, "") ? because i want to upload any file. i am using AWS lambda and can't find any other method to upload file via lambda function. – Raj Thakar Jun 21 '19 at 05:25
  • Yes! `Body` should be a `Buffer` object. not base64 string! – NFT Master Jul 17 '20 at 03:36
  • Working !! ........ this fixed the issue `.replace(/^data:image\/\w+;base64,/, "")` !! – Janen R Apr 09 '21 at 11:24
  • For loading credentials in node.js app: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/setting-credentials-node.html – Sushil Thapa Oct 31 '21 at 09:47
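
As several comments above note, `Buffer.from(..., 'base64')` already yields decoded binary, so the `Body` sent to S3 is raw bytes and `ContentEncoding: 'base64'` is unnecessary. A quick round-trip check:

```javascript
// Encoding then decoding with Buffer recovers the original bytes,
// showing that Buffer.from(..., 'base64') performs the decode itself.
const encoded = Buffer.from('hello').toString('base64'); // 'aGVsbG8='
const decoded = Buffer.from(encoded, 'base64');
console.log(decoded.toString('utf8')); // 'hello'
console.log(decoded.equals(Buffer.from('hello'))); // true
```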

Here's the code from an article I came across:

const imageUpload = async (base64) => {

  const AWS = require('aws-sdk');

  const { ACCESS_KEY_ID, SECRET_ACCESS_KEY, AWS_REGION, S3_BUCKET } = process.env;

  AWS.config.setPromisesDependency(require('bluebird'));
  AWS.config.update({ accessKeyId: ACCESS_KEY_ID, secretAccessKey: SECRET_ACCESS_KEY, region: AWS_REGION });

  const s3 = new AWS.S3();

  const base64Data = Buffer.from(base64.replace(/^data:image\/\w+;base64,/, ""), 'base64'); // Buffer.from is a function, not a constructor

  const type = base64.split(';')[0].split('/')[1];

  const userId = 1;

  const params = {
    Bucket: S3_BUCKET,
    Key: `${userId}.${type}`, // type is not required
    Body: base64Data,
    ACL: 'public-read',
    ContentEncoding: 'base64', // not strictly required; Body is already a decoded Buffer
    ContentType: `image/${type}` // required. Notice the back ticks
  }

  let location = '';
  let key = '';
  try {
    const { Location, Key } = await s3.upload(params).promise();
    location = Location;
    key = Key;
  } catch (error) {
    console.error(error); // surface upload failures instead of swallowing them
  }

  console.log(location, key);

  return location;

}

module.exports = imageUpload;

Read more: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property

Credits: https://medium.com/@mayneweb/upload-a-base64-image-data-from-nodejs-to-aws-s3-bucket-6c1bd945420f
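
For reference, the `type` extraction in the snippet above works by splitting the data URI on `;` and `/` (values here are illustrative):

```javascript
// 'data:image/png;base64,...' -> split(';')[0] gives 'data:image/png',
// then split('/')[1] gives the extension 'png'.
const base64 = 'data:image/png;base64,iVBORw0KGgo=';
const type = base64.split(';')[0].split('/')[1];
console.log(type); // 'png'
```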

Harshal Yeole

OK, this is the answer to how to save canvas data to a file.

Basically it looks like this in my code:

buf = Buffer.from(data.dataurl.replace(/^data:image\/\w+;base64,/, ""), 'base64')

req = knoxClient.put('/images/' + filename, {
  'Content-Length': buf.length,
  'Content-Type': 'image/png'
})

req.on('response', (res) ->
  if res.statusCode is 200
    console.log('saved to %s', req.url)
    socket.emit('upload success', imgurl: req.url)
  else
    console.log('error %d', res.statusCode)
)

req.end(buf)
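
One detail worth noting with knox: `Content-Length` must be the byte length of the decoded Buffer, not the length of the base64 string. A small check (values are illustrative):

```javascript
// 'aGVsbG8=' is 8 base64 characters but decodes to 5 bytes ('hello');
// knox needs the decoded byte count in the Content-Length header.
const buf = Buffer.from('aGVsbG8=', 'base64');
console.log(buf.toString('utf8')); // 'hello'
console.log(buf.length); // 5
```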
Franz Enzenhofer

The accepted answer works great, but if you need to accept any file type instead of just images, this regexp works:

/^data:.+;base64,/
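
For example, the broader pattern strips the prefix from any base64 data URI, not just images (values are illustrative):

```javascript
// The image-only pattern would miss application/pdf; this one matches
// any mime type up to the ';base64,' marker.
const re = /^data:.+;base64,/;
console.log('data:application/pdf;base64,JVBERi0='.replace(re, '')); // 'JVBERi0='
console.log('data:image/png;base64,iVBORw0KGgo='.replace(re, ''));   // 'iVBORw0KGgo='
```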

Ms01

For Laravel developers, this should work.

$uploadFile is the base64-encoded string you want to upload to S3, while $fileName should contain the file extension, e.g. filename.png. Make sure it corresponds to the data:image/{filetype} part of the base64 encoding.


/* decode the base64 payload and upload the file to the s3 disk */
$path = Storage::disk('s3')->put($uploadfolder . '/' . $fileName, base64_decode($uploadFile));

Make sure to configure your S3 settings in the .env file before calling this method.

ObiTech Invents