
Good day guys.

I have a simple question: how do I download an image from an S3 bucket to a Lambda function's /tmp folder for processing? Basically, I need to attach it to an email (this I can do when testing locally).

I have tried:

s3.download_file(bucket, key, '/tmp/image.png')

as well as (not sure which parameters will help me get the job done):

s3.getObject(params, (err, data) => {
    if (err) {
        console.log(err);
        const message = `Error getting object ${key} from bucket ${bucket}.`;
        console.log(message);
        callback(message);
    } else {

        console.log('CONTENT TYPE:', data.ContentType);
        callback(null, data.ContentType);
    }
});

Like I said, it's a simple question, but for some reason I can't find a solution.

Thanks!

JBM

4 Answers


You can get the image using the AWS S3 API, then write it to the /tmp folder using fs.

var AWS = require('aws-sdk');
var fs = require('fs');

var s3 = new AWS.S3();

var params = {
  Bucket: "BUCKET_NAME",
  Key: "OBJECT_KEY"
};

// callback here is the Lambda handler's callback
s3.getObject(params, function (err, data) {
  if (err) {
    console.error(err.code, "-", err.message);
    return callback(err);
  }

  // data.Body is a Buffer; /tmp is Lambda's only writable path
  fs.writeFile('/tmp/filename', data.Body, function (err) {
    if (err)
      console.log(err.code, "-", err.message);

    return callback(err);
  });
});

Out of curiosity, why do you need to write the file in order to attach it? It seems kind of redundant to write the file to disk so that you can then read it from disk.
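For what it's worth, SendGrid's Node library accepts attachment content as a base64 string, so the disk round trip may not be needed at all. A minimal sketch, assuming the @sendgrid/mail package and the getObject callback from the answer above; the addresses and filename are placeholders:

const sgMail = require('@sendgrid/mail');
sgMail.setApiKey(process.env.SENDGRID_API_KEY);

s3.getObject(params, function (err, data) {
  if (err) return callback(err);

  sgMail.send({
    to: 'recipient@example.com',   // placeholder
    from: 'sender@example.com',    // placeholder
    subject: 'Image from S3',
    text: 'Image attached.',
    attachments: [{
      content: data.Body.toString('base64'), // data.Body is a Buffer
      filename: 'image.png',                 // placeholder
      type: data.ContentType,
      disposition: 'attachment',
    }],
  }).then(function () { callback(null); }, callback);
});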

Jonathan Seed
  • I am using SendGrid, so it seems mandatory for the file to exist on the local disk to attach it. Are you saying that this is not necessarily required? – JBM Aug 16 '16 at 21:52
  • I am not familiar at all with SendGrid so I am not sure, but my thoughts were that you would download it into memory, though this could be an issue depending on file size. – Jonathan Seed Aug 23 '16 at 14:09
  • @JonathanSeed I am actually having this problem: I am reading a 150-200 MB text file with `getObject` and this makes my Lambda function reach its maximum memory limit. Is there a workaround, or why is the memory limit so low? – V. Samma Sep 28 '16 at 13:28
  • @V.Samma You can configure the memory for the lambda function under Advanced Settings in the console. I believe the default value is 128 MB (see the sketch after these comments). – Jonathan Seed Sep 28 '16 at 20:42
  • @JonathanSeed I may not have been clear enough. By reaching its maximum memory limit I meant that I have already set the maximum memory for my Lambda function, which is 1536 MB, and this limit is reached when my Lambda function tries to read in 2 files (one a few KB and one 150-200 MB), concatenate them as string values, and write the result back to S3. – V. Samma Sep 29 '16 at 13:27
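A side note on the memory setting mentioned in the comments above: it can also be changed programmatically. A minimal sketch, assuming the aws-sdk Lambda client; the function name is a placeholder:

const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

// Raise the function's memory allocation (value in MB)
lambda.updateFunctionConfiguration({
  FunctionName: 'my-function', // placeholder name
  MemorySize: 1536,
}, function (err) {
  if (err) console.error(err);
});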

If you're writing it straight to the filesystem, you can also do it with streams. It may be a little faster and more memory-friendly, especially in a memory-constrained environment like Lambda.

var AWS = require('aws-sdk');
var fs = require('fs');
var path = require('path');

var s3 = new AWS.S3();

var params = {
    Bucket: "mybucket",
    Key: "image.png"
};

var tempFileName = path.join('/tmp', 'downloadedimage.png');
var tempFile = fs.createWriteStream(tempFileName);

// Stream the object to disk without buffering the whole body in memory
s3.getObject(params).createReadStream().pipe(tempFile);
Seafish
  • do you know if `createReadStream()` makes `aws-sdk` fire multiple GET calls, or does it make only one and then stream the data? I am concerned about the cost of this solution – Cinn Mar 27 '19 at 10:40
  • what is `path` in this? – Vikas Satpute Mar 24 '20 at 12:39
  • @Cinn I believe it should fire just one GET call. It does the same thing as a regular getObject but just exposes the underlying stream – Seafish Mar 24 '20 at 13:49
  • @VikasSatpute I edited to add path, thanks for calling that out – Seafish Mar 24 '20 at 13:50
  • This is for v2 of the SDK. For v3 it's a little simpler since the response body is already a readable stream (see this answer: https://stackoverflow.com/a/67373050/3312114, and the sketch just after these comments) – Seafish Jul 18 '22 at 16:19
  • How do you get the metadata or other properties on the response in this situation? – bobbyg603 Jan 02 '23 at 21:09
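Following up on the v3 comment above: a minimal sketch, assuming @aws-sdk/client-s3 (v3) and Node 15+ for stream/promises. In v3 the response Body is already a readable stream in Node, and the metadata asked about in the last comment rides along on the same response object:

const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const { pipeline } = require('stream/promises'); // Node 15+
const fs = require('fs');

const client = new S3Client({});

async function downloadToTmp() {
  const response = await client.send(new GetObjectCommand({
    Bucket: 'mybucket',
    Key: 'image.png',
  }));

  // ContentType, Metadata, etc. are on the response alongside Body
  console.log(response.ContentType, response.Metadata);

  // response.Body is a readable stream; pipe it straight to /tmp
  await pipeline(response.Body, fs.createWriteStream('/tmp/downloadedimage.png'));
}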
// Using Node.js version 10.0 or later and promises

const AWS = require('aws-sdk');
const fsPromise = require('fs').promises;

const s3 = new AWS.S3();

exports.handler = async (event) => {
    try {
        const params = {
            Bucket: 's3Bucket',
            Key: 'file.txt',
        };

        // getObject().promise() resolves with the whole object buffered in memory
        const data = await s3.getObject(params).promise();

        await fsPromise.writeFile('/tmp/file.txt', data.Body);
    } catch (err) {
        console.log(err);
    }
};

I was having the same problem, and the issue was that I was using Runtime.NODEJS_12_X in my AWS Lambda.

When I switched over to NODEJS_14_X it started working for me :').

Also, the /tmp prefix is required; the function will write directly to /tmp/file.ext.
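For reference, Runtime.NODEJS_12_X / NODEJS_14_X is the enum used when the function is defined with the AWS CDK. A minimal sketch of switching the runtime there, assuming aws-cdk-lib v2 and that this sits inside a Stack constructor; the id, handler, and asset path are placeholders:

const lambda = require('aws-cdk-lib/aws-lambda');

new lambda.Function(this, 'MyFunction', {
  runtime: lambda.Runtime.NODEJS_14_X,    // was Runtime.NODEJS_12_X
  handler: 'index.handler',               // placeholder
  code: lambda.Code.fromAsset('lambda'),  // placeholder
});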

treckstar