
I am new to node and programming in general, and I have been really struggling with this...

I want to take an https response, resize it with graphicsmagick, and send it to my Amazon S3 bucket.

It appears that the https res is an IncomingMessage object (I can't find any info about that) and the stdout from graphicsmagick is a Socket.

The weird thing is that I can use pipe and send both of these to a writeStream with a local path, and both res and stdout create a nice new resized image.

And I can even send res to S3 (using knox) and it works.

But stdout doesn't want to go to S3 :-/

Any help would be appreciated!

https.get(JSON.parse(queryResponse).data.url,function(res){

    var headers = {
        'Content-Length': res.headers['content-length']
        , 'Content-Type': res.headers['content-type']
    }

    graphicsmagick(res)
      .resize('50','50')
      .stream(function (err, stdout, stderr) {

        var req = S3Client.putStream(stdout, 'new_resized.jpg', headers, function (err, res) {
        })
        req.end()
    })

})

knox – for connecting to S3 – https://github.com/LearnBoost/knox
graphicsmagick – for image manipulation – https://github.com/aheckmann/gm
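
For context, the `S3Client` in the snippet above is assumed to be a knox client created roughly like this (the key, secret, and bucket are placeholders):

var https = require('https')
var knox = require('knox')
var graphicsmagick = require('gm')

// Hypothetical setup; fill in your own credentials and bucket
var S3Client = knox.createClient({
    key: 'YOUR_AWS_KEY'
  , secret: 'YOUR_AWS_SECRET'
  , bucket: 'your-bucket'
})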


3 Answers


The problem was that Amazon needs to know the content length beforehand (thanks DarkGlass).

However, since my images are relatively small, I found buffering preferable to MultiPartUpload.

My solution:

https.get(JSON.parse(queryResponse).data.url,function(res){

    graphicsmagick(res)
      .resize('50','50')
      .stream(function (err, stdout, stderr) {

        var i = []   // buffer the resized image chunks in memory

        stdout.on('data',function(data){
          i.push(data)
        })

        stdout.on('close',function(){
          var image = Buffer.concat(i)

          var req = S3Client.put("new-file-name",{
             'Content-Length' : image.length
            ,'Content-Type' : res.headers['content-type']
          })

          req.on('response',function(res){  //prepare 'response' callback from S3
            if (200 == res.statusCode)
              console.log('it worked')
          })
          req.end(image)  //send the content of the file and an end
        })
    })
})
You are totally right, I missed the `content-length` part in my answer. Just to note, rather than using `Buffer.concat`, it would be more performant to keep a running sum of the chunk lengths as you get `data` events, and then call `req.write` for each chunk in `i`. That way you aren't doing a ton of unneeded copying to concatenate everything. – loganfsmyth Feb 05 '13 at 05:53
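
Following that comment, the buffering step could be sketched like this instead (untested; it assumes the same `S3Client`, `res`, and `stdout` as in the answer above):

var chunks = []
var total = 0

stdout.on('data', function (data) {
  chunks.push(data)
  total += data.length   // running sum instead of a Buffer.concat copy later
})

stdout.on('close', function () {
  var req = S3Client.put('new-file-name', {
      'Content-Length': total
    , 'Content-Type': res.headers['content-type']
  })

  req.on('response', function (res) {
    if (200 == res.statusCode)
      console.log('it worked')
  })

  // write each chunk as-is; no concatenation needed
  chunks.forEach(function (chunk) { req.write(chunk) })
  req.end()
})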

You appear to be setting Content-Length from the original image, not the resized one.

Maybe these help:

get a stream's content-length

https://npmjs.org/package/knox-mpu
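
For images too large to buffer, knox-mpu wraps S3's multipart upload API and accepts a stream without knowing its length up front. A rough sketch based on its README (option names may have changed since; treat it as untested):

var MultiPartUpload = require('knox-mpu')

var upload = new MultiPartUpload({
    client: S3Client            // an existing knox client
  , objectName: 'new_resized.jpg'
  , stream: stdout              // no Content-Length required up front
}, function (err, body) {
  // on success, body contains Location, Bucket, Key, and ETag
  if (err) throw err
  console.log('uploaded', body)
})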


You shouldn't be doing req.end() there. By doing that, you close the stream to S3 before it has had time to send the image data. The request will end itself automatically once all of the image data has been sent.
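
In other words, the question's upload step would become something like this (a sketch only; note that, per the accepted answer, the Content-Length of the resized image still has to be dealt with):

graphicsmagick(res)
  .resize('50', '50')
  .stream(function (err, stdout, stderr) {
    var req = S3Client.putStream(stdout, 'new_resized.jpg', headers, function (err, res) {
      if (err) throw err
    })
    // no req.end() here; putStream ends the request once stdout is drained
  })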
