
My current flow for image upload and manipulation in an S3 bucket is as follows:

  1. The image is uploaded to an S3 bucket.
  2. The image buffer of the saved image is retrieved from the S3 bucket.
  3. The image buffer is resized using the gm npm module (backed by ImageMagick).
  4. The resized image buffer is saved, with the original extension, as a new image in the S3 bucket.
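
The four steps above can be sketched as a single callback chain. In real code the `s3` client would come from `new AWS.S3()` (aws-sdk v2) and `gm` from the gm module; here both are passed in as parameters so the flow stays visible, and the function name, bucket, and key names are placeholders:

```javascript
// Hypothetical sketch of the current flow; s3 and gm are injected,
// bucket/key names are placeholders.
function resizeAndReupload(s3, gm, bucket, key, width, done) {
  // Steps 1-2: fetch the uploaded object's buffer from S3.
  s3.getObject({ Bucket: bucket, Key: key }, function (err, data) {
    if (err) return done(err);
    // Step 3: resize the buffer; .resize(width) keeps the aspect ratio.
    gm(data.Body).resize(width).toBuffer(function (err, resized) {
      if (err) return done(err);
      // Step 4: save the resized buffer back under a new key.
      s3.putObject(
        { Bucket: bucket, Key: 'resized-' + key, Body: resized },
        done
      );
    });
  });
}
```

Note that this shape downloads the full image and holds it in memory before resizing, which is exactly why response time grows with image size.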

As the image size increases, so does the API response time. Following the above workflow, the API takes 3 seconds to respond for a 300 KB image and 30 seconds for an image of 15 MB or more.

I don't know whether my flow is correct. My requirement is to reduce the API response time when uploading an image of 15 MB or more. Can somebody suggest how to achieve this?

EDIT

To make it clear: I am uploading a single 15 MB file, which I send from Advanced REST Client (a Chrome plugin) to my API. The image file is currently on my local system. I am working with JPG and PNG images, and I want to resize the image to both bigger and smaller sizes by reducing the width-to-height ratio. The API I have written is in Node.js and follows the above workflow.

shubhamagiwal92
    Your question is not very clear. You want to send a 15MB image to S3 (from where?), and process it on S3 (or somewhere else?). You want to resize it - bigger or smaller? JPEG or PNG? by reducing the quality or the dimensions? You then want to write it back to S3. There is an API involved somewhere - what API ? an aws-cli API? a web API for some unknown website? You are surprised that big images take longer? Do you have multiple images or just one? – Mark Setchell Sep 03 '15 at 16:36
  • @Mark Setchell I have edited my post to answer your questions. Please provide your valuable suggestions to tackle this problem. – shubhamagiwal92 Sep 04 '15 at 04:23

1 Answer


It seems that uploading and downloading a big image to and from S3 is what takes most of the time.

You can do some optimizations:

  • Try the graphicsmagick package; in common cases it is faster than imagemagick (in recent versions).
  • Perform the resize/optimization operations before uploading to S3, and upload only as the last step.
  • If that is not possible, use streams for downloading, resizing, and uploading: pipe the request stream into the gm module and pipe its output to AWS S3.
  • Run the image optimizations on Amazon EC2 infrastructure, which has very high network performance to S3.
vmkcom
  • Hi vmkcom, do you have any idea how I can take a local image, convert it into a buffer or stream, and then resize it as required using ImageMagick? – shubhamagiwal92 Sep 06 '15 at 03:55