
We have a web application that needs to store uploaded images on EdgeCast or Amazon S3, which means re-uploading each image to EdgeCast / S3 after it reaches our server. There are a number of options I can think of:

  1. Do an inline curl re-upload, pushing the image to the EdgeCast / S3 API during the original request. The disadvantage is that concurrent uploads would put massive load on the server.
  2. Queue re-uploads. I know it is possible but I have no idea how to do it.
  3. Don't bother with the re-uploads and upload directly to EdgeCast / S3 from the client end.
  4. Mount EdgeCast's / S3's FTP into the filesystem, and then just copy uploaded files, letting the FTP daemon do the rest.

Which is the best solution, and are there any other ones? I suspect it is either 1 or 2.

Edit: My application is in PHP

Kristina
  • Added the edgecast tag for you. Where are your images now? What language would you like to do the upload with (you have to call curl from somewhere, even if it's a bash script). – El Yobo Dec 16 '10 at 11:51

4 Answers


Not sure about EdgeCast, but with Amazon S3 the best way to do this is to POST the file directly to the file server. See http://doc.s3.amazonaws.com/proposals/post.html

Doing it this way, you provide an HTML form with hidden fields (key, policy, signature, timestamp, your access key ID, etc.) so that only uploads you have authorized are accepted. Once the upload completes, S3 redirects the browser to a URL you specify; from the query string on that page you can tell whether the upload succeeded, and then record the file ID in your DB.

Works great for reducing load on your server and gives the user a faster experience, however it can lead to orphan files if the upload succeeds but the DB insert fails.
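To make the signing step concrete, generating the hidden form fields might look roughly like this in PHP (the bucket name, key prefix, ACL, and one-hour expiry are placeholder assumptions; the HMAC-SHA1 scheme is the one the linked proposal describes):

```php
<?php
// Sketch of signing an S3 browser-based POST policy.
// The secret key never leaves your server -- only the signature does.
function s3_post_policy_fields(string $secretKey, string $accessKeyId, string $bucket): array
{
    // The policy document states what the browser is allowed to upload.
    $policy = base64_encode(json_encode([
        'expiration' => gmdate('Y-m-d\TH:i:s\Z', time() + 3600),
        'conditions' => [
            ['bucket' => $bucket],
            ['starts-with', '$key', 'uploads/'],
            ['acl' => 'private'],
        ],
    ]));

    // HMAC-SHA1 over the base64 policy, then base64-encode the raw digest.
    $signature = base64_encode(hash_hmac('sha1', $policy, $secretKey, true));

    return [
        'AWSAccessKeyId' => $accessKeyId,
        'policy'         => $policy,
        'signature'      => $signature,
        'acl'            => 'private',
    ];
}
```

These values go into hidden inputs of the form whose action points at the bucket endpoint; S3 recomputes the signature with its copy of your secret key and rejects the POST if it doesn't match.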

Daveo
  • Don't you have to expose your secret key for this? – Kristina Dec 16 '10 at 11:57
  • No, of course not! Then it would not be secret. You just use your secret key to generate a HASH locally, then send the HASH to S3, which uses its own copy of your secret key to authenticate your HASH – Daveo Dec 16 '10 at 12:03
  • Here is a tutorial on how to do it in PHP: http://www.ioncannon.net/programming/1539/direct-browser-uploading-amazon-s3-cors-fileapi-xhr2-and-signed-puts/ If you want to search for it, search "amazon s3 CORS" and a lot should pop up. – Ludger Mar 29 '15 at 16:20

EdgeCast storage supports rsync/SFTP configuration to automatically sync content from a storage server. EdgeCast also supports a "reverse-proxy" or "Customer Origin" configuration that pulls content from another web server into its cache automatically. You can use S3 as this Customer Origin location, use EdgeCast's own Cloud Storage service, or use any other web server outside the EdgeCast network.

craig

What I do is upload the files to a temp directory and then have a cron script PUT the files onto AWS, so the upload process doesn't take any longer for the end user.
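That cron-worker approach can be sketched like this (the directory layout and the `$upload` callback are illustrative; in practice the callback would wrap curl or the AWS SDK's PUT call):

```php
<?php
// Sketch of a cron-driven re-upload worker. The web upload handler drops
// files into $pending; this script, run from cron, pushes each one to
// S3/EdgeCast via $upload and deletes the local copy on success.
function process_pending(string $pending, callable $upload): int
{
    $done = 0;
    foreach (glob($pending . '/*') ?: [] as $path) {
        if (!is_file($path)) {
            continue;
        }
        // $upload should return true once the remote PUT has succeeded.
        if ($upload($path)) {
            unlink($path); // only delete after the remote copy exists
            $done++;
        }
    }
    return $done;
}
```

A crontab entry such as `* * * * * php /path/to/worker.php` would then drain the directory every minute; files whose upload fails simply stay behind for the next run.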

fire

AWS provides an SDK for PHP to do this. It even has support for multi-part uploads, which is great news for developers.

It should be noted that the AWS SDK for PHP also handles retry logic and (if you're using multipart uploads, which you should for large files) can resume failed uploads.
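With the current AWS SDK for PHP (v3), a resumable multipart upload is only a few lines. This is a sketch, not runnable as-is: the region, bucket, object key, and file path below are placeholders, and it assumes the SDK is installed via Composer with credentials configured in the environment:

```php
<?php
// Assumes `composer require aws/aws-sdk-php` has been run.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

$uploader = new MultipartUploader($s3, '/tmp/uploads/photo.jpg', [
    'bucket' => 'my-bucket',        // placeholder bucket
    'key'    => 'images/photo.jpg', // placeholder object key
]);

try {
    // Splits the file into parts, uploads them, and completes the upload.
    $result = $uploader->upload();
    echo $result['ObjectURL'], "\n";
} catch (MultipartUploadException $e) {
    // A failed upload can be resumed from the state the exception carries.
    $uploader = new MultipartUploader($s3, '/tmp/uploads/photo.jpg', [
        'state' => $e->getState(),
    ]);
}
```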

Adrian Petrescu