
I am trying to move images for my site from my host to Amazon S3 cloud hosting. These images are of client work sites and cannot be publicly available. I would like to display them on my site, preferably using the PHP SDK available from Amazon.

So far I have been able to script the conversion so that I look up records in my database, grab the file path, name the file appropriately, and send it to Amazon.

//upload to S3
$s3->create_object($bucket, $folder.$file_name_new, array(
    'fileUpload' => $file_temp,
    'acl' => AmazonS3::ACL_PRIVATE, //access denied, grantee only owner
    //'acl' => AmazonS3::ACL_PUBLIC, //image displayed
    //'acl' => AmazonS3::ACL_OPEN, //image displayed, grantee everyone has open permission
    //'acl' => AmazonS3::ACL_AUTH_READ, //image not displayed, grantee auth users have open permissions
    //'acl' => AmazonS3::ACL_OWNER_READ, //image not displayed, grantee only ryan
    //'acl' => AmazonS3::ACL_OWNER_FULL_CONTROL, //image not displayed, grantee only ryan
    'storage' => AmazonS3::STORAGE_REDUCED
));

Before I copy everything over, I have created a simple form to test uploading and displaying an image. If I upload an image using ACL_PRIVATE, I can either grab the public URL and have no access, or grab the public URL with a temporary key and display the image.

<?php
//display the image link
$temp_link = $s3->get_object_url($bucket, $folder.$file_name_new, '1 minute');
?>
<a href='<?php echo $temp_link; ?>'><?php echo $temp_link; ?></a><br />
<img src='<?php echo $temp_link; ?>' alt='finding image' /><br />

Using this method, how will my caching work? I'm guessing that every time I refresh the page, or modify one of my records, I will be pulling that image from S3 again, increasing my GET requests.

I have also considered using bucket policies to only allow image retrieval from certain referrers. Do I understand correctly that Amazon is supposed to serve requests only from pages or domains I specify?

I referenced https://forums.aws.amazon.com/thread.jspa?messageID=188183&#188183 to set that up, but was then confused as to which security I need on my objects. It seemed like if I made them private they still would not display, unless I used the temporary link as mentioned previously. If I made them public, I could navigate to them directly, regardless of referrer.

Am I way off on what I'm trying to do here? Is this not really supported by S3, or am I missing something simple? I have gone through the SDK documentation and done lots of searching, and I feel this should be a little more clearly documented, so hopefully any input here can help others in this situation. I've read of others who name the file with a unique ID, creating security through obscurity, but that won't cut it in my situation, and it's probably not best practice for anyone trying to be secure.

Bob

4 Answers


The best way to serve your images is to generate a signed URL using the PHP SDK. That way the downloads go directly from S3 to your users.

You don't need to download via your servers as @mfonda suggested (you can set any caching headers you like on S3 objects), and if you did, you would be losing some major benefits of using S3.

However, as you pointed out in your question, the URL will always be changing (actually the query string), so browsers won't cache the file. The easy workaround is simply to always use the same expiry date so that the same query string is always generated. Or, better still, 'cache' the URL yourself (e.g. in the database) and reuse it every time.

You'll obviously have to set the expiry time somewhere far into the future, but you can regenerate these URLs every so often if you prefer. E.g. in your database you would store the generated URL and the expiry date (you could parse that from the URL too). Then you either use the existing URL or, if the expiry date has passed, generate a new one.
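
For illustration, here's a minimal sketch of that caching approach using the same SDK as in the question. The images table and its s3_key/s3_url/url_expires columns, and the $db wrapper methods, are made-up names for this example, not part of the SDK:

//NOTE: $db->query_row()/$db->execute() are hypothetical database-wrapper
//methods - swap in whatever data-access layer you already use
function get_image_url($db, $s3, $bucket, $image_id)
{
    $row = $db->query_row('SELECT s3_key, s3_url, url_expires FROM images WHERE id = ?', $image_id);

    //reuse the stored URL while it is still valid (60s safety margin)
    if ($row['s3_url'] && $row['url_expires'] > time() + 60) {
        return $row['s3_url'];
    }

    //otherwise generate a fresh signed URL, valid far into the future
    $expires = strtotime('+30 days');
    $url = $s3->get_object_url($bucket, $row['s3_key'], $expires);

    $db->execute('UPDATE images SET s3_url = ?, url_expires = ? WHERE id = ?', $url, $expires, $image_id);
    return $url;
}

Because the expiry timestamp is fixed at generation time, the query string stays stable between regenerations, so browsers can cache the image.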

Geoff Appleford
  • I'm out of votes for today, but this is correct. Downloading the images yourself first actually would be slower than just hosting them yourself, thus throwing away the advantage of external hosting on S3. – Konerak Mar 03 '11 at 14:26
  • This makes perfect sense but seems like a more complicated security through obscurity model. I could be just as well off not using S3 and using a .htaccess file to disable hotlinking and using a proxy file to display my image, almost like what @mfonda suggested, just locally. So, using the SDK, I can't just grab the file based on the authentication values I've set in config.inc.php? I guess I'm not really understanding how to grab a file as an authenticated user. Any input on bucket policies so that the images are public, but only available to referrers I define? That didn't seem to work. – Bob Mar 04 '11 at 15:53
  • @Bob - It's far more secure than that. The URL cannot be created unless the person has your secret key. Of course it might be simpler to serve the files from local storage, but then you lose all the benefits of S3 - capacity limited only by your credit card, huge bandwidth and scalability etc. You can create your own access policies. See http://docs.amazonwebservices.com/AmazonS3/latest/dev/index.html?AccessPolicyLanguage.html, but that is even more complicated and I am not sure it'll do what you want anyway. – Geoff Appleford Mar 04 '11 at 16:33
  • Okay, the benefits are definitely huge for using S3. I did some more stuff with bucket policies and found this article: http://blog.cloudberrylab.com/2010/07/how-to-prevent-hotlinking-of-your.html. I must have had something configured incorrectly, as this seems to be working now. The images display on my site, blazingly fast, of course, and when I click on 'Open image in new tab' I get the access denied message. Can that referrer be spoofed by someone then? – Bob Mar 04 '11 at 17:27
  • @Bob - the referrer is just an HTTP header, so it can easily be spoofed. There are even Firefox extensions to do it for you. Using a referrer bucket policy will be great for stopping the casual user, but it's not going to stop a determined and knowledgeable user. But nice link anyway. I might use bucket policies for resources where absolute security is not essential. – Geoff Appleford Mar 05 '11 at 09:31
  • @Geoff, thanks for the clarity. That's what I figured, and did see the Firefox stuff out there after a little research. I think a combo of dynamic urls and bucket referrer policy will suit us well. Thanks for your input. – Bob Mar 07 '11 at 15:19

You can use bucket policies in your Amazon bucket to allow your application's domain to access the file. In fact, you can even add your local dev domain (ex: mylocaldomain.local) to the access list and you will be able to get your images. Amazon provides sample bucket policies here: http://docs.aws.amazon.com/AmazonS3/latest/dev/AccessPolicyLanguage_UseCases_s3_a.html. This was very helpful in getting my images served.

The policy below solved the problem that brought me to this SO topic:

{
    "Version":"2008-10-17",
    "Id":"http referer policy example",
    "Statement":[
        {
            "Sid":"Allow get requests originated from www.example.com and example.com",
            "Effect":"Allow",
            "Principal":"*",
            "Action":"s3:GetObject",
            "Resource":"arn:aws:s3:::examplebucket/*",
            "Condition":{
                "StringLike":{
                    "aws:Referer":[
                        "http://www.example.com/*",
                        "http://example.com/*"
                    ]
                }
            }
        }
    ]
}
Wes

When you talk about security and protecting data from unauthorized users, one thing is clear: you have to check, on every access to that resource, that the requester is entitled to it.

That rules out generating a URL that can then be accessed by anyone (it might be difficult to obtain, but still...). The only solution is an image proxy. You can do that with a PHP script.

There is a fine article on Amazon's blog that suggests using readfile: http://blogs.aws.amazon.com/php/post/Tx2C4WJBMSMW68A/Streaming-Amazon-S3-Objects-From-a-Web-Server

readfile('s3://my-bucket/my-images/php.gif');
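
Note that readfile() with an s3:// path relies on the SDK's S3 stream wrapper being registered first. A minimal sketch along the lines of that article, assuming the newer AWS SDK for PHP (version 2+) and placeholder credentials/bucket names:

require 'vendor/autoload.php'; //AWS SDK for PHP via Composer (assumed)

use Aws\S3\S3Client;

//placeholder credentials - substitute your own
$client = S3Client::factory(array(
    'key'    => 'your-aws-access-key',
    'secret' => 'your-aws-secret-key',
));

//register the s3:// stream wrapper so readfile() can fetch objects
$client->registerStreamWrapper();

//send the right headers, then stream the object straight to the browser
header('Content-Type: image/gif');
readfile('s3://my-bucket/my-images/php.gif');

Since the script proxies every byte, you can run whatever authentication check you like before the readfile() call.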
catalinux

You can download the contents from S3 (in a PHP script), then serve them using the correct headers.

As a rough example, say you had the following in image.php:

require_once 'sdk.class.php'; //AWS SDK for PHP 1.x

$s3 = new AmazonS3();

//fetch the private object server-side using your own credentials
$response = $s3->get_object($bucket, $image_name);
if (!$response->isOK()) {
    throw new Exception('Error downloading file from S3');
}

//pass the image through with the correct headers
header("Content-Type: image/jpeg");
header("Content-Length: " . strlen($response->body));
die($response->body);

Then in your HTML code, you can do

<img src="image.php">
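
In practice you would pass the object name to image.php rather than hard-coding $image_name. The name query parameter below is an illustrative assumption, not part of the SDK, and you should validate it against your own records before fetching:

//at the top of image.php (hypothetical parameter)
$image_name = basename($_GET['name']); //strip path components; also check it against your database

<img src="image.php?name=client-site-1.jpg">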
mfonda