
I have written an implementation for generating pre-signed URLs for a bucket on AWS S3. It works fine for getting single files/objects.

How would I go about generating pre-signed URLs for entire directories? Let's put it this way: on my S3 bucket there are multiple folders, each containing its own small HTML5 application. Each folder has its own set of HTML, CSS, JS, and media files. I wouldn't be generating a pre-signed URL for a single object in this case.

If I give out a pre-signed URL for a single file, for example the index.html of a folder, that file would also need to load CSS, JS, and media files, none of which we have signed URLs for.

I'm just not sure how to go about implementing this.

nkl
  • Is this for a static website? – helloV Mar 31 '16 at 22:45
  • Yes it is. I'm aware that a pre-signed URL is only for single objects. I'm using static hosting as a workaround for now, but there are directories that require authorization. Looking through the AWS docs, I could create IAM users, which would work nicely since I can group them, but it sounds like they're meant for dev purposes, not necessarily for end users. – nkl Apr 01 '16 at 16:06

6 Answers


This is absolutely possible and has been for years. You must use conditions when generating a presigned URL, specifically the `starts-with` condition. See the official Amazon documentation.

As an example, here is Python with Boto3 generating a presigned POST URL:

    import boto3

    s3 = boto3.client("s3")

    # Allow uploads of any key under the "uploads/" prefix for 10 minutes.
    response = s3.generate_presigned_post(
        "BUCKET_NAME",
        "uploads/${filename}",
        Fields=None,
        Conditions=[["starts-with", "$key", "uploads/"]],
        ExpiresIn=(10 * 60),
    )
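For a programmatic (non-browser) upload, here is a minimal sketch of how that `response` could be used, assuming the third-party `requests` library; the file name `report.html` is a hypothetical placeholder, and overriding the key is exactly what the `starts-with` condition permits:

    import requests

    # ${filename} is only substituted by browser form posts, so set the key
    # explicitly; the starts-with condition allows any key under "uploads/".
    response["fields"]["key"] = "uploads/report.html"  # hypothetical name

    with open("report.html", "rb") as f:
        upload = requests.post(
            response["url"],                     # bucket endpoint returned by S3
            data=response["fields"],             # key, policy, signature, ...
            files={"file": ("report.html", f)},  # S3 expects the file field last
        )
    upload.raise_for_status()                    # S3 returns 204 No Content on success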
  • Be cautious with your `starts-with` key, but this works! – David Merritt Jun 04 '21 at 19:30
  • Do I need to specify **${filename}** in the key param of that method? I want to upload multiple different files with the same presigned POST URL, i.e. use `"uploads/"` instead of `"uploads/${filename}"`. – A l w a y s S u n n y Jul 26 '21 at 08:30
  • @DavidMerritt Can you elaborate on why one needs to be cautious? – run_the_race Sep 28 '21 at 18:02
  • This is not dynamic. You need to run `generate_presigned_post` for each `${filename}` that you want to upload. – Marcin Mar 07 '22 at 23:51
  • How do you use `response` to open an HTML application like OP requested? Mine looks like `{'url': 'https://folder.s3.amazonaws.com/', 'fields': {'key': 'user1/reports1/index.html', 'AWSAccessKeyId': '***', 'policy': '***', 'signature': '***'}}` – Jeff Bezos Mar 23 '22 at 15:35
  • Isn't the Condition just an additional restriction on this request? In principle the key still has to match `uploads/${filename}` anyway, doesn't it? – Tom Raganowicz Aug 21 '22 at 16:34
  • This solution is specific to the AWS library for Python. Any solutions for other languages? – G M Sep 25 '22 at 22:55

No. AWS would first need to provide an API that accepts multiple files in a single request; this is a limitation of the S3 API, not of pre-signing.

See Is it possible to perform a batch upload to amazon s3?.

poida

No. A pre-signed URL is valid for only one object.

John Rotenstein

First things first: AWS S3 is a key-value store, and each object `aaa/bbb/ccc/ddd/index.html` is just one name. There is no concept of "folders" (even though the UI may give you the false impression that they exist).

In order to create a single presigned URL for multiple "files", you have to do some preprocessing: pull all the necessary files locally, zip them, put the zip archive on S3, then generate a presigned URL for the archive.
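For illustration, here is a minimal, hypothetical sketch of that preprocessing in Python with boto3; the bucket, prefix, and archive names are placeholders:

    import os
    from shutil import make_archive

    import boto3

    s3 = boto3.client("s3")
    bucket, prefix = "BUCKET_NAME", "aaa/bbb/"  # placeholders

    # Pull every object under the prefix into a local directory.
    pages = s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix)
    for page in pages:
        for obj in page.get("Contents", []):
            if obj["Key"].endswith("/"):        # skip zero-byte "folder" markers
                continue
            local = os.path.join("app", os.path.relpath(obj["Key"], prefix))
            os.makedirs(os.path.dirname(local) or ".", exist_ok=True)
            s3.download_file(bucket, obj["Key"], local)

    # Zip the directory, push the archive back, and presign the single archive key.
    make_archive("app", "zip", "app")           # -> app.zip
    s3.upload_file("app.zip", bucket, "archives/app.zip")
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": "archives/app.zip"},
        ExpiresIn=600,
    )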

Vor
  • Technically S3 is a KV store with no folders, but practically speaking "folders" do exist there. You can already set bucket policies [based on key path](https://gist.github.com/magnetikonline/6215d9e80021c1f8de12#getputdelete-access-to-specific-path-within-a-bucket) to essentially set policies on folders. But presigned URLs do not (currently) support partial key paths. And even if they did, it wouldn't help the OP because the HTML application is still linking to resources (css, js at same key prefix) without the presigned URL. – Kasey Speakman Jun 06 '18 at 19:29

The method below generates a pre-signed URL for an existing object on S3. In your case you could list all the objects under the prefix (by key name) and call this method for each key.

    import java.net.URL;
    import java.util.Date;
    import com.amazonaws.HttpMethod;
    import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;

    public String generatePreSignedUrl(String bucketName, String keyName) {

        // Expire the URL one minute from now.
        Date expiration = new Date(System.currentTimeMillis() + 60 * 1000L);

        GeneratePresignedUrlRequest generatePresignedUrlRequest =
                new GeneratePresignedUrlRequest(bucketName, keyName);
        generatePresignedUrlRequest.setMethod(HttpMethod.GET);
        generatePresignedUrlRequest.setExpiration(expiration);

        URL url = s3.generatePresignedUrl(generatePresignedUrlRequest);
        return url.toString();
    }

PS: The AWS SDK provides an API to list the objects in a bucket, from which you can pick the required keys.
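For reference, here is the same list-then-presign pattern sketched in Python with boto3 (the Java SDK offers the equivalent listing via `ListObjectsV2Request`); the bucket and prefix names are placeholders:

    import boto3

    s3 = boto3.client("s3")
    bucket, prefix = "BUCKET_NAME", "folder/"   # placeholders

    # One presigned GET URL per object under the prefix.
    urls = {}
    pages = s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix)
    for page in pages:
        for obj in page.get("Contents", []):
            urls[obj["Key"]] = s3.generate_presigned_url(
                "get_object",
                Params={"Bucket": bucket, "Key": obj["Key"]},
                ExpiresIn=60,
            )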

Vikas Adyar

For anyone else looking at this issue in 2021: the easiest way I found to hand out an entire "folder" of files from an S3 bucket was to change how I uploaded the "folder". Instead of uploading the files recursively, I zip them into a single archive and upload that. It doesn't directly answer the question, but it is a quick workaround. Here is the example Python code that I used:

    from shutil import make_archive
    from boto3 import client

    # name_of_zipped_folder and path_to_outdir are placeholders.
    make_archive(name_of_zipped_folder, 'zip', path_to_outdir)  # creates <name>.zip

Then upload the zip archive to S3:

    aws s3 cp name_of_zipped_folder s3://bucket_name/name_of_zipped_folder

Then use boto3 to generate a presigned URL for that archive:

    client('s3', 'us-east-1').generate_presigned_url('get_object', Params={'Bucket': bucket_name, 'Key': name_of_zipped_folder}, ExpiresIn=120)
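For completeness, a minimal sketch of consuming that URL with only the standard library; the variables are the same placeholders as above:

    import urllib.request

    url = client('s3', 'us-east-1').generate_presigned_url(
        'get_object',
        Params={'Bucket': bucket_name, 'Key': name_of_zipped_folder},
        ExpiresIn=120,
    )
    urllib.request.urlretrieve(url, name_of_zipped_folder)  # download the archive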

Joe Webb
  • I like your thinking "outside the box", but it's a very static solution that isn't very scalable. For example, if your ZIP file is 10 GB, and you want to change one line of code in some css file, then you have to reupload the entire zip file. Good workaround for small files, though – G M Jul 29 '22 at 16:43