7

I want to give read permission to all the files inside a folder in S3 using Java. I am uploading the folder with TransferManager, but I could not find any API to set an ACL at the directory level:

MultipleFileUpload upload = transferManager.uploadDirectory(bucketName, uploadDirectory, new File(folderName), true);

I know that I can set ACL of an S3 object using: s3.setObjectAcl(bucketName, key, acl);

But I want to apply it to all the files in a folder at once. Is there any way to do that?

Vivek

3 Answers

20

In S3, there is no such thing as folders, only buckets and keys. Keys that share a common prefix are grouped together in the console for your convenience but under the hood, the structure is completely flat. As a result, there is no way to set the ACL for a folder. But there are some workarounds.
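Because the key namespace is flat, "listing a folder" is really just filtering the full key list by a shared string prefix. A tiny local sketch of that idea (no AWS calls; the keys and helper are illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class FlatKeys {
    // What a prefix-filtered listing effectively does over the flat namespace
    static List<String> keysUnderPrefix(List<String> allKeys, String prefix) {
        List<String> matches = new ArrayList<>();
        for (String key : allKeys) {
            if (key.startsWith(prefix)) {
                matches.add(key);
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        List<String> keys = Arrays.asList(
                "path/to/folder/a.txt",
                "path/to/folder/sub/b.txt",
                "other/c.txt");
        // "path/to/folder/" looks like a folder, but it is only a shared prefix
        System.out.println(keysUnderPrefix(keys, "path/to/folder/"));
        // prints [path/to/folder/a.txt, path/to/folder/sub/b.txt]
    }
}
```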

Use a bucket policy

Depending on what permissions you want to grant and to whom, you can grant access rights to all keys in a "folder" using a bucket policy. This example allows anyone to get any key under the folder path/to/folder/ from bucket my-bucket. Here's a list of possible actions and a list of possible principals from the docs.

{
    "Version":"2012-10-17",
    "Statement":[
        {
            "Sid":"SimulateFolderACL",
            "Effect":"Allow",
            "Principal": "*",
            "Action":["s3:GetObject"],
            "Resource":["arn:aws:s3:::my-bucket/path/to/folder/*"]
        }
    ]
}
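If you would rather attach the policy from Java than from the console, the policy document is just a JSON string. A minimal sketch of building it for the example bucket and prefix above (the helper name is mine; the resulting string can be applied with AmazonS3.setBucketPolicy(bucketName, policyText) in SDK v1):

```java
public class FolderPolicy {
    // Builds the bucket policy shown above as a JSON string, so it can be
    // applied with s3.setBucketPolicy(bucketName, policyText) in SDK v1.
    static String publicReadFolderPolicy(String bucket, String folderPrefix) {
        return "{"
             + "\"Version\":\"2012-10-17\","
             + "\"Statement\":[{"
             + "\"Sid\":\"SimulateFolderACL\","
             + "\"Effect\":\"Allow\","
             + "\"Principal\":\"*\","
             + "\"Action\":[\"s3:GetObject\"],"
             + "\"Resource\":[\"arn:aws:s3:::" + bucket + "/" + folderPrefix + "*\"]"
             + "}]}";
    }

    public static void main(String[] args) {
        System.out.println(publicReadFolderPolicy("my-bucket", "path/to/folder/"));
    }
}
```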

Iterate and apply the ACL to each key

You can also loop through all the keys and apply the ACL directly, as you mentioned, using s3.setObjectAcl(bucketName, key, acl). You can filter the keys by the folder prefix so you don't have to check each key name individually. After the directory is uploaded, you can do something like this:

// We only want the keys that are in the folder
ListObjectsRequest listObjectsRequest = new ListObjectsRequest()
                                            .withBucketName("my-bucket")
                                            .withPrefix("path/to/folder/");
ObjectListing objectListing;

// Iterate over all the matching keys, page by page
do {
    objectListing = s3client.listObjects(listObjectsRequest);
    for (S3ObjectSummary objectSummary : objectListing.getObjectSummaries()) {
        // Apply the ACL to each key in the "folder"
        s3client.setObjectAcl("my-bucket", objectSummary.getKey(), acl);
    }
    listObjectsRequest.setMarker(objectListing.getNextMarker());
} while (objectListing.isTruncated());
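The marker loop above can be exercised locally against a stand-in client to see how the pagination terminates. This is only a sketch (the Page class and map of pages are hypothetical, not AWS types); each lookup stands in for one listObjects call:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PaginationDemo {
    // Minimal stand-in for one page of a listing (hypothetical, not an AWS type)
    static class Page {
        List<String> keys; String nextMarker; boolean truncated;
        Page(List<String> keys, String nextMarker, boolean truncated) {
            this.keys = keys; this.nextMarker = nextMarker; this.truncated = truncated;
        }
    }

    // Pages keyed by the marker used to request them ("" = first page)
    static List<String> collectAllKeys(Map<String, Page> pages) {
        List<String> all = new ArrayList<>();
        String marker = "";
        Page page;
        do {
            page = pages.get(marker);   // stands in for s3client.listObjects(...)
            all.addAll(page.keys);      // stands in for the setObjectAcl loop
            marker = page.nextMarker;   // listObjectsRequest.setMarker(...)
        } while (page.truncated);       // objectListing.isTruncated()
        return all;
    }

    public static void main(String[] args) {
        Map<String, Page> pages = new HashMap<>();
        pages.put("", new Page(Arrays.asList("a", "b"), "b", true));
        pages.put("b", new Page(Arrays.asList("c"), null, false));
        System.out.println(collectAllKeys(pages)); // prints [a, b, c]
    }
}
```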
David Morales
    This is a quality answer, demonstrating genuine expertise, and the links to the "principle" and "action" resources are extra touches that are exemplary of the kind of thoroughness that I, for one, appreciate. – Michael - sqlbot Nov 25 '15 at 11:34
  • I am facing the same issue in my application and find this answer really useful. Thanks... – Anuruddha Mar 16 '18 at 22:55
  • I have a concern as I am uploading about hundred thousand files and want to set permission for them all. Do you think there would there be a performance overhead? – Anuruddha Mar 16 '18 at 23:11
  • 'ListObjectsRequest(software.amazon.awssdk.services.s3.model.ListObjectsRequest.BuilderImpl)' has private access in 'software.amazon.awssdk.services.s3.model.ListObjectsRequest' – Shai Alon Jul 23 '23 at 13:36
0

You can also recursively copy the files onto themselves, in the same location, while granting bucket-owner-full-control (or some other --acl option). The files overwrite themselves, and the access changes as requested.

aws s3 cp s3://myBucket s3://myBucket --recursive --acl bucket-owner-full-control

Source: https://stackoverflow.com/a/37164668/20481215

-1

The following simple method gives public-read to all of the objects in a bucket.

void makeObjectsPublic(AmazonS3 s3Client, String bucketName) {

    // Note: listObjects returns only the first page of results (up to 1,000
    // objects), so this affects at most the first 1,000 keys in the bucket.
    final Iterator<S3ObjectSummary> itr = s3Client.listObjects(bucketName).getObjectSummaries().iterator();

    while (itr.hasNext()) {
        s3Client.setObjectAcl(bucketName, itr.next().getKey(), CannedAccessControlList.PublicRead);
    }
}
Turgay Celik
  • You can use for (S3ObjectSummary s3ObjectSummary : s3Client.listObjects(bucketName).getObjectSummaries()) instead of do-while. – Shai Alon Jul 23 '23 at 13:19
  • This will change all files in all folders in the bucket!! that was not what was asked for. – Shai Alon Jul 23 '23 at 13:31