31

I'm doing some work for a client that has two separate AWS accounts. We need to move all the files in a bucket on one account over to a new bucket in the second account.

We thought that s3cmd would allow this, using the format:

s3cmd cp s3://bucket1 s3://bucket2 --recursive

However, this only lets me use the keys of one account; there is no way to specify the credentials of the second account.

Is there a way to do this without downloading the files and uploading them again to the 2nd account?

Geuis
  • Ref: copy/move between two different accounts: http://stackoverflow.com/questions/5518205/move-files-directly-from-one-s3-account-to-another – Tej Kiran Jul 29 '14 at 04:57

5 Answers

47

You don't have to open permissions to everyone. Use the bucket policies below on the source and destination buckets to copy from a bucket in one account to another using an IAM user:

  • Bucket to Copy from: SourceBucket

  • Bucket to Copy to: DestinationBucket

  • Source AWS Account ID: XXXX-XXXX-XXXX

  • Source IAM User: src-iam-user

The policies below mean that the IAM user XXXX-XXXX-XXXX:src-iam-user has s3:ListBucket on SourceBucket and s3:GetObject on SourceBucket/*, plus s3:ListBucket on DestinationBucket and s3:PutObject on DestinationBucket/*.

On the SourceBucket the policy should be like:

{
  "Id": "Policy1357935677554",
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "Stmt1357935647218",
    "Action": ["s3:ListBucket"],
    "Effect": "Allow",
    "Resource": "arn:aws:s3:::SourceBucket",
    "Principal": {"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/src-iam-user"}
  }, {
    "Sid": "Stmt1357935676138",
    "Action": ["s3:GetObject"],
    "Effect": "Allow",
    "Resource": "arn:aws:s3:::SourceBucket/*",
    "Principal": {"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/src-iam-user"}
  }]
}

On the DestinationBucket the policy should be:

{
  "Id": "Policy1357935677555",
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "Stmt1357935647218",
    "Action": ["s3:ListBucket"],
    "Effect": "Allow",
    "Resource": "arn:aws:s3:::DestinationBucket",
    "Principal": {"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/src-iam-user"}
  }, {
    "Sid": "Stmt1357935676138",
    "Action": ["s3:PutObject"],
    "Effect": "Allow",
    "Resource": "arn:aws:s3:::DestinationBucket/*",
    "Principal": {"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/src-iam-user"}
  }]
}

The command to run is:

s3cmd cp s3://SourceBucket/File1 s3://DestinationBucket/File1
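For the whole-bucket move in the original question, the same policies should also cover a recursive copy. A minimal sketch, assuming s3cmd is configured with src-iam-user's access keys:

# s3cmd holds the source-account IAM user's keys; the bucket policies
# above grant that user read on SourceBucket and write on DestinationBucket
s3cmd cp --recursive s3://SourceBucket/ s3://DestinationBucket/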

Robs
3

Bandwidth inside AWS isn't billed, so you could save some money and time by doing it all from a machine inside AWS, as long as the buckets are in the same region.
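For example, a minimal sketch of that approach, assuming an EC2 instance in the buckets' region with the AWS CLI installed and credentials that can read the source and write the destination (bucket names are placeholders):

# run from an EC2 instance in the same region as both buckets,
# so the data never leaves AWS's network
aws s3 sync s3://source-bucket s3://destination-bucket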

As for doing it without the files touching down on a computer somewhere: I don't think so.

Except: since AWS does bulk uploads from hard drives you mail to them, they might do the same for you for a bucket-to-bucket transfer.

Tom Andersen
  • Thanks Tom. Seems that's the way it's going. Doing it from an instance seems to be a good way to go too. Thanks for the tip. – Geuis Oct 03 '12 at 02:02
0

I would suggest using CloudBerry S3 Explorer as a simple solution to get things moving quickly. It also lets you take advantage of AWS's free internal bandwidth for bucket-to-bucket transfers.

You can also use the CloudBerry SDK to integrate this into your apps.

JonLovett
-1

Even though roles and policies are a really elegant way, I have another solution:

  1. Get your AWS credentials for the source bucket's account.
  2. Do the same for the destination bucket's account.
  3. On your local machine (desktop or any server outside of AWS), create a new profile with the credentials of the source bucket's account:

    aws --profile ${YOUR_CUSTOM_PROFILE} configure

  4. Fill in aws_access_key_id and aws_secret_access_key (you may skip region and output).

  5. Save your destination bucket's credentials as environment variables:

    export AWS_ACCESS_KEY_ID=AKI...

    export AWS_SECRET_ACCESS_KEY=CN...

  6. Now do the sync, adding the crucial --profile parameter (the full sequence is consolidated in the sketch after these steps):

    aws --profile ${YOUR_CUSTOM_PROFILE} s3 sync s3://${SOURCE_BUCKET_NAME} s3://${DESTINATION_BUCKET_NAME}
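Put together, the flow should look roughly like this. A minimal sketch using hypothetical names (src-account as the profile name, source-bucket and destination-bucket as the buckets); the elided key values are your own:

# one-time: store the source account's keys under a named profile
aws --profile src-account configure

# the destination account's keys go into the environment
export AWS_ACCESS_KEY_ID=AKI...
export AWS_SECRET_ACCESS_KEY=CN...

# sync: --profile supplies the source credentials, the environment the destination's
aws --profile src-account s3 sync s3://source-bucket s3://destination-bucket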

DerKnorr
  • Is there any way I can do S3 sync if the credentials for the source bucket and destination bucket are different? – HariShankar Jun 29 '16 at 09:32
  • that does not work for me - An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied It just uses creds from the profile for both source and destination. – ANDgineer Aug 09 '21 at 01:52
-11

OK, I figured out an answer. There are likely other ways to do this too, but this one is very easy.

I was able to do this with the s3cmd utility, but it can likely be done with similar tools.

When you configure s3cmd, configure it with your account Access Key and Secret Access Key.

Log in to the S3 web console using the account that owns the bucket you are transferring to:

https://console.aws.amazon.com/s3/home

Click on your bucket, then Actions, then Properties.

At the bottom under the "Permissions" tab click "Add more permissions".

Set "Grantee" to Everyone

Check "List" and "Upload/Delete"

Save

To transfer, run from your terminal:

s3cmd cp s3://from_account_bucket s3://to_account_bucket --recursive

When the transfer is complete, you should immediately visit the S3 console again and remove the permissions you added for the bucket.

There is obviously a security problem here. The bucket we're transferring to is open to everyone. The chances of someone finding your bucket name are small, but do exist.

You can use bucket policies as an alternative way to open access only to specific accounts, but that was too bloody difficult for me, so I leave it as an exercise for those who need to figure it out.

Hope this helps.

Geuis
  • Sorry, -1 for the security hole. Definitely not good practice here - this is a big hack. – swrobel Feb 23 '13 at 00:32
  • Terrible and unnecessary hack. S3 already provides the Bucket Policy functionality for you to create source/destination policies in order to sync source to destination in a safe manner. – Daniel Andrei Mincă Aug 18 '19 at 17:21