
The following resolved questions let me unload, copy, run queries, create tables, etc. in Redshift: "Redshift create table not working via Python" and "Unload to S3 with Python using IAM Role credentials". Note that there is no dependency on Boto3, even though I am successfully writing to and copying from S3 via Redshift.

I would like to upload a file to S3 dynamically in Python (from the current working directory). However, I can't find documentation or examples of how to do this using iam_role 'arn:aws:iam::<aws-account-id>:role/<role_name>' rather than the access and secret keys described at http://boto3.readthedocs.io/en/latest/guide/quickstart.html.

Any help is greatly appreciated. This is what I have right now, and it throws an "Unable to locate credentials" error:

import boto3

# Input parameters for the S3 bucket and object
bucket_name = ''
bucket_key = ''
filename_for_csv = 'output.csv'

# Move the file to S3 with server-side encryption
s3 = boto3.resource('s3')
with open(filename_for_csv, 'rb') as data:
    s3.Bucket(bucket_name).put_object(Key=bucket_key, Body=data, ServerSideEncryption='AES256')
user8834780

2 Answers


You will need AWS IAM Access Keys.

The issue is that you need access keys in order to call STS (Security Token Service), which can then process AssumeRole() with your role ARN and generate new temporary access keys.

However, if you have access keys then you do not need to use AssumeRole().

If your machine is outside of AWS, then you will need to use access keys, or an authentication / authorization service like Cognito.

IAM roles are designed for services such as Redshift, EC2, etc., which have permission to call STS with your role ARN to generate new temporary access keys. Roles are not designed to be used from outside AWS (there are exceptions, such as Cognito).
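For reference, here is a minimal sketch of that STS flow with boto3. It assumes base access keys are already configured (e.g. via environment variables), and the role ARN and session name are placeholders:

import boto3

# Call STS with the existing base credentials; AssumeRole() returns
# temporary credentials scoped to the role.
sts = boto3.client('sts')
response = sts.assume_role(
    RoleArn='arn:aws:iam::<aws-account-id>:role/<role_name>',
    RoleSessionName='s3-upload-session'
)
creds = response['Credentials']

# Build an S3 resource from the temporary credentials.
s3 = boto3.resource(
    's3',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken']
)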

[Edit after new comment]

You have several solutions:

  • Signed URLs. Assign the role to EC2. Then have EC2 create signed URLs that you can use locally to upload files to S3. This keeps the access keys off your system (a minimal sketch follows this list).
  • Use Cognito. Cognito is easy to work with and there are lots of code examples on the Internet. Cognito will provide authentication, authorization and temporary credentials for you.
  • Assign your role to EC2 so that EC2 can upload to S3. Then you have the issue of getting the file to EC2 and paying for the extra bandwidth (EC2 -> S3). You can use SSH and SCP to copy files securely to EC2 and then launch a process to copy to S3.
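A minimal sketch of the signed-URL option: the URL is generated on the EC2 instance that holds the role, and the upload then runs on the local machine with no credentials at all. The bucket name, key, and the use of the requests library are assumptions here:

import boto3
import requests

# On the EC2 instance with the role attached: create a presigned PUT URL.
s3_client = boto3.client('s3')
url = s3_client.generate_presigned_url(
    'put_object',
    Params={'Bucket': 'drops', 'Key': 'unload/output.csv'},
    ExpiresIn=3600  # the URL expires after one hour
)

# On the local machine: upload the file with a plain HTTP PUT.
with open('output.csv', 'rb') as f:
    requests.put(url, data=f)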
John Hanley
  • That's a shame. What about connecting to EC2 and from there using the IAM role to upload the file to S3? I know this is kind of backwards, but our DevOps won't allow us to use access keys no matter what. – user8834780 Jan 16 '18 at 19:10
  • Since we use Okta, I think I will need to create temp credentials via the AWS CLI, and then reference them to put the object into S3 (see the sketch after these comments). – user8834780 Jan 16 '18 at 20:08
  • That will work. They have a very interesting solution. This is the link that I reviewed: https://support.okta.com/help/Documentation/Knowledge_Article/Integrating-the-Amazon-Web-Services-Command-Line-Interface-Using-Okta – John Hanley Jan 16 '18 at 20:22
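Once that CLI/Okta flow has produced temporary credentials, they can be passed to boto3 explicitly. A minimal sketch, with all three values as placeholders:

import boto3

# Temporary credentials obtained out of band (e.g. from the Okta/AWS CLI
# integration); note the session token, which long-term keys do not have.
session = boto3.Session(
    aws_access_key_id='<temporary-access-key-id>',
    aws_secret_access_key='<temporary-secret-access-key>',
    aws_session_token='<temporary-session-token>'
)
s3 = session.resource('s3')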

If you are running this script from an EC2 instance, attach an IAM role to the instance. The IAM role should contain the following policy (in addition to what you already have).

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "statement1",
            "Effect": "Allow",
            "Action":   ["s3:PutObject"],
            "Resource": "arn:aws:s3:::examplebucket/*"
        }
    ]
}

If you are not running this script on an EC2 instance, you need to use access and secret keys.
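A minimal sketch of the desktop case, with placeholder key values; on an EC2 instance with the role attached, both key arguments can be omitted and boto3 picks up credentials from the instance metadata automatically:

import boto3

# Desktop case: supply long-term access keys explicitly (placeholders).
# On EC2 with the IAM role attached, drop both arguments entirely.
s3 = boto3.resource(
    's3',
    aws_access_key_id='<access-key-id>',
    aws_secret_access_key='<secret-access-key>'
)

with open('output.csv', 'rb') as data:
    s3.Bucket('examplebucket').put_object(Key='output.csv', Body=data)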

krishna_mee2004
  • Thank you for your answer. Unfortunately I am not 100% clear on what to do here. I am running script.py via `python script.py` in the terminal. Where are you saying I should add the above clause? Assume my IAM role is `arn:aws:iam:0000:role/aim_role`, the bucket name is `drops` and the bucket key is `unload`. If I add this to my script I still get the "Unable to locate credentials" error. – user8834780 Jan 16 '18 at 18:58
  • Where is the script running in - in your desktop or in an amazon EC2 instance? – krishna_mee2004 Jan 16 '18 at 19:02
  • Desktop, which seems to be the issue based on your comment. Is there a way for me to connect to EC2 via Python on the desktop to run the script from "within AWS"? – user8834780 Jan 16 '18 at 19:09
  • IAM roles can be attached to AWS resources only (in this case Amazon EC2 instances). For running this script in your desktop, you need to use access and secret keys only. – krishna_mee2004 Jan 16 '18 at 19:11
  • So this will be a no-go? https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_mfa_sample-code.html – user8834780 Jan 16 '18 at 19:19