2

I have an AWS setup that requires me to assume a role and get the corresponding credentials in order to write to S3. For example, to write with the AWS CLI, I need to use the --profile readwrite flag. If I write code myself with boto3, I'd assume the role via STS, get credentials, and create a new session.
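Roughly, that manual approach looks like this (a sketch; the role ARN and session name are placeholders):

import boto3

sts = boto3.client('sts')
response = sts.assume_role(
    RoleArn='arn:aws:iam::123456789012:role/readwrite',  # placeholder ARN
    RoleSessionName='readwrite-session',
)
credentials = response['Credentials']
session = boto3.Session(
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)
s3 = session.resource('s3')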

However, there are a bunch of applications and packages that rely on boto3's default configuration, e.g. internal code runs like this:

s3 = boto3.resource('s3')
result_s3 = s3.Object(bucket, s3_object_key)
result_s3.put(
    Body=value.encode(content_encoding),
    ContentEncoding=content_encoding,
    ContentType=content_type,
)

According to the documentation, boto3 can be pointed at a default profile using (among others) the AWS_PROFILE environment variable, and this clearly "works" in the sense that boto3.Session().profile_name matches the variable - but the applications still won't write to S3.
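For reference, this is roughly how I set and verify the profile (a sketch; 'readwrite' matches the profile name in my AWS config):

import os
import boto3

os.environ['AWS_PROFILE'] = 'readwrite'  # same effect as exporting it in the shell
print(boto3.Session().profile_name)      # prints 'readwrite', yet the put calls still fail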

What would be the cleanest/correct way to set this up properly? I tried to pull credentials from STS and write them out as AWS_SECRET_TOKEN etc., but that didn't work for me...
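What I attempted was along these lines, reusing the Credentials dict from the STS call above (these are the standard variable names boto3 reads):

import os

os.environ['AWS_ACCESS_KEY_ID'] = credentials['AccessKeyId']
os.environ['AWS_SECRET_ACCESS_KEY'] = credentials['SecretAccessKey']
os.environ['AWS_SESSION_TOKEN'] = credentials['SessionToken']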

Philipp_Kats

2 Answers

0

Have a look at the answer here: How to choose an AWS profile when using boto3 to connect to CloudFront

You can get boto3 to use the other profile like so:

import boto3
# a session bound to the 'readwrite' profile
rw = boto3.session.Session(profile_name='readwrite')
s3 = rw.resource('s3')
Nathan Williams
  • Thanks Nathan. I am afraid I cannot do that, because boto3 is used within the application I use (Great Expectations), which I run via the terminal, and I hope I won't need to maintain a custom copy – Philipp_Kats Nov 27 '20 at 21:56
  • Maybe something like this can help: https://ripon-banik.medium.com/aws-assume-role-script-4e7d3b548f20 Look at example 1, put that in a bash script, and source it before you call your application (you will need jq installed, and you can ignore the last 3 lines of the script if you only need it for this one use case) – Nathan Williams Nov 27 '20 at 23:29
0

I think the correct answer to my question is the one shared by Nathan Williams in the comments.

In my specific case, given that I had to initiate the code from Python and was a bit worried that setting AWS environment variables might spill over into other operations, I used the fact that boto3 has a DEFAULT_SESSION singleton, which is used on each call, and simply overwrote it with a session that assumes the proper role:

# S3Hook is Airflow's S3 hook (import path depends on the Airflow version)
hook = S3Hook(aws_conn_id=aws_conn_id)
# overwrite boto3's default session so later boto3.resource()/client() calls use it
boto3.DEFAULT_SESSION = hook.get_session()

(here, S3Hook is Airflow's S3 handling object). After that, everything in the same runtime worked perfectly.
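For completeness, the effect is that unmodified downstream code, like the snippet from the question, now transparently runs under the assumed role (a sketch; the variables are the same placeholders as in the question):

import boto3

s3 = boto3.resource('s3')  # reuses boto3.DEFAULT_SESSION set above
s3.Object(bucket, s3_object_key).put(Body=value.encode(content_encoding))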

Philipp_Kats