19

I am using boto3 in an AWS Lambda function to fetch an object from S3 in the Frankfurt region.

Signature Version 4 is necessary; otherwise the following error is returned:

"errorMessage": "An error occurred (InvalidRequest) when calling 
the GetObject operation: The authorization mechanism you have 
provided is not supported. Please use AWS4-HMAC-SHA256."

I found how to configure signature_version in the boto3 configuration guide: http://boto3.readthedocs.org/en/latest/guide/configuration.html

But since I am using AWS Lambda, I do not have access to the underlying configuration files or profiles.

The code of my AWS Lambda function:

from __future__ import print_function
import boto3


def lambda_handler(event, context):
    input_file_bucket = event["Records"][0]["s3"]["bucket"]["name"]
    input_file_key = event["Records"][0]["s3"]["object"]["key"]
    input_file_name = input_file_bucket + "/" + input_file_key

    s3 = boto3.resource("s3")
    obj = s3.Object(bucket_name=input_file_bucket, key=input_file_key)
    response = obj.get()
    return event  # echo first key

Is it possible to configure signature_version within this code, for example by using a Session? Or is there any workaround for this?

Hello lad

3 Answers

30

Instead of using the default session, try using a custom session and Config from boto3.session:

import boto3
import boto3.session
session = boto3.session.Session(region_name='eu-central-1')
s3client = session.client('s3', config=boto3.session.Config(signature_version='s3v4'))
s3client.get_object(Bucket='<Bkt-Name>', Key='S3-Object-Key')
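Applied to the handler in the question, the same client configuration would look roughly like this (a sketch, not tested against the OP's setup; the bucket and key still come from the S3 event record):

import boto3
import boto3.session

def lambda_handler(event, context):
    bucket = event["Records"][0]["s3"]["bucket"]["name"]
    key = event["Records"][0]["s3"]["object"]["key"]

    # Session pinned to Frankfurt, client forced to SigV4 signing
    session = boto3.session.Session(region_name='eu-central-1')
    s3client = session.client('s3', config=boto3.session.Config(signature_version='s3v4'))

    response = s3client.get_object(Bucket=bucket, Key=key)
    body = response['Body'].read()  # raw bytes of the object
    return event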
omuthu
  • Is there a way to configure this from a file? I'm asking because I'm using a piece of code where `boto3` is the dependency, so I don't have direct access to change the `client()` call. – bstempi Oct 28 '16 at 20:01
  • you can set boto3.session.Session(profile_name='profile1') where profile1 is the name of the profile defined in .aws/credentials file with AWS keys, tokens, desired region and other necessary params – omuthu Oct 28 '16 at 22:24
  • What if I'm not using AWS keys and I'm instead relying on the EC2 instance's metadata service? – bstempi Oct 30 '16 at 15:58
  • then just go create a client without specifying keys or any other param, s3client = boto3.client() – omuthu Oct 31 '16 at 21:32
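Regarding the file-based configuration asked about in the comments: when a shared config file is available (not the case inside Lambda), signature_version can also be set per profile and picked up by a session. A rough sketch, assuming a profile named profile1 and the standard ~/.aws/config s3 block:

# Assumed ~/.aws/config contents:
#
#   [profile profile1]
#   region = eu-central-1
#   s3 =
#       signature_version = s3v4

import boto3.session

session = boto3.session.Session(profile_name='profile1')
s3client = session.client('s3')  # region and signature_version come from the profile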
6

I tried the session approach, but I had issues. This method worked better for me; your mileage may vary:

s3 = boto3.resource('s3', config=Config(signature_version='s3v4'))

You will need to import Config from botocore.client in order to make this work. See below for a functional method to test a bucket (list its objects). This assumes you are running it from an environment where your authentication is managed, such as Amazon EC2 or Lambda with an IAM role:

import boto3
from botocore.client import Config
from botocore.exceptions import ClientError


def test_bucket(bucket):
    print('testing bucket: ' + bucket)
    try:
        s3 = boto3.resource('s3', config=Config(signature_version='s3v4'))
        b = s3.Bucket(bucket)
        objects = b.objects.all()

        for obj in objects:
            print(obj.key)
        print('bucket test SUCCESS')
    except ClientError as e:
        print('Client Error')
        print(e)
        print('bucket test FAIL')

To test it, simply call the method with a bucket name. Your role will have to grant proper permissions.
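For example (the bucket name here is just a placeholder):

test_bucket('my-example-bucket')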

Andy G
0

Using a resource worked for me.

from botocore.client import Config
import boto3

# Wrapped in a function here so the snippet stands alone; in my code the
# return lives inside a larger helper, and AIRFLOW_BUCKET is defined elsewhere.
def presigned_url(key, expTime):
    s3 = boto3.resource("s3", config=Config(signature_version="s3v4"))
    return s3.meta.client.generate_presigned_url(
        "get_object", Params={"Bucket": AIRFLOW_BUCKET, "Key": key}, ExpiresIn=expTime
    )
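Called with an object key and an expiry in seconds (placeholder values below), the wrapper returns a time-limited URL for the object:

url = presigned_url("some/object.key", 3600)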
Ashish Cherian