13

I have a problem using the Python boto SDK with S3 buckets in the Frankfurt region. According to the Amazon documentation, this region only supports Signature Version 4, and the boto documentation explains how to add V4 support to the SDK. I have added a new config section:

import boto

if not boto.config.get('s3', 'use-sigv4'):
    boto.config.add_section('s3')
    boto.config.set('s3', 'use-sigv4', 'True')

Then I created a new connection and fetched all buckets:

from boto.s3.connection import S3Connection

connection = S3Connection(accesskey, secretkey, host=S3Connection.DefaultHost)
buckets = connection.get_all_buckets()

That works fine, but then I tried to get all keys for my bucket:

for bucket in buckets:
    bucket.get_all_keys()

and I got the following:

S3ResponseError: 400 Bad Request
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AuthorizationHeaderMalformed</Code><Message>The authorization header is malformed; the region 'us-east-1' is wrong; expecting 'eu-central-1'</Message><Region>eu-central-1</Region>

Why does this error occur? As a workaround, I connected directly to the region and got all the data I needed:

region_con = boto.s3.connect_to_region('eu-central-1',
                                       aws_access_key_id=accesskey,
                                       aws_secret_access_key=secretkey)
bucket = region_con.get_bucket(bucket.name)
bucket.get_all_keys()

How can I fix it properly?

Oleg
  • What happens if you actually make the change in your boto config file rather than trying to do it programmatically? – garnaat Dec 10 '14 at 13:18
  • Yes, I tried it before, but got the same result. What difference should your approach have made? – Oleg Dec 10 '14 at 13:56
  • Probably none but you are only changing the value of the in-memory config in your environment. If another config was being created somewhere else it would not get the updates because it would be reading the config directly from the config file. I just wondered if that would make any difference. – garnaat Dec 10 '14 at 14:01

4 Answers

8

I had the same issue with boto in the Frankfurt region and got errors about the wrong region. The solution for me was simply to point the host to 's3.eu-central-1.amazonaws.com' (the endpoint listed at http://docs.aws.amazon.com/general/latest/gr/rande.html) instead of the default 's3.amazonaws.com':

import boto.s3

s3 = boto.s3.connect_to_region('eu-central-1',
                               aws_access_key_id=accesskey,
                               aws_secret_access_key=secretkey,
                               host='s3.eu-central-1.amazonaws.com')
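
From there, listing keys works the same way as in the question; the bucket name below is just a placeholder for one of your own buckets:

bucket = s3.get_bucket('my-frankfurt-bucket')  # placeholder bucket name
keys = bucket.get_all_keys()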
hsrv
2

Try removing the s3 section from the boto config; the following code works for me:

import boto

if 's3' in boto.config.sections():
    boto.config.remove_section('s3')
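
Presumably this is then combined with connecting to the bucket's region explicitly, as in the question. A minimal sketch, with placeholder credentials and bucket name:

import boto.s3

# Placeholder credentials and bucket name -- substitute your own.
conn = boto.s3.connect_to_region('eu-central-1',
                                 aws_access_key_id='ACCESS_KEY',
                                 aws_secret_access_key='SECRET_KEY')
bucket = conn.get_bucket('my-frankfurt-bucket')
keys = bucket.get_all_keys()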

pseudonym
1

hsrv's answer above works for boto 2. For boto3, the following is broadly equivalent:

import boto3

s3 = boto3.client('s3', region_name='eu-central-1')

Alternatively, you can set the region field in your .aws/config:

[default]
output = json
region = eu-central-1

This sets the default region; you can still pick a specific region in Python as above.
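
To get the equivalent of get_all_keys() with boto3, something like the following should work; the bucket name is a placeholder, and the paginator handles buckets with more than 1000 keys:

import boto3

s3 = boto3.client('s3', region_name='eu-central-1')

# Page through all keys in the bucket (placeholder name).
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='my-frankfurt-bucket'):
    for obj in page.get('Contents', []):
        print(obj['Key'])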

The significance of the region varies from service to service (for example, assuming you're not sat in a VPC, you can access an S3 bucket from anywhere). In this case, however, the important thing is that newer regions (such as Frankfurt) only support the newer authentication scheme (AWS4-HMAC-SHA256). Boto runs into problems if you try to connect to anything in such a region from a region that still uses the old scheme (such as Dublin).
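
If you want to force the newer scheme explicitly rather than rely on the region default, boto3/botocore also let you request it through the client config; a minimal sketch:

import boto3
from botocore.config import Config

# signature_version='s3v4' requests AWS4-HMAC-SHA256 signing.
s3 = boto3.client('s3',
                  region_name='eu-central-1',
                  config=Config(signature_version='s3v4'))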

Rob Hague
  • It's not an answer. You're going out from Frankfurt instead of suggesting a way to enable V4. – Nikolay Fominyh Mar 17 '17 at 07:00
  • The question was about accessing S3 buckets in Frankfurt using boto. Specifying the Frankfurt region explicitly is one way to do this (enabling v4 authentication is another). – Rob Hague Mar 20 '17 at 09:17
0

For boto 2, adding this to the .boto config file worked:

[s3]
use-sigv4 = True
host = s3.eu-central-1.amazonaws.com
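
For reference, S3Connection.DefaultHost is read from this [s3] host setting, so the question's original connection code should now point at the Frankfurt endpoint. A minimal sketch, with placeholder credentials and bucket name:

from boto.s3.connection import S3Connection

# DefaultHost now resolves to s3.eu-central-1.amazonaws.com via ~/.boto.
conn = S3Connection('ACCESS_KEY', 'SECRET_KEY', host=S3Connection.DefaultHost)
bucket = conn.get_bucket('my-frankfurt-bucket')  # placeholder bucket name
keys = bucket.get_all_keys()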
storm_m2138