
I have been trying to access an S3 bucket from a Python program running on an EC2 instance. The code and the error are attached:

from boto.s3.connection import S3Connection
import boto

conn = S3Connection()
bucket = conn.get_bucket('nplr1')

Error:

Traceback (most recent call last):
  File "Main.py", line 140, in <module>
    main()
  File "Main.py", line 33, in main
    conn.get_all_buckets()
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 444, in get_all_buckets
    response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>InvalidAccessKeyId</Code><Message>The AWS Access Key Id you provided does not exist in our records.</Message>

This is my /etc/boto.cfg file:

[Credentials]
aws_access_key_id = 'id'
aws_secret_access_key = 'key'

[s3]
region = 'ap-south-1'
aws_access_key_id = 'id'
aws_secret_access_key = 'key'

What is the issue with this? Why am I not able to access the bucket?

  • I believe 'host' should be 'region'. – hurturk Feb 20 '17 at 15:58
  • Hey @zatta, I tried replacing host with region and I got this error: boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden – Sukeerth Cheruvu Feb 20 '17 at 16:03
  • Is the bucket name correct? If yes, do the keys provided have access to it? – franklinsijo Feb 20 '17 at 16:14
  • Hi @franklinsijo, I edited my question. I am getting the error "The AWS Access Key Id you provided does not exist in our records." Do I have to change anything in the AWS config file? – Sukeerth Cheruvu Feb 20 '17 at 16:33
  • Are your keys correct? And why do you have two sets of keys? – franklinsijo Feb 20 '17 at 16:34
  • Both sets are actually the same. With all those errors, I have been trying a few things and this is one of them. – Sukeerth Cheruvu Feb 20 '17 at 16:37
  • Check whether the keys are correct and still valid. And you do not have credentials defined in any other file, right? Can you run `aws configure list` and confirm this is the only file with keys? – franklinsijo Feb 20 '17 at 16:40
  • Yup, that is the only file containing keys – Sukeerth Cheruvu Feb 20 '17 at 16:52
  • 1
    Check the region if it's correct. There is no ap-south-1 region. I see that there are Asia Pacific (Singapore) ap-southeast-1 apigateway.ap-southeast-1.amazonaws.com HTTPS Asia Pacific (Sydney) ap-southeast-2 apigateway.ap-southeast-2.amazonaws.com HTTPS – Alex Feb 21 '17 at 12:43
  • Both my EC2 and S3 are located in ap-south-1 region. When I tried `aws s3 ls` I got my bucket name as output, In my program too, I am able to list my bucket using get_all_buckets() function but Ikeep getting ResponseError 400: Bad Request when I use get_bucket() method. – Sukeerth Cheruvu Feb 21 '17 at 14:32

2 Answers


Your code is not accessing the [s3] profile within your credentials file. Profiles need to be explicitly requested when creating the client connection. Therefore, the region probably isn't being picked up.

I recommend you test your credentials via the AWS Command-Line Interface (CLI). For example, running `aws s3 ls` will test whether you have permission to list your Amazon S3 buckets. If that works, you can run further commands to test your permissions (e.g. `aws s3 ls s3://nplr1`).
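If the CLI works but the boto code still fails, another quick check is to pass the credentials to the connection explicitly, bypassing /etc/boto.cfg altogether. This is a minimal sketch with placeholder key values, my addition rather than part of the original answer:

from boto.s3.connection import S3Connection

# Placeholder credentials -- substitute the real key pair.
# Passing them explicitly bypasses the boto config files, so if this
# succeeds the problem lies in how /etc/boto.cfg is being read.
conn = S3Connection(
    aws_access_key_id='AKIA_PLACEHOLDER',
    aws_secret_access_key='SECRET_PLACEHOLDER'
)
print(conn.get_all_buckets())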

  • Hey @John, I tried running the command but I got the same error. I tried the command `aws configure` and added the credentials. Then I tried `aws s3 ls` and found my bucket successfully. But when I run the program I get error 400: Bad Request – Sukeerth Cheruvu Feb 21 '17 at 14:15
  • @John There is `ap-south-1` region. It was recently added. – franklinsijo Feb 21 '17 at 18:10
  • Ah! I managed to reproduce your `400: Bad Request` error. See my other answer. – John Rotenstein Feb 21 '17 at 21:54

The problem is that ap-south-1 requires Signature Version 4 for signing requests, and boto doesn't use it by default. Neither does boto3.

However, you can override the signature configuration. Here's some working code in boto3:

import boto3
from botocore.client import Config

# Request SigV4 signing explicitly, since ap-south-1 requires it
s3 = boto3.resource('s3', config=Config(signature_version='s3v4'))
bucket = s3.Bucket('nplr1')
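If boto3 still can't determine the region from your config files, the region can also be passed explicitly when building the resource. This is an extra safeguard I'm suggesting, not something the original approach required:

import boto3
from botocore.client import Config

# region_name is redundant if ~/.aws/config already sets ap-south-1;
# it is shown here only to make the region explicit.
s3 = boto3.resource(
    's3',
    region_name='ap-south-1',
    config=Config(signature_version='s3v4')
)
bucket = s3.Bucket('nplr1')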

For boto, I managed to get it working by explicitly stating the host for ap-south-1:

from boto.s3.connection import S3Connection
import boto

conn = S3Connection(host='s3.ap-south-1.amazonaws.com')
bucket = conn.get_bucket('nplr1')
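If the 400: Bad Request persists with plain boto, it may also be necessary to switch boto's S3 signing to SigV4. As far as I know this can be done with the S3_USE_SIGV4 environment variable (or a use-sigv4 option in the [s3] section of the boto config), but treat the following as an assumption rather than part of the answer above:

import os

# Assumption: ask boto to sign S3 requests with SigV4.
# Must be set before the connection is created.
os.environ['S3_USE_SIGV4'] = 'True'

from boto.s3.connection import S3Connection

conn = S3Connection(host='s3.ap-south-1.amazonaws.com')
bucket = conn.get_bucket('nplr1')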

Useful information was gained from:
