219

When I run the following code, I always get this error:

    botocore.exceptions.NoCredentialsError: Unable to locate credentials

import uuid
import boto3

s3 = boto3.resource('s3')
bucket_name = "python-sdk-sample-%s" % uuid.uuid4()
print("Creating new bucket with name:", bucket_name)
s3.create_bucket(Bucket=bucket_name)

I have saved my credentials file at

C:\Users\myname\.aws\credentials

which is where Boto should read my credentials from.

Is my setting wrong?

Here is the output from boto3.set_stream_logger('botocore', level='DEBUG').

2015-10-24 14:22:28,761 botocore.credentials [DEBUG] Skipping environment variable credential check because profile name was explicitly set.
2015-10-24 14:22:28,761 botocore.credentials [DEBUG] Looking for credentials via: env
2015-10-24 14:22:28,773 botocore.credentials [DEBUG] Looking for credentials via: shared-credentials-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: config-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: ec2-credentials-file
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: boto-config
2015-10-24 14:22:28,774 botocore.credentials [DEBUG] Looking for credentials via: iam-role
Smart Manoj
d-_-b
  • 5
    Can you post the debug output by adding `boto3.set_stream_logger('botocore', level='DEBUG')` before your code? It will show where it's looking for credentials. – jamesls Oct 23 '15 at 16:07
  • It seems that Boto looks in quite a few locations for the credential config file, but apparently does not look into my home directory for some reason... – d-_-b Oct 24 '15 at 05:29
  • 3
    Try setting the environment variable ``HOME`` to point to ``C:\Users\myname`` or setting ``AWS_SHARED_CREDENTIALS_FILE`` to point directly to your credentials file. – garnaat Oct 24 '15 at 13:29
  • 1
    I set the env variable HOME as you described, but now am getting the following error: `botocore.exceptions.NoRegionError: You must specify a region.` My config file, located in the same folder as my credentials, contains: `[default] ap-northeast-1` – d-_-b Oct 24 '15 at 14:55
  • 1
    I was able to fix the problem using [garnaat's comment](https://stackoverflow.com/questions/33297172/boto3-error-botocore-exceptions-nocredentialserror-unable-to-locate-credential#comment54435414_33297172). – Mathieu Dhondt Sep 29 '17 at 07:09
  • Also good to note that if you use Amazon's client to input your credentials, it only seems to work for the user you were at the time. If you run the script as root, but you didn't save the credentials as root, it won't find them. – Andrew Oct 18 '17 at 22:50
  • If you have enough RAM, just load a VM and throw in Ubuntu; it will save all the Python Windows path hassle. – mootmoot Jan 11 '18 at 18:30

17 Answers

168

Try specifying the keys manually:

    s3 = boto3.resource('s3',
         aws_access_key_id=ACCESS_ID,
         aws_secret_access_key=ACCESS_KEY)

For security reasons, make sure you don't include your ACCESS_ID and ACCESS_KEY directly in the code. Consider reading them from environment configs and injecting them into the code, as suggested by @Tiger_Mike.

For production environments, consider using rotating access keys: https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html#Using_RotateAccessKey

Muhammad Dyas Yaskur
SHASHANK MADHAV
  • This approach is useful when working directly with Django. Thank you. – Joepreludian Jun 16 '17 at 14:01
  • 4
    This is more dangerous as you are putting your secrets in your code which could end up in version control. – nu everest Feb 22 '18 at 23:05
  • 8
    @nueverest This is correct, but you can avoid this by moving the declaration to a settings file and then injecting via environment variables. – Tiger_Mike Mar 16 '18 at 14:24
  • 1
    Though this works, I would say it's not following best practices. – ben jarman Jun 05 '18 at 21:31
  • 1
    Thanks. This can be used as a temp fix in a dev setup. Loading these variables from a `.env` file (not committed) would be ideal, and better than having to pick them from the `~/.aws/` folder. – SuperNova Aug 01 '18 at 12:29
  • Can someone explain why it is better to save access_id and access_key in .env instead of ~/.aws folder (from boto3 doc)? – haneulkim Feb 27 '22 at 06:08
  • You can exclude the .env file from source control so it has the correct, different values for each environment (dev/qa/prd); that way you avoid sharing this info with the whole team (or the whole internet if you are public on GitHub). – Eliezer Garza Mar 31 '22 at 19:54
111

I had the same issue and found out that the format of my ~/.aws/credentials file was wrong.

It worked with a file containing:

[default]
aws_access_key_id=XXXXXXXXXXXXXX
aws_secret_access_key=YYYYYYYYYYYYYYYYYYYYYYYYYYY

Note that there must be a profile named "[default]". Some official documentation makes reference to a profile named "[credentials]", which did not work for me.

Roelant
Fernando Ciciliati
  • 3
    Works on windows too (C:\Users\User\.aws\credentials) – Mr_and_Mrs_D Jun 18 '17 at 14:26
  • 7
    you can specify which profile to use in boto3 using session = boto3.Session(profile_name=) – Mattia Paterna Sep 11 '17 at 09:42
  • 1
    Using `aws configure` also works if you have aws-cli installed – radtek Aug 03 '18 at 15:24
  • 3
    I was running it via Ansible, so another thing to look for is whether you become a different user while running the command. Make sure you're not doing it with `sudo`, for example, or it will try to access root's AWS credentials instead and fail if they don't exist. – radtek Nov 15 '18 at 04:17
  • 3
    You don't need to have a default profile; you can set the environment variable AWS_PROFILE to any profile you want (_credentials_, for example): `export AWS_PROFILE=credentials`. When you execute your code, it'll check the AWS_PROFILE value and take the corresponding credentials from the .aws\credentials file (in this example, it'll search for the _credentials_ profile). – Ichigo Oct 14 '21 at 19:51
  • Yes my man :) you saved me – Arar Feb 21 '22 at 22:02
43

If you are looking for an alternative way, try adding your credentials using the AWS CLI.

From the terminal, type:

aws configure

then fill in your keys and region.
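For reference, `aws configure` stores what you enter in two plain-text files under `~/.aws`, roughly like this (placeholder values):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[default]
region = us-east-1
output = json
```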

Amri
35

Make sure your ~/.aws/credentials file on Unix looks like this:

[MyProfile1]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey

[MyProfile2]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey

Your Python script should look like this, and it'll work:

from __future__ import print_function
import boto3
import os

os.environ['AWS_PROFILE'] = "MyProfile1"
os.environ['AWS_DEFAULT_REGION'] = "us-east-1"

ec2 = boto3.client('ec2')

# Retrieves all regions/endpoints that work with EC2
response = ec2.describe_regions()
print('Regions:', response['Regions'])

Source: https://boto3.readthedocs.io/en/latest/guide/configuration.html#interactive-configuration.

cjs
TheWalkingData
  • 1
    The `output = json` normally is placed in the `~/.aws/config` in a `[profile MyProfile1]` section. It may not work if specified in the `credentials` file instead. – cjs Oct 05 '18 at 03:33
  • @Curt J. Sampson Without checking, I am sure you are right. Thanks for the correction. – TheWalkingData Feb 12 '19 at 16:32
  • I did export AWS_PROFILE=myprofle and it didn't work, but this worked. Any explanation of why that might be happening? – Adarsh Trivedi Jun 01 '20 at 17:55
15

I also had the same issue. It can be solved by creating config and credentials files in the home directory. Below are the steps I followed to solve this issue.

Create a config file:

touch ~/.aws/config

And in that file, enter the region:

[default]
region = us-west-2

Then create the credentials file:

touch ~/.aws/credentials

Then enter your credentials:

[Profile1]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXX 
aws_secret_access_key = YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY

After setting all of these, here is my Python file to connect to the bucket. Running this file will list all the buckets.

import boto3
import os

os.environ['AWS_PROFILE'] = "Profile1"
os.environ['AWS_DEFAULT_REGION'] = "us-west-2"

s3 = boto3.client('s3', region_name='us-west-2')
print("[INFO:] Connecting to cloud")

# Retrieves all regions/endpoints that work with S3

response = s3.list_buckets()
print('Regions:', response)


Nija I Pillai
13

From the terminal, type:

aws configure

then fill in your keys and region.

After this, as the next step, create a session for whichever environment you need. You can have multiple sets of keys depending on your account, and manage multiple environments or keys this way:

import boto3
aws_session = boto3.Session(profile_name="prod")
# Create an S3 client
s3 = aws_session.client('s3')
Avinash Dalvi
13

Create an S3 client object with your credentials:

AWS_S3_CREDS = {
    "aws_access_key_id":"your access key", # os.getenv("AWS_ACCESS_KEY")
    "aws_secret_access_key":"your aws secret key" # os.getenv("AWS_SECRET_KEY")
}
s3_client = boto3.client('s3',**AWS_S3_CREDS)

It is always better to get the credentials from the OS environment.

To set the environment variables, run the following commands in the terminal.

If Linux or Mac:

$ export AWS_ACCESS_KEY="aws_access_key"
$ export AWS_SECRET_KEY="aws_secret_key"

If Windows:

C:\> set AWS_ACCESS_KEY=aws_access_key
C:\> set AWS_SECRET_KEY=aws_secret_key
kathir raja
10

Exporting the credentials also works. In Linux:

export AWS_SECRET_ACCESS_KEY="XXXXXXXXXXXX"
export AWS_ACCESS_KEY_ID="XXXXXXXXXXX"
ahmed meraj
6

These instructions are for a Windows machine with a single user profile for AWS. Make sure your ~/.aws/credentials file looks like this:

[profile_name]
aws_access_key_id = yourAccessId
aws_secret_access_key = yourSecretKey

I had to set the AWS_DEFAULT_PROFILE environment variable to the profile_name found in your credentials.
Then my Python was able to connect. E.g., from here:

import boto3

# Let's use Amazon S3
s3 = boto3.resource('s3')

# Print out bucket names
for bucket in s3.buckets.all():
    print(bucket.name)
hru_d
  • 1
    If you set the environment variable on Win10 in the machine section, you will probably need to do a reboot too. – Trevor Apr 18 '19 at 06:48
  • 1
    @Trevor, I tested this on a windows 7 machine with Jupyter notebook, I had to restart the Jupyter server and it worked for me, but I think reboot would be a good idea. – hru_d Apr 22 '19 at 10:44
6

I work for a large corporation and encountered this same error, but needed a different workaround. My issue was related to proxy settings. I had my proxy set up, so I needed to set my no_proxy to whitelist AWS before I was able to get everything to work. You can set it in your bash script as well if you don't want to muddy up your Python code with os settings.

Python:

import os
os.environ["NO_PROXY"] = "s3.amazonaws.com"

Bash:

export no_proxy="s3.amazonaws.com"

Edit: The above assumes a US East S3 region. For other regions, use s3.[region].amazonaws.com, where region is something like us-east-1 or us-west-2.
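Putting the edit together with the original snippet, the regional endpoint string can be built like this (a sketch; `us-west-2` is just an example region):

```python
import os

# Example region; substitute your bucket's region
region = "us-west-2"

# Bypass the corporate proxy for the regional S3 endpoint
os.environ["NO_PROXY"] = "s3.%s.amazonaws.com" % region
print(os.environ["NO_PROXY"])  # s3.us-west-2.amazonaws.com
```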

JJFord3
  • 2
    I had a similar issue - but had to say `no_proxy` for `169.254.169.254` so that the AWS client could get to the metadata service to find the instance profile. – Ralph Bolton Oct 15 '19 at 10:59
  • In fact this works when you are running a local setup (DynamoDB) and trying to connect to it. I was getting this error when running in OFFLINE mode without deploying. – Milind Deore May 14 '22 at 10:22
4

If you have multiple AWS profiles in ~/.aws/credentials like...

[Profile 1]
aws_access_key_id = *******************
aws_secret_access_key = ******************************************
[Profile 2]
aws_access_key_id = *******************
aws_secret_access_key = ******************************************

Follow two steps:

  1. Make the one you want to use the default with the `export AWS_DEFAULT_PROFILE="Profile 1"` command in the terminal (the quotes are needed because the profile name contains a space).

  2. Make sure to run the above command in the same terminal from which you use boto3 or open your editor. [Understand the following scenario]

Scenario:

  • If you have two terminals open, called t1 and t2.
  • If you run the export command in t1 but open JupyterLab or any other tool from t2, you will get the NoCredentialsError: Unable to locate credentials error.

Solution:

  • Run the export command in t1 and then open JupyterLab or any other tool from the same terminal, t1.
3

In the case of MLflow, a call to mlflow.log_artifact() will raise this error if you cannot write to the AWS S3/MinIO data lake.

The cause is not having set up the credentials in your Python environment (as these two env vars):

os.environ['DATA_AWS_ACCESS_KEY_ID'] = 'login'
os.environ['DATA_AWS_SECRET_ACCESS_KEY'] = 'password'

Note that you may also access MLflow artifacts directly using the MinIO client (which requires a separate connection to the data lake, apart from MLflow's connection). This client can be started like this:

import os

import minio

minio_client_mlflow = minio.Minio(
    os.environ['MLFLOW_S3_ENDPOINT_URL'].split('://')[1],
    access_key=os.environ['AWS_ACCESS_KEY_ID'],
    secret_key=os.environ['AWS_SECRET_ACCESS_KEY'],
    secure=False)
mirekphd
  • in my case I had to set os.environ for all: MLFLOW_S3_ENDPOINT_URL, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY – gndps Jul 27 '22 at 01:27
1

I solved the problem like this:

aws configure

Afterwards I manually entered:

AWS Access Key ID [None]: xxxxxxxxxx
AWS Secret Access Key [None]: xxxxxxxxxx
Default region name [None]: us-east-1
Default output format [None]: just hit enter

After that, it worked for me.

Gianmarco G
0

Boto3 looks for the credentials in a folder like

C:\ProgramData\Anaconda3\envs\tensorflow\Lib\site-packages\botocore\.aws

You should save two files in this folder: credentials and config.

You may want to check out the general order in which boto3 searches for credentials at this link. Look under the Configuring Credentials subheading.

Samuel Nde
0

If you're sure you configured your AWS correctly, just make sure the user running the project can read from ~/.aws, or just run your project as root.

0

I just had this problem. This is what worked for me:

pip install botocore==1.13.20

Source: https://github.com/boto/botocore/issues/1892

Smart Manoj
0

In case of using AWS:

In my case, I had to add the following policy to the IAM role to allow EC2 tags to be read by the EC2 instances. That eliminated the Unable to locate credentials error:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "ec2:DescribeTags",
            "Resource": "*"
        }
    ]
}