The following worked for me.
# read_s3.py
import json
from boto3 import client

BUCKET = 'MY_S3_BUCKET_NAME'
FILE_TO_READ = 'FOLDER_NAME/my_file.json'

s3_client = client(
    's3',
    aws_access_key_id='MY_AWS_KEY_ID',
    aws_secret_access_key='MY_AWS_SECRET_ACCESS_KEY'
)
result = s3_client.get_object(Bucket=BUCKET, Key=FILE_TO_READ)
data = json.loads(result["Body"].read().decode())  # parse the JSON body into a dict
print(data['Details'])  # use your desired JSON key for your value
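One subtlety worth noting: `result["Body"].read()` returns bytes, and `.decode()` only turns them into a `str`, so you need `json.loads` before you can index by key. A quick local illustration (no S3 involved, using a made-up payload):

```python
import json

# simulate the raw bytes that result["Body"].read() would return from S3
body = b'{"Details": {"name": "example"}}'

text = body.decode()     # a plain string; text['Details'] would raise TypeError
data = json.loads(text)  # now a dict, so key access works
print(data['Details'])   # -> {'name': 'example'}
```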
Further Improvement
Let's call the above code snippet read_s3.py.
It is not a good idea to hard-code the AWS key ID and secret access key. As a best practice, consider either of the following:
(1) Read your AWS credentials from a JSON file (aws_cred.json) stored in your local storage:
from json import load
from boto3 import client
...
with open('local_fold/aws_cred.json') as cred_file:
    credentials = load(cred_file)  # 'with' closes the file instead of leaking the handle

s3_client = client(
    's3',
    aws_access_key_id=credentials['MY_AWS_KEY_ID'],
    aws_secret_access_key=credentials['MY_AWS_SECRET_ACCESS_KEY']
)
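For this to work, aws_cred.json needs to contain the same key names the code looks up — for example (placeholder values, not real credentials):

```json
{
    "MY_AWS_KEY_ID": "YOUR_AWS_ACCESS_KEY_ID",
    "MY_AWS_SECRET_ACCESS_KEY": "YOUR_AWS_SECRET_ACCESS_KEY"
}
```

Remember to keep this file out of version control (e.g. add it to .gitignore).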
(2) Read from your environment variables (my preferred option for deployment):
from os import environ
from boto3 import client

s3_client = client(
    's3',
    aws_access_key_id=environ['MY_AWS_KEY_ID'],
    aws_secret_access_key=environ['MY_AWS_SECRET_ACCESS_KEY']
)
Let's prepare a shell script called read_s3_using_env.sh that sets the environment variables and then runs our Python script (read_s3.py):
# read_s3_using_env.sh
export MY_AWS_KEY_ID='YOUR_AWS_ACCESS_KEY_ID'
export MY_AWS_SECRET_ACCESS_KEY='YOUR_AWS_SECRET_ACCESS_KEY'

# run the Python script that reads from S3 using the variables exported above
python read_s3.py
Now execute the shell script in a terminal as follows:
sh read_s3_using_env.sh