Script for daily upload of files to S3 using a specific aws-cli profile: after checking that the correct user profile is in use, upload backup files to S3
- What is your local OS? – Ninad Gaikwad Jul 23 '19 at 10:25
- I am using Ubuntu 16.04 – Dani Jul 23 '19 at 10:30
- The awscli `cp` command requires the ListObjects permission, especially if you want to copy multiple files; the `sync` command also needs ListObjects. If you cannot change the permissions, it would be best to write your own logic using one of the AWS SDK languages. – Ninad Gaikwad Jul 23 '19 at 10:34
- I need a sample (bash, Python, etc.) showing how to write this; then I can make my own script. – Dani Jul 23 '19 at 10:37
- OK, Python it is. I'll give you an example in Python. – Ninad Gaikwad Jul 23 '19 at 10:38
- Python, Node.js – Dani Jul 23 '19 at 10:40
1 Answer
import boto3
import os

s3 = boto3.resource('s3')

def upload_to_s3(filepath, bucketname, prefix):
    # Upload every file in the local directory to the bucket under the given prefix
    for filename in os.listdir(filepath):
        local_path = os.path.join(filepath, filename)
        if os.path.isfile(local_path):  # skip subdirectories
            s3.meta.client.upload_file(local_path, bucketname, prefix + filename)

if __name__ == '__main__':
    local_file_path = ''   # your local backup directory
    bucket_name = ''       # target S3 bucket
    prefix = ''            # key prefix ("folder") inside the bucket
    upload_to_s3(local_file_path, bucket_name, prefix)
You can use something like this to upload all the files inside your local backup directory to a bucket, under a prefix of your choice.
Since you are on Ubuntu, you can use a cron job to schedule this script to run daily or on weekdays; an example entry is sketched below. You can see a simple tutorial for this here.
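For instance, a crontab entry along these lines would run the script every day at 02:00; the interpreter path, script path, and log file are placeholders you would adapt to your own setup:

0 2 * * * /usr/bin/python3 /home/ubuntu/upload_backups.py >> /home/ubuntu/s3_backup.log 2>&1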

Ninad Gaikwad
- https://stackoverflow.com/questions/33378422/how-to-choose-an-aws-profile-when-using-boto3-to-connect-to-cloudfront/46041248 You can use this to create a session and specify the user. – Ninad Gaikwad Jul 23 '19 at 13:02
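A minimal sketch of how that could be combined with the script above, assuming a profile named backup-user exists in ~/.aws/credentials (the profile name is a placeholder); the session's resource would replace the module-level boto3.resource('s3') in the answer:

import boto3

# Build a session bound to a specific named profile from ~/.aws/credentials
session = boto3.Session(profile_name='backup-user')  # placeholder profile name

# Optional check that the profile resolves to the expected identity before uploading
print('Uploading as:', session.client('sts').get_caller_identity()['Arn'])

# Use this in place of boto3.resource('s3') in the upload script
s3 = session.resource('s3')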