I have 3 buckets: 1. commonfolder 2. jsonfolder 3. csvfolder
commonfolder contains both JSON and CSV files.
I need to copy all CSV files to csvfolder, and all JSON files to jsonfolder.
The code below lists all the files in commonfolder.
How do I copy them after that?
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # List all the bucket names
    response = s3.list_buckets()
    for bucket in response['Buckets']:
        print(bucket)
        print(f'{bucket["Name"]}')
        # Get the files of a particular bucket
        if bucket["Name"] == 'tests3json':
            resp = s3.list_objects_v2(Bucket='commonfolder')
            for obj in resp['Contents']:
                filename = obj['Key']
                print(filename)
                # Use the last dot so keys like 'data.v1.json' still work
                if filename.split('.')[-1].lower() == 'json':
                    copyjson(bucket, filename)
                    # copyjson(jsonfolder, filename)
                elif filename.split('.')[-1].lower() == 'csv':
                    copycsv(bucket, filename)
                    # copycsv(csvfolder, filename)
I need to create new functions, copyjson and copycsv, to do this job: copy from commonfolder to either csvfolder or jsonfolder depending on the file extension.