Boto3 provides an upload process for a single csv file. Does anyone know how to loop through a local directory and upload multiple csv files to an AWS S3 bucket?
import boto3
s3 = boto3.resource('s3', aws_access_key_id='xx', aws_secret_access_key='yyy')
BUCKET = "bucketname"
s3.Bucket(BUCKET).upload_file("C:/Users/vRazer/Desktop/Projects/Folder/filename.csv", "filename.csv")
I apologize, but I'm new to coding. I tried the following to gather the csv files into a list, but I can't find documentation on how to write the for loop that uploads each csv file from that list to the AWS S3 bucket:
import os
import glob
import boto3
s3 = boto3.resource('s3', aws_access_key_id='xx', aws_secret_access_key='yyy')
BUCKET = "bucketname"
path = 'C:/Users/vRazer/Desktop/Projects/Folder'
extension = 'csv'
os.chdir(path)
result = glob.glob('*.{}'.format(extension))
list = [result]
for i in list:
    try:
        file = s3.Bucket(BUCKET).upload_file(list, '*.{}'.format(extension))
File "", line 16 ^ SyntaxError: unexpected EOF while parsing –