
In my case, I want to read all XML files from a folder in my S3 bucket, parse them, and then move the parsed files to another folder in the same bucket.

My parsing logic is working fine, but I am not able to move the files. This is the example I am trying to use:

```python
import boto3

s3 = boto3.resource('s3')
src_bucket = s3.Bucket('bucket1')
dest_bucket = s3.Bucket('bucket2')

for obj in src_bucket.objects.all():
    # Keep only the file name, dropping any folder prefix
    filename = obj.key.split('/')[-1]
    # Write a copy of the object into the destination bucket under sample/
    dest_bucket.put_object(Key='sample/' + filename, Body=obj.get()["Body"].read())
```

The above code is not working for me at all (I have given the S3 folder full access, and for testing I granted public full access as well).

Thanks

user3215858
  • If it's only a one-time thing you could use the AWS CLI instead of Lambda, via the 'aws s3 sync' command – Priebe Feb 14 '20 at 06:12
  • No, my whole parsing logic is inside Lambda. It triggers once an XML file is updated in the IN folder, parses it, and moves the parsed file to the OUT folder. This is not a one-time thing; it will run every time, as a bulk operation based on the trigger event (a minimal handler sketch follows these comments). – user3215858 Feb 15 '20 at 01:16
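
For reference, here is a minimal sketch of the event-driven flow described in that comment: a Lambda handler fired by an S3 event that skips non-XML keys and moves each object from an IN/ prefix to an OUT/ prefix by copying and then deleting. The IN/ and OUT/ prefix names and the placeholder parsing step are assumptions based on the comment, not taken from the question's code:

```python
import urllib.parse

import boto3

s3 = boto3.resource('s3')

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        # Keys in S3 event notifications arrive URL-encoded
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])

        if not key.endswith('.xml'):
            continue

        # ... run the XML parsing logic here ...

        # S3 has no native move: copy to OUT/, then delete the original in IN/
        # ('IN/' and 'OUT/' are assumed prefix names)
        dest_key = key.replace('IN/', 'OUT/', 1)
        s3.Object(bucket, dest_key).copy_from(
            CopySource={'Bucket': bucket, 'Key': key})
        s3.Object(bucket, key).delete()
```

For this to run, the Lambda execution role needs s3:GetObject, s3:PutObject, and s3:DeleteObject permissions on the relevant prefixes; granting public access is not required.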

1 Answer


Check out this answer. You could use Python's endswith() function and pass ".xml" to it to get a list of those files, then copy them to the destination bucket and delete them from the source bucket.
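
As a rough illustration of that approach (the bucket names here are hypothetical), the loop below lists the source bucket, keeps only keys ending in ".xml", copies each one to the destination bucket, and then deletes the original:

```python
import boto3

s3 = boto3.resource('s3')
src_bucket = s3.Bucket('bucket1')
dest_bucket = s3.Bucket('bucket2')

for obj in src_bucket.objects.all():
    # Skip anything that is not an XML file
    if not obj.key.endswith('.xml'):
        continue
    # Server-side copy into the destination bucket under the same key
    dest_bucket.copy({'Bucket': src_bucket.name, 'Key': obj.key}, obj.key)
    # Delete the source object to complete the "move"
    obj.delete()
```

Using Bucket.copy performs a server-side copy, so the object bytes never pass through the Lambda function, which matters for large files.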

ms12