
I have an S3 bucket whose folder structure looks like this: `bucket_name/year/month/day`. For each day, there are a couple of files in the corresponding folder.

For example, I have 10 files in `bucket_name/2021/09/05` and `bucket_name/2021/09/04`, and 10 files in `bucket_name/2021/09/06` (the current day is 2021/09/06). How can I tell the code to look for files in the historical folders (everything except `bucket_name/2021/09/06`) and delete them?

How can I add a step to look for the previous folders? Hope this makes sense, thanks.
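One way to do this in code is a sketch along the following lines, assuming boto3 and a date-based key layout like the one described above. The bucket name, the `%Y/%m/%d` prefix format, and the helper names are assumptions for illustration, not a confirmed implementation:

```python
from datetime import date


def historical_keys(keys, today_prefix):
    """Return the keys that do NOT live under today's prefix."""
    return [k for k in keys if not k.startswith(today_prefix)]


def delete_historical(bucket_name, today=None):
    """List every object in the bucket and delete those outside today's folder.

    Hypothetical sketch: assumes keys look like "2021/09/06/file.csv".
    """
    import boto3  # imported here so the pure helper above is testable without the SDK

    today = today or date.today()
    today_prefix = today.strftime("%Y/%m/%d")  # e.g. "2021/09/06"

    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")

    to_delete = []
    for page in paginator.paginate(Bucket=bucket_name):
        keys = [obj["Key"] for obj in page.get("Contents", [])]
        to_delete.extend(historical_keys(keys, today_prefix))

    # delete_objects accepts at most 1000 keys per request
    for i in range(0, len(to_delete), 1000):
        s3.delete_objects(
            Bucket=bucket_name,
            Delete={"Objects": [{"Key": k} for k in to_delete[i:i + 1000]]},
        )
```

Splitting the prefix filtering into `historical_keys` keeps the date logic separate from the AWS calls, which makes it easy to verify before pointing it at a real bucket.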

wawawa
    Any reason you want to do this via code, instead of creating a lifecycle rule in the bucket that would delete old files for you automatically? – Mark B Sep 06 '21 at 16:51
  • If code is a necessity for some reason, you can use `os.system('')` and pass the folder path with `--recursive`; otherwise @MarkB's comment should be the ideal way – neilharia7 Sep 06 '21 at 18:58
  • Hi @MarkB It seems like I can apply a lifecycle rule on an S3 prefix path. For my case, I only want to keep the files that are uploaded to the folder for the current day. If I set the lifecycle to `expire current version actions` and `permanently delete previous version actions` with `1 day`, will S3 delete files from 2 or 3 days ago, or only the files from one day ago? – wawawa Sep 07 '21 at 10:08
  • If you set that lifecycle rule, it will delete all files older than one day. – Mark B Sep 07 '21 at 12:50
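The lifecycle rule Mark B describes can also be applied programmatically. A minimal sketch, assuming boto3 and a bucket-wide rule that expires objects one day after creation (rule ID and bucket name are made up for illustration):

```python
# Hypothetical lifecycle rule: expire every object one day after creation,
# so only "today's" uploads survive. S3 evaluates expirations once a day,
# rounded to midnight UTC, so deletion is not instantaneous.
lifecycle_rule = {
    "ID": "expire-after-one-day",
    "Status": "Enabled",
    "Filter": {"Prefix": ""},   # empty prefix applies the rule to the whole bucket
    "Expiration": {"Days": 1},  # delete current versions one day after creation
    # Only relevant when bucket versioning is enabled:
    "NoncurrentVersionExpiration": {"NoncurrentDays": 1},
}


def apply_lifecycle(bucket_name):
    """Attach the rule above to the bucket (requires s3:PutLifecycleConfiguration)."""
    import boto3  # imported here so the rule dict can be inspected without the SDK

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket_name,
        LifecycleConfiguration={"Rules": [lifecycle_rule]},
    )
```

As the comments note, this matches the question's intent (anything older than one day is removed) without any scheduled code of your own.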

0 Answers