I need to delete all files within a folder in Amazon S3 using PySpark. This is the code:


import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('sales')
boto3_prefix = "build/subdomains/sim/users/ongoing"

for objects in bucket.objects.filter(Prefix=boto3_prefix):
    if objects.key != boto3_prefix:
        del_response = bucket.objects.filter(Prefix=objects.key).delete()

But it deletes the folder 'ongoing' along with all the files inside.
Is there a way to delete only the files within the 'ongoing' folder
and not the folder itself?

user2280352

1 Answer


I recommend looking through existing answers along the same lines, like this one, which recommends using delete_objects:

import boto3
from boto3.session import Session

session = Session(aws_access_key_id='your_key_id',
                  aws_secret_access_key='your_secret_key')

# s3_client = session.client('s3')
s3_resource = session.resource('s3')
my_bucket = s3_resource.Bucket("your_bucket_name")

response = my_bucket.delete_objects(
    Delete={
        'Objects': [
            {
                'Key': "your_file_name_key"   # the_name of_your_file
            }
        ]
    }
)
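To address the original question directly: the "folder" you see in the S3 console is usually just a zero-byte placeholder object whose key is the prefix with a trailing slash (e.g. ".../ongoing/"). A sketch along these lines keeps that placeholder and deletes everything else under the prefix, batching keys into delete_objects calls (the bucket name and prefix below are placeholders, and the helper name is my own):

```python
def keys_to_delete(keys, prefix):
    """Return all keys under `prefix` except the zero-byte
    'folder' placeholder key (the prefix ending in '/')."""
    folder_key = prefix if prefix.endswith("/") else prefix + "/"
    return [k for k in keys if k != folder_key]

def delete_folder_contents(bucket_name, prefix):
    # Imported here so the pure helper above can be used without boto3.
    import boto3

    bucket = boto3.resource("s3").Bucket(bucket_name)
    keys = [obj.key for obj in bucket.objects.filter(Prefix=prefix)]
    targets = keys_to_delete(keys, prefix)

    # delete_objects accepts at most 1000 keys per request, so batch.
    for i in range(0, len(targets), 1000):
        bucket.delete_objects(
            Delete={"Objects": [{"Key": k} for k in targets[i:i + 1000]]}
        )
```

Note that if the placeholder object does not exist (the "folder" only appears because objects share the prefix), there is nothing to preserve: deleting the last object under the prefix makes the folder disappear from the console regardless.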