I am trying to recursively delete a folder in HDFS with something like fs.delete(path, true).
However, the folder I am trying to delete contains a very large number of files. Is there a way to make the deletion faster?
My assumption was that with recursive = true the folder would be removed in bulk rather than by iterating over each file, but that does not seem to be the case: I can see the files being deleted one by one.
I am using Scala on Spark (EMR), and the files I am actually trying to delete are in S3. Please let me know your suggestions.
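For context, this is roughly what the code looks like; the bucket and prefix in the path are placeholders, and spark is the usual SparkSession created by the job:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession

// SparkSession as provided by the EMR job / spark-shell.
val spark = SparkSession.builder().getOrCreate()

// Placeholder S3 location; the real bucket and prefix are different.
val dirToDelete = new Path("s3://my-bucket/some/prefix")

// Hadoop configuration that EMR sets up for S3 access.
val conf: Configuration = spark.sparkContext.hadoopConfiguration

// Resolve the FileSystem backing the S3 URI.
val fs: FileSystem = FileSystem.get(dirToDelete.toUri, conf)

// Recursive delete; despite recursive = true, the objects seem to go one by one.
fs.delete(dirToDelete, true)
```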