
I have a directory in Linux that has several hundred thousand files and is about 100 GB. I attempted to clear out the directory using `rm -f *` and got the following error:

>rm -f *
-ksh: rm: /bin/rm: cannot execute [Argument list too long]

I get the same error when I try the find command. I can delete individual files, and groups if I can build a small enough expression, but that could take days to clear them all out. Does anyone know a better way to empty a large directory?

mount986
  • Do the file names have any pattern? If so, you could write a script to iterate over them; you can delete them in groups even if they don't. – Rodolfo Jul 26 '15 at 01:46
  • Actually, it's usually better to delete the entire directory (`rm -rf dir`) and recreate it than to empty it. – Roman Jul 26 '15 at 01:47
  • It is a duplicate; I did not find that question when searching before I posted. I will try the `xargs` suggestion from that thread. – mount986 Jul 26 '15 at 02:04
  • Related: [Does "argument list too long" apply to shell builtins?](https://stackoverflow.com/questions/47443380/does-argument-list-too-long-restriction-apply-to-shell-builtins) – codeforester Nov 23 '17 at 00:47
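The `xargs` suggestion from the comments can be sketched as follows. The error happens because the shell expands `*` into one enormous argument list that exceeds the kernel's `ARG_MAX` limit; piping `find` output into `xargs` batches the file names into many small `rm` invocations instead. The path `/tmp/bigdir` below is hypothetical, used only to simulate the situation safely:

```shell
# Hypothetical directory standing in for the real one
mkdir -p /tmp/bigdir
touch /tmp/bigdir/file_{1..500}   # simulate a directory with many files

# find prints one NUL-terminated name per file; xargs -0 batches them
# into rm calls small enough to stay under ARG_MAX.
find /tmp/bigdir -maxdepth 1 -type f -print0 | xargs -0 rm -f

# GNU find can also delete matches itself, building no argument list at all:
# find /tmp/bigdir -maxdepth 1 -type f -delete
```

Either variant leaves the directory itself in place and empty.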

1 Answer


Try the following:

rm -R -f [your_directory_path]

Then manually recreate your directory; I believe this is easier than what you are trying to do:

mkdir [old_directory_name]
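A minimal sketch of this approach, using a hypothetical path `/tmp/olddir`. It works because `rm -R -f` on the directory name never asks the shell to expand a glob, so the "Argument list too long" limit is never reached:

```shell
# Hypothetical directory standing in for the real one
mkdir -p /tmp/olddir
touch /tmp/olddir/f_{1..300}   # simulate many files

rm -R -f /tmp/olddir   # no glob expansion, so ARG_MAX is not hit
mkdir /tmp/olddir      # recreate the now-empty directory
```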
ifma
  • I did consider this; the problem is I only have permissions on the files inside the directory. The actual directory itself is read-only for me. Obviously I could get an infrastructure admin to do it, but not many of those are available for a dev issue on a Saturday evening :) – mount986 Jul 26 '15 at 01:59
  • Put a star at the end of the directory name, e.g. `rm -rf dir/*` – AZ_ Feb 22 '19 at 08:13
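Note that the trailing-star variant is still a shell glob, so with hundreds of thousands of files it expands into the same oversized argument list as `rm -f *`. A sketch of an alternative that empties the directory without removing it (and without any glob), using a hypothetical path:

```shell
# Hypothetical directory standing in for the real one
mkdir -p /tmp/bigdir2
touch /tmp/bigdir2/f_{1..400}   # simulate many files

# GNU find deletes each entry as it walks, so no argument list is built;
# -mindepth 1 keeps the directory itself, matching the read-only constraint above.
find /tmp/bigdir2 -mindepth 1 -delete
```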