125

I have a long text file with a list of file masks I want to delete.

Example:

/tmp/aaa.jpg
/var/www1/*
/var/www/qwerty.php

I need to delete them. I tried rm `cat 1.txt` and it says the argument list is too long.

I found this command, but when I check the folders from the list, some of them still contain files:

xargs rm < 1.txt

A manual rm call removes files from such folders, so it is not a permissions issue.
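A minimal demonstration of what is probably happening (the paths here are made up): xargs performs no glob expansion itself, so a line like /var/www1/* is handed to rm as a literal file name, and the files it was meant to match survive:

```shell
# Made-up demo paths. xargs passes '/tmp/globdemo/*' to rm literally;
# no shell ever expands the pattern, so a and b are untouched.
mkdir -p /tmp/globdemo
touch /tmp/globdemo/a /tmp/globdemo/b
echo '/tmp/globdemo/*' > /tmp/list.txt
xargs rm -f < /tmp/list.txt
ls /tmp/globdemo
```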

Alexander
Even though it's six years later: Would you mind accepting one of the answers? This'll mark the question as resolved and help other users as well. – MERose May 14 '17 at 11:06

13 Answers

150

This is not very efficient, but it will work if you need glob patterns (as in /var/www/*):

for f in $(cat 1.txt) ; do 
  rm "$f"
done

If you don't have any patterns and are sure the paths in the file contain no whitespace or other weird characters, you can use xargs like so:

xargs rm < 1.txt
Felix
nos
77

Assuming that the list of files is in the file 1.txt, then do:

xargs rm -r <1.txt

The -r option causes recursion into any directories named in 1.txt.

If any files are read-only, use the -f option to force the deletion:

xargs rm -rf <1.txt

Be cautious with input to any tool that does programmatic deletions. Make certain that the files named in the input file are really to be deleted. Be especially careful about seemingly simple typos. For example, if you enter a space between a file and its suffix, it will appear to be two separate file names:

file .txt

is actually two separate files: file and .txt.

This may not seem so dangerous, but if the typo is something like this:

myoldfiles *

Then instead of deleting all files that begin with myoldfiles, you'll end up deleting myoldfiles and all non-dot-files and directories in the current directory. Probably not what you wanted.
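A simple safeguard against such typos (a suggestion, not part of the original answer): prepend echo so the full command is printed for review instead of executed:

```shell
# Made-up list file; 'echo' turns the rm into a harmless dry run
printf '%s\n' /tmp/demo_a.txt /tmp/demo_b.txt > /tmp/1.txt

# Prints the command that WOULD run, e.g. "rm -r /tmp/demo_a.txt ..."
xargs echo rm -r < /tmp/1.txt
```

Once the printed command looks right, drop the echo and run it for real.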

matanster
aks
    "myoldfiles /" or "/ tmp" even more disastrous with -rf. Note the space character next to the "/". – Ray Jun 01 '16 at 12:10
    I think this is a dangerous answer, why use recurse if it's a list of *files*, coupled with forced deletion makes this a footgun – CervEd Apr 08 '21 at 18:51
  • @CervEd Because sometimes a man needs to shoot himself in the foot to get the job done. – Joshua Jurgensmeier Aug 16 '23 at 17:09
31

Use this:

while IFS= read -r file ; do rm -- "$file" ; done < delete.list

If you need glob expansion you can omit quoting $file:

IFS=""
while read -r file ; do rm -- $file ; done < delete.list

But be warned that file names can contain "problematic" content, so I would avoid the unquoted version unless you really need glob expansion. Imagine this pattern in the file:

*
*/*
*/*/*

This would delete quite a lot from the current directory! I would encourage you to prepare the delete list in a way that glob patterns aren't required anymore, and then use quoting like in my first example.
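One way to do that preparation (a sketch only, with made-up paths): expand each glob line into literal file names first, then feed the result to the fully quoted loop:

```shell
# Setup: one real glob pattern in the list (made-up paths)
mkdir -p /tmp/prep_demo
touch /tmp/prep_demo/a.log /tmp/prep_demo/b.log /tmp/prep_demo/keep.txt
printf '%s\n' '/tmp/prep_demo/*.log' > delete.list

# Expand glob lines into literal names first. Note: like the answer's
# unquoted loop, $pattern is deliberately unquoted here so the shell
# expands it; unmatched patterns are skipped by the -e test.
: > expanded.list
while IFS= read -r pattern ; do
  for f in $pattern ; do
    [ -e "$f" ] && printf '%s\n' "$f" >> expanded.list
  done
done < delete.list

# Now the safe, fully quoted loop needs no globbing at all
while IFS= read -r file ; do rm -- "$file" ; done < expanded.list
```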

that other guy
hek2mgl
    This is currently the only answer that handles all of `/My Dir With Spaces/` `/Spaces and globs/*.txt`, `--file-with-leading-dashes`, and as a bonus it's POSIX and works on GNU/Linux, macOS and BSD. – that other guy Jul 06 '18 at 18:13
20

You can use '\n' to make the newline character the delimiter:

xargs -d '\n' rm < 1.txt

Be careful with -rf because it can delete things you don't want if 1.txt contains paths with spaces. That's why the newline delimiter is a bit safer.
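To see the difference (an illustration with a made-up file name, assuming GNU xargs):

```shell
# A made-up path containing a space
mkdir -p /tmp/xargs_demo
touch '/tmp/xargs_demo/my file.txt'
echo '/tmp/xargs_demo/my file.txt' > /tmp/1.txt

# Default splitting breaks the line into '/tmp/xargs_demo/my' and
# 'file.txt' -- neither exists, so the real file survives
xargs rm -f < /tmp/1.txt
ls /tmp/xargs_demo

# With -d '\n' the whole line is one argument and the file is removed
xargs -d '\n' rm -f < /tmp/1.txt
```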

On BSD systems, where xargs has no -d option, you can convert the newlines to NUL characters first and use the -0 option, which reads NUL-delimited input:

tr '\n' '\0' < 1.txt | xargs -0 rm
Ray
    On OS X this gets me `xargs: illegal option -- d` – waldyrious Sep 05 '16 at 15:33
  • Good point. It seems there is no other option than `-0` on OS X. – Ray Sep 07 '16 at 10:05
    `xargs -I_ rm _` also works in OS X :) see http://stackoverflow.com/a/39335402/266309 – waldyrious Sep 07 '16 at 12:13
  • If using BSD xargs (where there's no `-d`), you can first convert newlines to nulls: `tr '\n' '\0' < 1.txt | xargs -0 rm` – OrangeDog Oct 10 '18 at 15:59
    you can also first run `xargs -d '\n' ls < 1.txt` or `xargs -d '\n' stat < 1.txt` to verify what will be deleted and avoid nasty surprises – CervEd Apr 08 '21 at 18:55
  • @CervEd Good idea. I usually do something like this: `xargs -d '\n' echo rm < 1.txt` – Ray Apr 11 '21 at 22:17
    @Ray using stat or find you get errors if the files don't exists, you don't have permission etc which I find helpful to verify the paths are correct etc – CervEd Apr 12 '21 at 07:45
20

xargs -I{} sh -c 'rm "{}"' < 1.txt should do what you want. Be careful with this command as one incorrect entry in that file could cause a lot of trouble.

This answer was edited after @tdavies pointed out that the original did not do shell expansion.

Sebastian Barth
Mark Drago
18

You can use this one-liner:

cat 1.txt | xargs echo rm | sh

This does shell expansion but executes rm the minimum number of times.

tgdavies
  • As long as any expansion isn't too long? – Douglas Leeder Feb 28 '11 at 13:29
    True, a glob could produce an argument list which is too long -- you can add the `-n ` argument to `xargs` to reduce the number of arguments passed to each rm, but that will still not protect you from a single glob which exceeds the limit. – tgdavies Feb 28 '11 at 13:52
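Combining the one-liner with the -n suggestion above might look like this (illustrative only; the batch size of 100 is arbitrary, and the paths are made up):

```shell
# Made-up list and files
printf '%s\n' /tmp/batch_a /tmp/batch_b > /tmp/1.txt
touch /tmp/batch_a /tmp/batch_b

# At most 100 names go into each generated rm command
cat /tmp/1.txt | xargs -n 100 echo rm | sh
```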
6

Just to provide another way, you can also simply use the following command:

$ cat to_remove
/tmp/file1
/tmp/file2
/tmp/file3
$ rm $( cat to_remove )
Quentin
5

cat 1.txt | xargs rm -f — running this removes the listed files only.

cat 1.txt | xargs rm -rf — running this also removes directories, recursively.

Uzayr
4

In this particular case, due to the dangers cited in other answers, I would

  1. Edit in e.g. Vim and :%s/\s/\\\0/g, escaping all space characters with a backslash.

  2. Then :%s/^/rm -rf /, prepending the command. With -r you don't have to worry about the order in which directories and the files inside them are listed, and with -f it won't complain about missing files or duplicate entries.

  3. Run all the commands: $ source 1.txt
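The same two substitutions can be expressed with sed for a non-interactive run (my translation of the Vim commands, using made-up paths):

```shell
# Made-up list with a space in one name
printf '%s\n' '/tmp/sed_demo/my file.jpg' '/tmp/sed_demo/q.php' > 1.txt

# Step 1: escape each whitespace character with a backslash
# Step 2: prepend 'rm -rf ' to every line
sed -e 's/[[:space:]]/\\&/g' -e 's/^/rm -rf /' 1.txt > delete.sh

cat delete.sh   # inspect before running: sh delete.sh
```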

Evgeni Sergeev
  • Spaces are not the only characters which need escaping, popular in filenames are also brackets of any kind which can lead to unwanted expansion. – emk2203 Jan 01 '19 at 13:45
1

Here's another looping example. This one also contains an 'if-statement' as an example of checking to see if the entry is a 'file' (or a 'directory' for example):

for f in $(cat 1.txt); do if [ -f "$f" ]; then rm "$f"; fi; done
Roan
0

Here you can delete the set of files from deletelist.txt while excluding some patterns as well (csh syntax; note the anchored regular expressions):

foreach f (`grep -Ev 'needthisfile|\.cpp$|\.h$' deletelist.txt`)
    rm -rf $f
end
Chand Priyankara
0

This will allow file names to have spaces (reproducible example).

# Select files of interest, here, only text files for ex.
find . -type f -exec file {} \; > findresult.txt
grep ": ASCII text$" findresult.txt > textfiles.txt
# leave only the path to the file removing suffix and prefix
sed -i -e 's/:.*$//' textfiles.txt
sed -i -e 's/\.\///' textfiles.txt

#write a script that deletes the files in textfiles.txt
IFS_backup=$IFS
IFS='
'   # split on newlines only, so names with spaces survive
for f in $(cat textfiles.txt); 
do 
rm "$f"; 
done
IFS=$IFS_backup

# save script as "some.sh" and run: sh some.sh
Ferroao
0

In case somebody prefers sed and removing without wildcard expansion:

sed -e "s/^\(.*\)$/rm -f -- \'\1\'/" deletelist.txt | /bin/sh

Reminder: use absolute pathnames in the file or make sure you are in the right directory.

And for completeness the same with awk:

awk '{printf "rm -f -- '\''%s'\''\n",$1}' deletelist.txt | /bin/sh

Wildcard expansion will work if the single quotes are removed, but this is dangerous if a filename contains spaces; you would then need to add quotes around everything except the wildcards.

Marco