I have one directory that contains thousands of files. I need to copy all the files from that directory to another directory.

I used:

cp -r dir1/* dir2/

but I am getting an "Argument list too long" error, although the same command works for directories with fewer files.

– SSV

2 Answers


There was a similar problem here: Error Argument too long; that answer should solve your problem.

You should look into xargs. This will run a command several times, with as many arguments as can be passed in one go.

The solution basically boils down to this:

On Linux:

ls dir1 | xargs -I {} cp dir1/{} dir2/

On OS X (BSD xargs uses -J; run it from inside dir1 so the bare filenames printed by ls resolve correctly):

cd dir1 && ls | xargs -J {} cp {} ../dir2/

This lists all of the files in dir1 and then uses xargs to capture the lines output by ls; since ls prints bare filenames, they are prefixed with dir1/ so that cp can find them, and each file is copied over. (Tested this locally successfully.)
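As one of the comments below also notes, parsing ls output breaks on filenames containing whitespace or newlines. A more robust sketch (not part of the original answer) uses find with NUL-separated output instead:

find dir1 -maxdepth 1 -type f -print0 | xargs -0 -I {} cp {} dir2/

With GNU coreutils, cp -t lets xargs batch many files per cp invocation instead of running one cp per file:

find dir1 -maxdepth 1 -type f -print0 | xargs -0 cp -t dir2/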

If you are curious as to why the limit exists, David posted a link under your question (link here for the UNIX cp limit).

You can find the limit on your system with:

getconf ARG_MAX

If the expanded argument list for the files in your directory exceeds the value of ARG_MAX, the error that you saw is generated.
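As a rough check (a sketch that counts only the string bytes, not the pointer and environment overhead described below), you can estimate the size of the glob expansion and compare it against the limit:

total=0
for f in dir1/*; do
    # each argument contributes its length plus one terminating '\0' byte
    total=$((total + ${#f} + 1))
done
echo "glob expansion: ~$total bytes; ARG_MAX: $(getconf ARG_MAX)"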

The link that David mentioned above explains this in great detail and is well worth reading. The summary is that traditionally Unix's limit (as reported by getconf ARG_MAX) was more or less on the cumulative size of:

  • The length of the argument strings (including the terminating '\0')
  • The length of the array of pointers to those strings, so typically 8 bytes per argument on a 64-bit system
  • The length of the environment strings (including the terminating '\0'), an environment string being by convention something like var=value.
  • The length of the array of pointers to those strings, so typically 8 bytes per environment string on a 64-bit system
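For a rough worked example with hypothetical numbers: 10,000 files whose expanded paths average 30 characters contribute 10,000 × (30 + 1) = 310,000 bytes of strings plus 10,000 × 8 = 80,000 bytes of pointers, roughly 390 KB before the environment is even counted. That fits under the 2 MB ARG_MAX typical of modern Linux, but blows past the smaller limits of many older systems, which is why the same glob can work on one machine and fail on another.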
– Caleb Adams

If it is not necessary that dir2 already exist, a simple solution could be:

cp -r dir1 dir2

... but make sure that dir2 does not exist before you execute the command: if dir2 already exists, cp will copy dir1 itself into it, leaving the files in dir2/dir1.
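A quick illustration of the pitfall (hypothetical directory names):

# dir2 does not exist: dir2 is created as a copy, files land directly in it
cp -r dir1 dir2
ls dir2    # the files from dir1

# dir2 already exists: dir1 itself is copied into it
mkdir dir2
cp -r dir1 dir2
ls dir2    # prints: dir1  (the files are now in dir2/dir1)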

– st0ne
  • The problem is that the expanded list of files in his directory exceeds `ARG_MAX`, which causes the error, not how he is calling `cp`. – David C. Rankin Feb 12 '16 at 08:48
  • You are correct ... that's the problem. And I just gave a possible solution by avoiding globbing. It makes a difference whether you use * or not. ;) – st0ne Feb 12 '16 at 09:11
  • `find /path/to/files -type f -print0` used with `xargs` is generally a good solution, or a `for file in dir1/*; do cp "$file" dir2; done` loop, or a `while read` with *process substitution* can work (see the sketch after these comments). – David C. Rankin Feb 12 '16 at 09:45
  • @DavidC.Rankin: yes, another solution ... maybe you could drop an answer with that. – st0ne Feb 12 '16 at 09:58
  • "It makes a difference whether you use * or not" -- indeed it does. The OP wants the *files* in `dir2`, your solution gives him `dir1` (with the files) in `dir2`. If he tries to move the files up from the subdir to `dir2` directly, he's facing the same problem all over again. ;-) – DevSolar Feb 12 '16 at 10:38
  • You should stress the last sentence. It's very important in terms of whether you lose your files if `dir2` already existed... – Sebastialonso Oct 24 '19 at 22:05
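The shell loop David C. Rankin mentions sidesteps ARG_MAX entirely: the glob is expanded inside the shell, and cp is invoked once per file, so no single command line ever gets long. A minimal sketch:

for file in dir1/*; do
    cp "$file" dir2/    # one small cp invocation per file; never hits ARG_MAX
done

It is slower than the batched xargs approaches above because of the per-file process spawn, but it handles any number of files and any filename.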