A similar problem was discussed here: Error Argument list too long; the answer there should solve your problem too.
You should look into xargs. It runs a command repeatedly, passing as many arguments in each invocation as the system allows.
The solution basically boils down to this:
On Linux:
ls dir1 | xargs -I {} cp dir1/{} dir2/
On OS X (BSD xargs uses -J to mark where the whole argument list is inserted, so run it from inside dir1):
cd dir1 && ls | xargs -J {} cp {} ../dir2/
This lists all of the files in dir1 and pipes them to xargs, which then invokes cp on them. Note that -I runs cp once per file, while -J passes the file names in batches. Be aware that xargs splits its input on whitespace by default, so this breaks on file names containing spaces or newlines. (I tested this locally and it worked.)
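To see how xargs batches its input, here is a small stand-alone demo using echo in place of cp. The -n 2 flag artificially caps each invocation at two arguments; without it, xargs fits as many as the system's limit allows:

```shell
# Three names on stdin; xargs groups them into echo invocations,
# two arguments per invocation because of -n 2.
printf '%s\n' one two three | xargs -n 2 echo
# Prints:
#   one two
#   three
```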
If you are curious as to why there is a limit, David posted a link under your question (link here for UNIX cp limit).
You can find the limit on your system with:
getconf ARG_MAX
If the combined length of the file names in your directory exceeds ARG_MAX, the expanded command line is too long and you get exactly the error you are seeing.
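As a quick sanity check, you can compare the byte count of an expanded glob against that limit. This is a rough sketch: it assumes a directory named dir1 and counts only the argument strings, not the environment or the pointer arrays:

```shell
limit=$(getconf ARG_MAX)
# Bytes the expanded dir1/* would occupy as NUL-terminated strings
used=$(printf '%s\0' dir1/* | wc -c | tr -d ' ')
if [ "$used" -ge "$limit" ]; then
  echo "dir1/* would exceed ARG_MAX"
else
  echo "dir1/* fits ($used of $limit bytes)"
fi
```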
The link that David mentioned above explains this in great detail and is well worth reading. In summary, the traditional Unix limit (as reported by getconf ARG_MAX) applies, more or less, to the cumulative size of:
- The length of the argument strings (each including its terminating '\0')
- The length of the array of pointers to those strings, typically 8 bytes per argument on a 64-bit system
- The length of the environment strings (each including its terminating '\0'); by convention an environment string has the form var=value
- The length of the array of pointers to those strings, typically 8 bytes per environment variable on a 64-bit system
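Putting those four components together, here is a rough, illustrative tally for a command like cp dir1/* dir2/. It assumes a 64-bit system with 8-byte pointers, and it lets the newline that env prints after each var=value stand in for that string's '\0':

```shell
args_bytes=0 nargs=0
for f in dir1/*; do
  args_bytes=$((args_bytes + ${#f} + 1))  # argument string + its '\0'
  nargs=$((nargs + 1))
done
env_bytes=$(env | wc -c | tr -d ' ')      # var=value strings, +1 each for the trailing newline
nenv=$(env | wc -l | tr -d ' ')
total=$((args_bytes + 8 * nargs + env_bytes + 8 * nenv))
echo "roughly $total bytes needed; ARG_MAX is $(getconf ARG_MAX)"
```

If that total approaches ARG_MAX, a plain cp dir1/* dir2/ will fail, which is exactly why piping the names through xargs in batches works.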