107

I was hoping:

cp -R src/prog.js images/icon.jpg /tmp/package

would yield a symmetrical structure in the destination dir:

/tmp
|
+-- package
    |
    +-- src
    |   |
    |   +-- prog.js
    |
    +-- images
        |
        +-- icon.jpg

but instead, both files are copied straight into /tmp/package, a flat copy. (This is on OS X.)

Is there a simple bash function I can use to copy all files, including files specified by wildcard (e.g. src/*.js), into their rightful place within the destination directory? A bit like "for each file, run mkdir -p /tmp/package/$(dirname "$file"); cp "$file" /tmp/package/$(dirname "$file")", but perhaps as a single command.

This is a relevant thread, which suggests it's not possible. The author's solution isn't so useful to me though, because I would like to simply provide a list of files, wildcard or not, and have all of them copied to the destination dir. IIRC MS-DOS xcopy does this, but there seems to be no equivalent for cp.

Jonathan Leffler
mahemoff

6 Answers

162

Have you tried using the --parents option? I don't know if OS X supports that, but that works on Linux.

cp --parents src/prog.js images/icon.jpg /tmp/package

If that doesn't work on OS X, try

rsync -R src/prog.js images/icon.jpg /tmp/package

as Aif suggested.

ustun
  • 4
    thanks. turns out "cp --parents" isn't possible on mac, but it's nice to know the flag for other unixen. rsync -R is the simplest portable solution for this problem. – mahemoff Nov 01 '09 at 01:42
  • 1
    I accepted this one for its elegance/memorability, but just discovered this doesn't copy whole directories (on OSX at least), whereas the tar one below does. – mahemoff Jul 23 '12 at 14:21
  • `cp --parents` is an illegal option on OS X (BSD cp), but `gcp` (GNU cp) works fine. If it is not on your system yet, use `brew install coreutils`; you will get many utils with a g- prefix (see the sketch after these comments). – kyb Oct 17 '18 at 20:11
  • @mahemoff `cp -R --parents` and `rsync -rR` copies both files and directories relatively. – Vortico Jun 05 '19 at 08:43
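
A quick sketch of the macOS route mentioned in kyb's comment above (assuming Homebrew; `gcp` is the g-prefixed GNU cp installed by coreutils, and the destination directory must already exist):

brew install coreutils                               # provides GNU tools with a g- prefix
mkdir -p /tmp/package
gcp --parents src/prog.js images/icon.jpg /tmp/package
# creates /tmp/package/src/prog.js and /tmp/package/images/icon.jpg
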
24

One way:

tar cf - <files> | (cd /dest; tar xf -)
Randy Proctor
  • Oh, I like this much better than my answer. – EMPraptor Oct 30 '09 at 14:55
  • 5
    You can also use the `-C` option to do the chdir for you - `tar cf - _files_ | tar -C /dest xf -` or something like that. – D.Shawley Oct 30 '09 at 15:05
  • thanks, this is concise, though I prefer rsync for simplicity. – mahemoff Nov 01 '09 at 01:43
  • 1
    Great! Does anybody know how to turn this into a shell command? It should accept N inputs: the first N-1 are the files to copy, and the last is the destination folder (there's a sketch after these comments). – arod Nov 04 '12 at 16:21
  • @arod `${!#}` is the last param; use this to get the preceding arguments: http://stackoverflow.com/questions/1215538/bash-extract-parameters-before-last-parameter-in. If you write the command, please link to the gist here. – mahemoff Nov 05 '12 at 13:37
  • I might be four years too late, but perhaps something like this: https://gist.github.com/deadbeef404/7f9c4147170d3e5306b5 – deadbeef404 Jan 15 '16 at 03:16
  • cpio or tar? I think the cpio answer is better, because with find you can describe the pattern you want to include without hitting an "argument list too long" error when your glob produces too many results. Maybe tar can receive a list of inputs to avoid this? – Et7f3XIV Jun 18 '23 at 22:41
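
Responding to the question in the comments, here is one way the tar pipe could be wrapped into a reusable shell function. This is only a sketch, assuming bash and a tar that accepts -C (both GNU and BSD tar do); the function name cpr is made up for illustration:

cpr() {
  # usage: cpr FILE... DEST_DIR
  if [ "$#" -lt 2 ]; then
    echo "usage: cpr FILE... DEST_DIR" >&2
    return 1
  fi
  local dest="${!#}"                            # last argument is the destination
  mkdir -p "$dest" || return 1
  tar cf - "${@:1:$#-1}" | tar -C "$dest" -xf - # all but the last argument are the files
}

For example, `cpr src/prog.js images/icon.jpg /tmp/package` should recreate the src/ and images/ subdirectories under /tmp/package.
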
21

Alternatively, if you're old-school, use cpio:

cd /source;
find . -print | cpio -pvdmB /target

Clearly, you can filter the file list to your heart's content.

The '-p' option is for 'pass-through' mode (as against '-i' for input or '-o' for output). The '-v' is verbose (list the files as they're processed). The '-m' preserves modification times. The '-B' means use 'big blocks' (where big blocks are 5120 bytes instead of 512 bytes); it is possible it has no effect these days.
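
For instance, to copy only the JavaScript files while keeping their relative paths, a filtered variant might look like this (a sketch; `-print0` and `--null` are the GNU-style options that keep unusual filenames from breaking the pipe, as discussed in the comments below):

cd /source
find . -name '*.js' -print0 | cpio -pvdm --null /target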

Jonathan Leffler
  • 3
    It is better to use `-print0` in combination with the `--null` option so that it won't break on special characters and such: `find . -print0 | cpio -pvdmB --null /target` – haridsv Jun 20 '18 at 08:46
  • 1
    Call it old-school, call it portable, call it whatever, but I definitely trust `cpio` for this task. I do agree that the `-print0` and `-null` options should be used, otherwise, at some point, someone will give you some folders with 'special characters' (spaces, most likely) and something will happen. Not that I'm speaking from personal experience, but you might try to back up a bunch of files and only end up with half of them backed up due to spaces being in filenames. (Okay, I'm speaking from personal experience.) – bballdave025 Jan 06 '20 at 16:46
  • @bballdave025: You only run into problems with `cpio` as shown if filenames contain newlines — then you normally end up with an error message about two (or more) filenames not being found for each newline in a filename. (Sometimes, you might get fewer messages — but that requires considerable care in constructing the test case.) When I used `cpio`, there was no `--null` option; double-dash options weren't part of SVR4 option notation, and the concept of `-print0` wasn't present in `find` either. But that's a _long_ time ago (mid-90s, for example, before Linux achieved dominance). – Jonathan Leffler Jan 06 '20 at 16:57
  • Thanks for the details, @Jonathan_Leffler. I love learning stuff here. Now I'm trying to remember what recursive-copy mistake I made that gave me problems with spaces -- there are plenty of ways I could have made that mistake. – bballdave025 Jan 06 '20 at 21:52
  • @bballdave025— One possibility is using `xargs` in the mix — it splits at white space — blanks, tabs, newlines. OTOH, I'm not sure how or why you'd do that. The GNU [`cpio`](https://www.gnu.org/software/cpio/manual/cpio.html#Copy_002dout-mode) manual on output mode is clear about one filename per line. The SVR4 user manual (printed 1990) is vaguer: _`cpio -o` (copy-out mode) reads the standard input to obtain a list of pathnames and copies those files onto the standard output together with the pathname and status information._ There's a chance, therefore, that it broke names at spaces. – Jonathan Leffler Jan 06 '20 at 22:10
19

rsync's -R option will do what you expect. It's a very feature-rich file copier. For example:

$ rsync -Rv src/prog.js images/icon.jpg /tmp/package/
images/
images/icon.jpg
src/
src/prog.js

sent 197 bytes  received 76 bytes  546.00 bytes/sec
total size is 0  speedup is 0.00

Sample results:

$ find /tmp/package
/tmp/package
/tmp/package/images
/tmp/package/images/icon.jpg
/tmp/package/src
/tmp/package/src/prog.js
Ryan Bright
3

rsync, of course! Tutorial here and here.

Or unison

Aif
1

Try...

for f in src/*.js; do cp $f /tmp/package/$f; done

so for what you were doing originally...

for f in `echo "src/prog.js images/icon.jpg"`; do cp $f /tmp/package/$f; done

or

v="src/prog.js images/icon.jpg"; for f in $v; do cp $f /tmp/package/$f; done
Jonathan Leffler
EMPraptor
  • 1
    You don't need `echo` or `$v` here. Also this method will fail if the corresponding directory does not exist in the destination. – Weijun Zhou Nov 20 '18 at 13:04
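
A slightly more defensive variant of that loop, per the comment above (a sketch only: it quotes the paths and creates each destination subdirectory before copying):

for f in src/prog.js images/icon.jpg; do
  mkdir -p "/tmp/package/$(dirname "$f")"   # ensure the matching subdirectory exists
  cp "$f" "/tmp/package/$f"
done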