$ touch source
$ echo dest.{000000..999999} | tr ' ' '\n' | while read dest ; do echo cp -v source $dest ; done
cp -v source dest.000000
cp -v source dest.000001
cp -v source dest.000002
cp -v source dest.000003
cp -v source dest.000004
cp -v source dest.000005
cp -v source dest.000006
cp -v source dest.000007
cp -v source dest.000008
cp -v source dest.000009
...

Well, this is going to take forever, mainly because each copy spawns a new cp process.

Let's try with xargs:

$ echo dest.{000000..999999} | xargs -n 1000 cp source
cp: target 'dest.000999' is not a directory

Right: when given multiple arguments, cp assumes the first n-1 arguments are source files and the last one is a destination directory.
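
For what it's worth, GNU cp has a -t/--target-directory option that flips the argument order, which is why the usual xargs + cp idiom works for the opposite problem (many sources, one directory) but not here, where every destination is a separate file. A sketch, assuming file.0 .. file.9 and backup/ exist:

$ echo file.{0..9} | xargs cp -vt backup/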

I need a command that works differently:

mycp source dest1 dest2 dest3 ...

How could I achieve this, without invoking a new process for each copy?

  • @Cyrus this is also a page for bash programmers, who make use of readily available tools without having to reinvent the wheel every Tuesday morning. Is the wheel already invented for this particular case? – blueFast Nov 17 '21 at 05:45
  • @Cyrus, not really, that would still call one process per entry. I have a solution which I am going to post below – blueFast Nov 17 '21 at 07:50
  • @Cyrus why faster? One process still needs to be run for each file – blueFast Nov 18 '21 at 08:14

1 Answer


(based on the suggestion by Cyrus)

This works:

function multi-cp () {
  local source="$1"
  shift
  # tee writes its stdin to every file named in "$@";
  # its stdout (a copy of the input) is discarded.
  tee "${@}" < "$source" > /dev/null
}
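
For illustration, a quick sanity check (with made-up destination names):

$ touch source
$ multi-cp source dest.a dest.b dest.c
$ ls dest.*
dest.a  dest.b  dest.c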

# xargs -n 1000 regroups the names into lines of 1000 each; $destinations
# is deliberately left unquoted so each line splits back into words.
echo dest.{000000..999999} | xargs -n 1000 | while read -r destinations ; do
    multi-cp source $destinations
done

We use a while loop because xargs cannot call shell functions directly (there are workarounds, but they have other problems). We still use xargs to split the arguments into manageable chunks.
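
One such workaround, for the record: export the function so that the child bash started by xargs can see it. This is bash-specific, and it costs an extra bash process per chunk, which is one of the problems alluded to above. The trailing _ fills $0 of the child shell, so all the generated names land in "$@":

export -f multi-cp
echo dest.{000000..999999} | xargs -n 1000 bash -c 'multi-cp source "$@"' _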

This assumes that the destination names contain no spaces (which is the case here, since we are in control of them).
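
If the names could contain whitespace, a null-delimited variant of the same idea would sidestep the word splitting entirely (a sketch, with tee inlined so no function export is needed):

printf '%s\0' dest.{000000..999999} |
    xargs -0 -n 1000 bash -c 'tee "$@" < source > /dev/null' _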
