
I have the following script, which I put together after reading "Execute command on all files in a directory":

find /home/user/test/* -maxdepth 1 -type f -name '*.conf' -exec /home/user/program --config "{}" \;

I have a bunch of .conf files in /home/user/test/ and I would like the program to run on all of them at once. The program simulates network traffic, so it runs continuously until stopped with Ctrl+C.

If I manually open new screen sessions and run the command on each .conf file separately, it works fine. However, I am trying to figure out a way to run the command only once, on all the .conf files.

The script I wrote should apply to all the files, but it only seems to load one. Any idea what I am doing wrong?

2 Answers


The program is to simulate network traffic so runs constantly until stopped using Ctrl+C.

There's your problem. find does not do parallelism or run anything in the background; it runs the program and waits for it to complete before moving on to the next file.
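This is easy to verify. A minimal sketch, using sleep as a stand-in for the long-running program, shows that -exec waits for each command before starting the next:

```shell
# Three files, each "job" sleeps for 1 second. Because -exec runs
# serially, the whole find takes about 3 seconds, not 1.
tmp=$(mktemp -d)
touch "$tmp/a" "$tmp/b" "$tmp/c"
time find "$tmp" -type f -exec sleep 1 \;
rm -rf "$tmp"
```

With a program that never exits on its own, the first -exec simply never returns, which is why only one instance ever starts.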

One option is to invoke the command indirectly through an sh shell command line, which allows you to put it in the background with &:

find ... -exec /bin/sh -c '/home/user/program --config "$1" &' sh {} \;

The drawback is that you'll have to manually find and kill each spawned program if you want to stop them, or use something like killall.
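To stop them all in one shot, pkill -f can match on the shared command line. A minimal sketch, with sleep standing in for the long-running program (with the real program, the pattern would be something like '/home/user/program --config'):

```shell
# Two background "sleep" jobs stand in for the traffic simulators:
sleep 300 &
sleep 300 &

# pkill -f matches against the full command line, so a single
# invocation terminates every instance spawned by the find loop:
pkill -f 'sleep 300'
```

killall works similarly, but matches only on the executable name (e.g. `killall program`) rather than the full command line.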

Another option would be GNU parallel, which does give you better control over your jobs.
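Assuming GNU parallel is installed, a sketch of the equivalent pipeline (paths as in the question; -j0 lifts the default one-job-per-core limit so every simulator starts at once):

```shell
# -print0 / -0 keep filenames with spaces or newlines intact;
# parallel starts one instance of the program per .conf file:
find /home/user/test -maxdepth 1 -type f -name '*.conf' -print0 |
  parallel -0 -j0 /home/user/program --config {}
```

Unlike the sh -c trick, parallel keeps the jobs attached to the foreground pipeline, so a single Ctrl+C stops all of them.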

A third option, if you can modify /home/user/program, is to implement a flag like --background or --daemonize which tells it to go into the background straight after launch.

Thomas
  • Gotcha, thanks for your help! Fortunately the program already has a daemonize flag which I can apply in the find script. If I apply this, it should run in the background and can be killed and restarted all at once, correct? – Davearoo Oct 14 '22 at 12:16
  • I would expect so, but it depends on the program. – Thomas Oct 14 '22 at 12:22

find looks in all of the paths listed as arguments. Commonly only one directory is given, but it is valid to specify several. For example, find . -print will list all names in and under the current working directory, and find alice bob -print will look in and under both alice and bob. If either alice or bob is itself the name of a regular file, that name will simply be printed.

In your case, you are using the shell to expand a * glob and executing find on all of the names that match /home/user/test/*. That glob does not match any names whose final component starts with . (a.k.a. "dotfiles" or "hidden files"), so you will not see any of those. You want to omit the glob and give a single path argument to find:

find /home/user/test -maxdepth 1 ...

so that find will look at everything in and under /home/user/test.
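The dotfile difference is easy to demonstrate. A minimal sketch (note that find's own -name '*.conf' pattern, unlike the shell glob, does match hidden names):

```shell
mkdir -p demo
touch demo/a.conf demo/.hidden.conf

# Glob form: the shell expands demo/* first and skips dotfiles,
# so find only ever sees demo/a.conf:
find demo/* -maxdepth 1 -type f -name '*.conf'

# Directory form: find scans the directory itself, dotfiles included,
# so both .conf files are found:
find demo -maxdepth 1 -type f -name '*.conf'
```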

William Pursell
  • Thanks for the details. I modified the script as below, however I am still having the same trouble: it only seems to pick one of the .conf files and runs the program on that one, not the others in the directory. `find /home/user/test -maxdepth 1 -type f -name '*.conf' -exec /home/user/program --config "{}" \;` – Davearoo Oct 14 '22 at 11:52
  • In the question you say "run on all of these at once". Are you expecting these to run in parallel? With `find`, they will execute serially. You probably want to use a tool like `parallel`, but perhaps it is sufficient to do `find ... -exec sh -c '/home/usr/program --config {} &' \;` or similar. – William Pursell Oct 14 '22 at 11:56
  • Yeah, I need them to run in parallel. I just tried your modified find command and it is now working perfectly with `find ... -exec sh -c '/home/usr/program --config {} &' \;`. One last query: now that the script has executed and is running in the background, how would I be able to stop it? – Davearoo Oct 14 '22 at 12:06