
I'm trying to optimize a compression service (my own) on a 104-CPU machine.

In order to do this, I'm splitting up video files with the following:

ffmpeg -i test.MOV -threads 3 \
       -vcodec copy -f segment -segment_time 00:05 \
       -reset_timestamps 1 \
       out%02d.MOV

Then I'm compressing each one:

for f in ./*MOV; do ffmpeg -i "$f" "./compressed/${f##*/}"; done

But for this to be efficient I need to process the files concurrently, since FFmpeg seems to cap out at 2-3 threads per process.

I tried the following, but it doesn't work:

for f in ./*MOV; do (trap 'kill 0' SIGINT; ffmpeg -i "$f" "./compressed/${f##*/}"); done

How can I do this in bash?

Oliver Dixon
  • You can do it with xargs or GNU parallel, see this answer (and the one before it) https://stackoverflow.com/a/44124618/3833426 – John Dec 10 '22 at 15:17
  • @John thanks for the pointer, can't seem to get this to work inside this loop though – Oliver Dixon Dec 10 '22 at 15:20
  • @OliverDixon - how many files do you need to process? Will the 104 cores always be enough for all files in parallel, or do you need to cap execution at a fixed number of concurrent processes? The ChatGPT solution works, but you can probably get more mileage using xargs. – dash-o Dec 10 '22 at 15:45

1 Answer


Using xargs parallel execution, it is possible to achieve the above without having to build job control in bash (e.g., checking [[ $(jobs -p | wc -l) -ge $parallel_processes ]] in a loop).

ls ./*MOV | xargs -P4 -L1 sh -c 'ffmpeg -i "$0" "./compressed/${0##*/}"'
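
Note that parsing ls output breaks on filenames containing whitespace. A null-delimited variant is more robust (a sketch, assuming GNU find and xargs):

find . -maxdepth 1 -name '*MOV' -print0 |
    xargs -0 -P4 -n1 sh -c 'ffmpeg -i "$0" "./compressed/${0##*/}"'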

Also, xargs will take care of properly cancelling outstanding jobs (e.g., on Ctrl-C or similar).
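
For reference, the manual job control this replaces would look roughly like the sketch below in plain bash (assuming bash 4.3+ for wait -n; -nostdin stops a backgrounded ffmpeg from grabbing terminal input):

parallel_processes=4
for f in ./*MOV; do
    # Throttle: block while the maximum number of jobs is already running.
    while [[ $(jobs -p | wc -l) -ge $parallel_processes ]]; do
        wait -n    # wait for any one background job to finish
    done
    ffmpeg -nostdin -i "$f" "./compressed/${f##*/}" &
done
wait    # let the last batch finish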

You can get fancier behavior with GNU parallel - limiting concurrency based on actual system load, etc.
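
For example (a sketch, assuming GNU parallel is installed; {/} is its basename replacement string, and --load holds off new jobs while system load is high):

parallel --jobs 4 --load 90% ffmpeg -i {} ./compressed/{/} ::: ./*MOV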

dash-o