I have a little script that watches files for changes using inotifywait. When something changes, a batch of files is sent through a process (compiled, compressed, reorganised, etc.) that takes about ten seconds to run.
Consider the following example:
touch oli-test
inotifywait -mq oli-test | while read EV; do sleep 5; echo "$EV"; done
If you run touch oli-test in another terminal a few times, you'll see that each loop iteration completes before the next one starts. That scenario is very real to me: if I forget to save a file until a batch is already processing, or notice a mistake, the events stack up and I'm left waiting for minutes.
It strikes me that there are two techniques that would make this workflow objectively better. I'm not sure which is easiest or best, so I'm presenting both:
1. Interrupt the previous run and restart immediately. The scripted process is currently just an inline set of commands; I could break them out into Bash functions, but I'm not wild about factoring them out any further than that.

2. Debounce the list of things waiting to be processed, so that if five events happen at once (or while a run is already in progress), it only runs once more.
(Or both... because I'm certain there are cases where both would be useful)
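To make the two ideas concrete, here is a minimal bash sketch combining both: queued events are drained through a short settle window (debounce), and a still-running batch is killed before the next one starts (interrupt). The `process_files` function is a hypothetical stand-in for the real compile/compress pipeline, and the fractional `read -t` timeout needs bash 4 or later; treat this as an outline, not a drop-in solution.

```shell
#!/usr/bin/env bash

process_files() {
    sleep 1                          # stand-in for the ~10s pipeline
    echo "processed"
}

handle_events() {                    # reads one event per line on stdin
    local pid=0
    while read -r _event; do
        # Debounce: drain any events that queued up while we were busy,
        # and keep draining through a short settle window so a burst of
        # saves collapses into a single run.
        while read -r -t 0.3 _event; do :; done

        # Interrupt: if the previous run is still going, kill it.
        if [ "$pid" -ne 0 ]; then
            kill "$pid" 2>/dev/null
        fi
        process_files &
        pid=$!
    done
    wait
}

# Real usage would look something like:
#   inotifywait -mq oli-test | handle_events
# Demo: three back-to-back events collapse into one run.
printf 'ev1\nev2\nev3\n' | handle_events
```

One caveat with the interrupt half: backgrounded functions run in a subshell, so `kill` stops the subshell but not necessarily every child it spawned; a real version might want a process group or a `trap` inside `process_files` to clean up.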
I am also open to approaches that are different from inotifywait, but they need to give me the same outcome and work on Ubuntu.
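One alternative that may be worth a look is entr (packaged on Ubuntu as `entr`): it watches the files piped into it and reruns a command when any of them change, waiting for the command to finish before running it again. My understanding is that it coalesces changes that arrive mid-run, though that is worth confirming against its man page. A hypothetical invocation, assuming the batch steps are collected into a `./build.sh` script:

```shell
# Rerun the build whenever the watched file changes.
# ./build.sh is a hypothetical wrapper around the existing commands.
echo oli-test | entr ./build.sh
```

entr also has a `-r` flag that restarts a long-running command on each change, which is close in spirit to the interrupt-and-restart idea.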