
I am trying to run a script which will watch different folders and files and, if they change, execute a command. One of the folder paths accepts a dynamic variable which changes based on what is entered when running the script in the CLI; the other one won't change.

Depending on the file path, however, a different action needs to be taken.

#!/usr/bin/env bash

THEME=$1

while true;
do
  echo "inside find"
    find code/themes/$THEME/src -name '*.html' -o -name '*.ts' -o -name '*.scss' | entr -d rsync -avh code/themes/$THEME/*  ./
    find code/base/src -name '*.html' -o -name '*.ts' -o -name '*.scss' | entr -d rsync -avh code/base/*  ./
done

What is happening when I run the script is that it only executes and places a watch on the first find (in the code pasted above, only the dynamic path).

Currently my problem is:

  • Only the first find gets called, and a watch is placed only on the code/themes/$THEME/src paths
  • The second find never gets executed

How do I get both to run at once? Or, how can I write it all in one line?

Also, I have never written scripts or used bash before, so if you have any advice on refactoring or are able to help with my problem, please be gentle with technical explanations.

My main goal is:

  1. I want to set up a watch for any changes in path code/themes/$THEME/src
  2. I want to set up a watch for any changes in path code/base/src
  3. I want both watchers to be running at the same time
Kirsten
  • `entr` is apparently from [this answer](/a/38230427/874188) which you were directed to [yesterday.](/questions/56846799/ionic-3-custom-watch-script-for-ionic-watch-live-reload) – tripleee Jul 03 '19 at 05:45
  • I'm not familiar with `entr` but if I'm reading this correctly, it runs the first `find` and `entr` pipeline, then runs the second `find` and sits there waiting for any of the listed files to change. – tripleee Jul 03 '19 at 05:52
  • As an aside, you should use double quotes around `$THEME`, or actually use a lowercase variable name instead (upper case is reserved for system use). Even if you never expect the argument to contain shell metacharacters, the error message if you try to pass in an argument which requires quoting will be bewildering when you use it without quotes. See further [When to wrap quotes around a shell variable?](/questions/10067266/when-to-wrap-quotes-around-a-shell-variable) – tripleee Jul 03 '19 at 05:55
  • @tripleee yes I'm playing around with entr after your mention of a similar issue to my other question. Now facing this issue on multiple finds with different commands based on path. Thank you for your advice on the syntax conventions. I will implement that. – Kirsten Jul 03 '19 at 06:09
  • Perhaps you should explain in more detail what you hope should happen. Is there a reason these are executed sequentially? What if you had two `while` loops running independently from each other? – tripleee Jul 03 '19 at 06:11
  • @tripleee In regards to your comment about "it runs the first find and entr pipeline, then runs the second find and sits there waiting for any of the listed files to change": it is only running the first find, and when I make changes to any files within that first find it waits and sees the file changes, but any changes in the second find's folders never get registered. – Kirsten Jul 03 '19 at 06:13
  • Because it starts watching for changes when you run it, isn't that it? So any previous changes are ignored because they happened before you ran the second `find`. – tripleee Jul 03 '19 at 06:19
  • And the request to clarify what you are hoping to accomplish still stands. You want to run `rsync` periodically but only if files actually changed? You know you can run `rsync` so that it does nothing if the remote files are up to date? (Still initiates a network connection to the remote etc, of course.) – tripleee Jul 03 '19 at 06:23
  • @tripleee basically I want both of the find commands to be executed when the script runs, so they are running simultaneously. At the moment, the first find is running as it is supposed to, but the second find never gets actioned. I have tried using two while loops after your suggestion but am experiencing the same result, as the first while loop just keeps running, never actioning the second. I'm not familiar with the rsync functionality you mentioned. This is my first foray into scripting. – Kirsten Jul 03 '19 at 06:27

1 Answer


If your task is simply to make sure the downstream copies of the two (or 1 + number of themes) directories are always up to date, a cron job which syncs them every n minutes might be the simplest and most robust solution. cron has a per-minute granularity, though something like every five minutes may well be sufficient for your use case. rsync by design does not copy files which are already up to date, so it should be reasonably quick to execute in the no-op case.

* * * * * bin/rsyncthemes

(This is once per minute; on Linux you can say */5 in the first field to run every 5 minutes, but this is not portable to all cron dialects.) ... where rsyncthemes contains something like

#!/bin/sh
# Cron jobs always start in your home directory,
# so put an absolute path here: the directory
# where you want rsync to copy these files.
cd somewhere || exit 1

for dir in code/themes/one \
    code/themes/other \
    code/base
do
    rsync -avh "$dir"/*  ./
done

Obviously this script needs to be marked executable (chmod a+x rsyncthemes) and saved in the directory we specify in the cron job (viz. bin in your home directory).
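As a minimal setup sketch (assuming the standard crontab command is available on your system):

mkdir -p ~/bin
chmod a+x ~/bin/rsyncthemes   # make the script executable
crontab -e                    # open your crontab in an editor and add the line:
                              #   * * * * * bin/rsyncthemes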

If you are hell-bent on using entr to avoid running rsync needlessly, though, I'm guessing you want something like

for dir in code/themes/one \
    code/themes/other \
    code/base
do
    while true; do
        find "$dir"/src -name '*.html' -o -name '*.ts' -o -name '*.scss' |
        entr -d rsync -avh "$dir"/*  ./
    done & # <-- run each loop as a background job
done

In some more detail, we are running three while true loops as background jobs, one for each of the directories enumerated in the for loop. They are running in parallel with your interactive shell, and will stop running when you log out (unless you separately set them up to stay running with something like nohup). The & operator is what runs something in the background, and putting it after the while loop's done statement makes it apply to the entire loop.
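To make the job-control side of this concrete, here is a minimal sketch (the file name watchthemes.sh is just a placeholder, not part of the script above): if you paste the loop at your interactive prompt, each background loop shows up in jobs and can be stopped with kill; if you put it in a script instead, nohup keeps it running after you log out.

jobs                       # list the background loops started from this shell
kill %1 %2 %3              # stop background jobs 1-3 (one per watched directory)

nohup ./watchthemes.sh &   # alternative: run the loop from a script that survives logout
tail -f nohup.out          # nohup appends the script's output here by default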

I'm guessing you may need to use entr -n in order for the jobs to run in the background.

The wildcard in the rsync command gets expanded by the shell, so you might need to refactor this to not use a wildcard, or to evaluate the wildcard only after find finishes, if you expect it to be possible for new files to appear in these directories occasionally. (Quoting the entire rsync command and passing it with entr -s should trivially accomplish this, if I'm reading the entr man page correctly.)
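A minimal sketch of that variant, assuming entr's -s flag behaves as its man page describes (and adding -n per the guess above, since the loops run in the background):

for dir in code/themes/one \
    code/themes/other \
    code/base
do
    while true; do
        find "$dir"/src -name '*.html' -o -name '*.ts' -o -name '*.scss' |
        entr -d -n -s "rsync -avh $dir/* ./"  # the glob stays inside the quoted string, so it is
                                              # expanded by the shell entr spawns each time the
                                              # command actually runs, picking up any new files
    done &
done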

tripleee
  • I implemented the solution with entr and it is working! Thank you! I will look into some of the things you mentioned and apply them once I play around with it. Thanks again. – Kirsten Jul 03 '19 at 07:28