I eventually created a solution for myself that is not EXACTLY what I was looking for but does solve the problem for me:
I added a call, in ALL of the scripts in the repo, to a function that checks whether the repo is in sync with the remote. If it is not, the function aborts the script, forcing the user to pull from the remote before running it. This way I know no one is running an out-of-date version of my scripts.
Here is the script I used (I put it in a file called GlobalFunctions.sh within the repo with all the other bash scripts):
#!/bin/bash
#------------------------------------------------------
# GlobalFunctions.sh
# Description:
# This file serves as a collection of global bash functions that can be shared by multiple scripts.
#-------------------------------------------------------------
#verify that the repository containing the scripts is up-to-date
function CheckScriptsValidity()
{
# Directory where this script is located
SCRIPT_DIRECTORY="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
# Directory the caller was in when the function was invoked
ORIGINAL_DIRECTORY=$(pwd)
cd "$SCRIPT_DIRECTORY"
git fetch # update the remote-tracking branch origin/main
if [[ -z $(git diff origin/main..HEAD) ]] # the current location (HEAD) matches the remote main branch
then
echo "Scripts Repo is up to date."
else
echo "Scripts Repo is not up to date with remote. Please pull Scripts repo from remote. Aborting..."
read -p "Press enter to continue..."
exit 1 # non-zero status; since this file is sourced, this also aborts the calling script
fi
cd "$ORIGINAL_DIRECTORY"
}
CheckScriptsValidity #Verify that the repository containing the scripts is up-to-date
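As a side note, git diff origin/main..HEAD only compares committed content, so it will not flag uncommitted local edits, and it also passes if two different commits happen to have identical trees. If you want a stricter check, you could compare commit hashes instead. This is just a sketch of an alternative (it assumes git fetch has already run, as it has in the function above), not what I actually use:

# Alternative check (sketch): require HEAD to be the exact same commit as origin/main
if [[ "$(git rev-parse HEAD)" == "$(git rev-parse origin/main)" ]]
then
echo "Scripts Repo is up to date."
else
echo "Scripts Repo is not at the same commit as origin/main. Aborting..."
exit 1
fi

Note that the hash comparison also fails when your local branch is ahead of the remote, which may or may not be what you want.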
And at the top of each bash script file I simply added this line:
source "$(dirname "$0")/GlobalFunctions.sh" #Load the script file so that we can call its functions (path defined based on this: https://stackoverflow.com/a/42101141/4441211)
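So, for example, a typical script in the repo now looks roughly like this (the script name and its body are just a made-up illustration; the important part is the source line, which runs CheckScriptsValidity immediately because GlobalFunctions.sh calls it at the bottom):

#!/bin/bash
source "$(dirname "$0")/GlobalFunctions.sh" # Aborts right here if the repo is out of date
echo "Repo is in sync, doing the actual work..."
# ... rest of the script ...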