I'm using the script .git/hooks/pre-commit to take snapshots of a MySQL database along with other project files. Here are the contents:
#!/bin/bash
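# Refresh the SQL dump; --skip-extended-insert writes one INSERT per row so the dump diffs cleanly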
mysqldump -u user -ppassword --skip-extended-insert dbname > /path/to/repo/dbname.sql
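# Move into the repo and stage the refreshed dump so it is included in the commit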
cd /path/to/repo/
git add /path/to/repo/dbname.sql
If I run:
git commit -am "Message"
A new dump file is created, but then git tells me:
nothing to commit (working directory clean)
Then, if I run the identical commit command again, it works:
1 file changed, 1 insertion(+), 1 deletion(-)
Then, if I run the identical command again, I'm back to "nothing to commit". The pattern continues indefinitely: works, doesn't work, works, doesn't work. It isn't time-based (I've tried committing immediately and after waiting 10 minutes). mysqldump is definitely updating the .sql file on every run (it writes a timestamp in hh:mm:ss format near the end of the file).
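In case it helps, I could add temporary logging at the end of the hook to capture what the index looks like each time it fires; something like this (the log path is just an example):
# Temporary diagnostics: record the staged/working-tree state on each hook run
{
  date
  git status --porcelain
  git diff --cached --stat
} >> /tmp/pre-commit.log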
If I take the script and run it line by line at the command line (including first cd'ing into the hooks directory, to simulate running the script as closely as possible), there is a new commit every time. If I put it in an executable file in the hooks directory, or in the repo root, execute it from the command line, and then run the commit command, it also works every time. The every-other-time failure only happens when the script is run from within pre-commit.
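Since the only variable seems to be whether the script runs from inside pre-commit, I could also dump the environment in both cases and diff the results (the file names are arbitrary):
# Inside the hook:
env | sort > /tmp/env-from-hook.txt
# From an interactive shell in the repo:
env | sort > /tmp/env-from-shell.txt
diff /tmp/env-from-hook.txt /tmp/env-from-shell.txt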
What could be causing this weird behaviour?