I am using a piece of desktop software that does some fairly complex work over numerous files, and using it with the files opened from Dropbox has caused severe data loss in the past (because Dropbox syncs while the software is running). The files (some text, some binary) are critically important, so I want to protect them from damage as much as possible. I also need to use the files from multiple computers and want the ability to go back in history in case they are corrupted, so git seems like a perfect fit for this. But I'm fairly new to git itself.
Note: Any given user will only be working on their own files, so there is no scenario in which multiple users would push to a single repo. Each user would have their own files tracked in their own repo in their own instance of Dropbox. The entire point is to have this capability unique to each user so their files are essentially backed up and version controlled via git and Dropbox.
After reading a bit (especially the top two answers here, this great discussion, and this article on bare vs. non-bare repos), it seems there are a few different approaches, each with plenty of pro and con discussion, and I'm wondering which approach below would be best in this specific single-user situation?
Basically, the options I see are: use Dropbox as a single-user git "server" by having it host a bare repo that I clone locally each time, or skip the metaphorical server and just use Dropbox as a file server holding a non-bare git repo that I copy (in full) to and from each time.
Option 1: Bare repo in Dropbox, script clones it to a local working directory: `c:\dropbox\master.git` is a bare repo, and the script does a `git clone c:\dropbox\master.git c:\working`, runs the software against the checked-out files in `c:\working`, then does a `git add`, a `git commit`, and a `git push origin master`, deletes `c:\working`, and it's done.
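For reference, here's a minimal sketch of the Option 1 flow as POSIX shell commands rather than the final PowerShell, using temporary directories as stand-ins for `c:\dropbox\master.git` and `c:\working` (the paths, commit message, and identity are placeholders):

```shell
# Option 1 sketch: bare repo as the "server", fresh clone each session.
set -e
TMP=$(mktemp -d)
DROPBOX="$TMP/master.git"   # stands in for the Dropbox-synced bare repo
WORKING="$TMP/working"      # stands in for c:\working

git init --bare "$DROPBOX"        # one-time setup of the bare repo

git clone "$DROPBOX" "$WORKING"   # fresh working copy for this session

# ... run the software against the files in $WORKING here ...
echo "edited by the software" > "$WORKING/data.txt"

git -C "$WORKING" add -A
git -C "$WORKING" -c user.name=user -c user.email=user@example.com \
    commit -m "session snapshot"
git -C "$WORKING" push origin HEAD:master   # pushing HEAD:master sidesteps
                                            # the local default-branch name
rm -rf "$WORKING"   # the clone is disposable; history lives in the bare repo
```

The `push` is the step that actually persists the session back into the Dropbox-synced repo, so it's the one thing that must complete before the Dropbox client is restarted.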
Option 2: Non-bare repo in Dropbox, script copies it to a local working directory: After copying the entire `c:\dropbox\master` directory to `c:\working`, it runs the software against the files in `c:\working`, then does a `git add` and a `git commit`, then overwrites `c:\dropbox\master` with the contents of `c:\working`, deletes `c:\working`, and it's done.
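The Option 2 flow looks like this as a shell sketch, again with temporary stand-ins for `c:\dropbox\master` and `c:\working`:

```shell
# Option 2 sketch: non-bare repo copied out, worked on, and copied back.
set -e
TMP=$(mktemp -d)
MASTER="$TMP/master"    # stands in for the Dropbox-synced non-bare repo
WORKING="$TMP/working"  # stands in for c:\working

git init "$MASTER"      # one-time setup of the non-bare repo

cp -R "$MASTER" "$WORKING"   # copy the whole repo, .git directory included

# ... run the software against the files in $WORKING here ...
echo "edited by the software" > "$WORKING/data.txt"

git -C "$WORKING" add -A
git -C "$WORKING" -c user.name=user -c user.email=user@example.com \
    commit -m "session snapshot"

rm -rf "$MASTER"    # the "overwrite" step: replace the master copy
mv "$WORKING" "$MASTER"
```

The `rm -rf` plus `mv` pair is the "overwrite" step; a mirror-style copy in the real PowerShell version would presumably play the same role.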
The final PowerShell script will also shut down the Dropbox client before it does anything and start it back up after it commits (option 1) or copies (option 2) back to the master in Dropbox. That part is actually easy and guarantees Dropbox doesn't change any files while the local repo is being worked on. And since this will be a single-user process, I (or whoever) can wait a few minutes for Dropbox to finish syncing on first boot.
Option 1 seems more "git-ish", but given the single-user scenario, option 2 may be a bit simpler (only one repo to deal with instead of two). I'm just not sure if there are any "gotchas" I'm missing here.
Thanks for any tips/ideas/etc.