We have a machine learning and analytics platform that lets users build and combine data pipelines, uni/multivariate analyses, and imported Python models. These are stored as config files which can be exported as JSON or text. Instead of forcing users to manually export files and upload/commit them to the remote repo, is it possible to install the Git client on the server where the application and config files are hosted, so that updated files are committed and pushed to the remote repo automatically?
Hi, did you get a chance to check out the workarounds below? How did it go? – Levi Lu-MSFT May 26 '20 at 09:13
1 Answer
Yes, it is possible to automatically commit and push file changes to a remote repo.
If the application is hosted on a Linux server, you can use inotifywait to watch the config files for changes. You will need a personal access token (PAT) to push to an Azure repo over HTTPS.
inotifywait -q -m -e close_write --format="git add %w && git commit -m 'auto commit' %w && git push https://PersonalAccessToken@dev.azure.com/path/to/azurerepo --all" <<file>> | bash

Here %w expands to the path of the watched file, and <<file>> is a placeholder for the file you want to watch.
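If you are watching a whole directory of config files rather than a single file, a small wrapper script is easier to maintain than the one-liner. Below is a minimal sketch; the repo path, remote URL, and commit message are assumptions you would adapt to your setup:

#!/usr/bin/env bash
# Watch a config directory and push every saved change.
# REPO_DIR and the remote URL are placeholders for your setup.
REPO_DIR="/path/to/configRepoDirectory"
cd "$REPO_DIR" || exit 1

# -m keeps inotifywait running; -r also watches subdirectories.
# %w%f expands to the full path of the file that was written.
inotifywait -q -m -r -e close_write --format '%w%f' "$REPO_DIR" |
while read -r changed_file; do
    git add "$changed_file"
    # Commit and push only if something was actually staged.
    if ! git diff --cached --quiet; then
        git commit -m "auto commit: $changed_file"
        git push https://PersonalAccessToken@dev.azure.com/path/to/azurerepo --all
    fi
done

Running this under a process supervisor such as systemd keeps the watcher alive across reboots.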
For a Windows system, you can write a batch file containing the git commands and use a scheduled task to run it automatically, or use Directory Monitor as mentioned in this thread. See the example batch file below:
cd c:\gitRepoDirectory\
REM Identify the committer (only needed once per machine).
git config --global user.email "you@example.com"
git config --global user.name "username"
REM Stage, commit, and push everything that changed.
git add .
git commit -m "auto commit"
git push https://PersonalAccessToken@dev.azure.com/path/to/azurerepo --all
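To run the batch file on a schedule, the built-in schtasks command can register it as a scheduled task. A minimal sketch, assuming the script above was saved as C:\gitRepoDirectory\autocommit.bat (the task name and five-minute interval are arbitrary choices):

schtasks /create /tn "AutoCommitConfigs" /tr "C:\gitRepoDirectory\autocommit.bat" /sc minute /mo 5

You can achieve the same thing interactively through the Task Scheduler GUI if you prefer.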
Levi Lu-MSFT