
We're using Git to synchronize our servers (dev and live) with our development machines. Developers upload via SFTP to the dev server to test features and changes. Then, when everything is finished, the work is committed, pushed to the Git repository, and finally pulled on the server.

The problem is that

git commit -a -m 'Message'   # on local machine
git push origin master       # on local machine
git pull origin master       # on server

on the server almost always won't simply merge the files and delete or ignore untracked ones. Git thinks there are local changes and advises us to commit them, but of course we don't want to! For example, after the cleanup and reset steps below there remain two untracked folders which it suggests we should add?!

We tried

git reset --hard HEAD
git clean -f
git fetch

And read through many git guides. But it does not work as it did with

svn up --force

I was advised to sync the servers from time to time as there could be inconsistencies from failed uploads etc. What's wrong? Can't we work with Git this way? It's too much work to clean up the servers' directories all the time.

Edit: I should add that we can't test locally because the web application depends strongly on the domain and needs a complex setup (Solr).

Would this be better via NFS share instead of FTP?

Powerriegel

1 Answer


You are using Git incorrectly. You shouldn't have to FTP files to a server to test; you should have your Git repo handling version control fully, with your developers submitting completed features via commits only.

We use this setup very effectively:

  • The Git repo sits on the test server (not in the working web folder).

  • Developers commit to the Git repo on whatever branch they're working on.

  • If the commit branch was master, we use a post-receive hook to force the test server to pull the latest changes into the working web folder (a sketch of such a hook follows this list).

  • Then when we deploy on our production server, we force the production working web folder to do a pull from master once all testing has been completed.
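
For reference, a minimal sketch of such a post-receive hook, assuming the bare repo lives at /home/git/project.git and the working web folder is /var/www/project (both paths are just examples, adjust them to your layout):

#!/bin/sh
# hooks/post-receive in the bare repo on the test server.
# For each pushed ref, check whether it was master and, if so,
# force-check-out the latest master into the working web folder.
while read oldrev newrev refname
do
    if [ "$refname" = "refs/heads/master" ]; then
        git --work-tree=/var/www/project --git-dir=/home/git/project.git checkout -f master
    fi
done

Remember to make the hook executable (chmod +x hooks/post-receive), otherwise Git will silently skip it.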

This is called 'push-to-deploy' and works extremely well.

This way you have full control over what goes up, which branch the test server is working with and can ensure that only fully tested code is deployed to your production server.
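
As a rough outline of the setup (again with example paths and hostnames, not a prescription): the repo on the test server is a bare repo, developers push to it, and the production working folder pulls once testing is complete.

# on the test server: create a bare repo outside the web root
git init --bare /home/git/project.git

# on each developer machine: add the test server as a remote and push finished work
git remote add test ssh://git@testserver/home/git/project.git
git push test master

# on the production server, inside the working web folder, once testing is done
# (assumes its 'origin' points at the test server's repo)
git pull origin master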

Edit: here's a good answer on how to go about setting it up. It's related to Node.js, but the principles are the same.

DevDonkey
  • You said developers upload to a different folder than the repo. But how do you then sync both folders? On the live server there will be no upload, just git pull; that should work. Our problem is only the test/dev server. – Powerriegel Mar 18 '15 at 10:37
  • No, I said that developers commit to the repo; then, if their commit branch was master, a post-receive hook triggers a pull from master on the test server, which updates the files. – DevDonkey Mar 18 '15 at 10:40
  • Okay. We haven't used branches until now. I don't really understand why we need a custom script to do what we want. Shouldn't Git be able to do this out of the box, as SVN did? – Powerriegel Mar 18 '15 at 10:47
  • You don't use branches? Then you're not using Git properly at all. – DevDonkey Mar 18 '15 at 10:49
  • I know, but I first want to solve this problem before using branches. They add some extra complexity. – Powerriegel Mar 18 '15 at 10:51
  • Is it true that Git is unable to merge when not using branches? We've worked around the problem now by using rsync from a dedicated Git folder into the web root. That seems to work nicely. – Powerriegel Mar 23 '15 at 11:04