
I'm setting up a multi-user, multi-server environment. All developers will use Git and clone various repos from GitHub etc. (in one account I control).

Now, how do I get files from GitHub to the servers (about 5 of them)?

First I was thinking of some sort of automated way to push updates from GitHub to the servers, but now I'm thinking I might prefer to run something on the command line of each server to 'pull' the updates from GitHub. This option doesn't seem to be the usual way of doing it, though.

Can I not just install Git on a Linux server (the one I use for serving web files) and then use git pull to pull the files in (i.e. just like on any computer used for development)?

How do most people get GitHub files to their web servers?

user2143356
  • Can you clarify what you mean by files? Are there files in a specific repo on github that you want? Where do you want to put them? What are they for? – Ben Whaley Mar 21 '14 at 21:22

1 Answer


How do most people get GitHub files to their web servers?

Generally by pushing to a bare repo on the web server, with a post-receive hook that checks the pushed content out into the site's working tree, similar to:

git --git-dir=/path/to/bare_repo.git --work-tree=/path/to/website/httpdocs checkout -f
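Wrapped in a complete hook, that checkout command might look like the sketch below. The two paths are placeholders to adapt to your server layout; the script goes in the bare repo's hooks/ directory and must be executable.

```shell
#!/bin/sh
# post-receive hook, saved as /path/to/bare_repo.git/hooks/post-receive
# (chmod +x it). Paths below are placeholders for this sketch.
GIT_DIR=/path/to/bare_repo.git
WORK_TREE=/path/to/website/httpdocs

# Force-checkout whatever was just pushed into the site's document root.
git --git-dir="$GIT_DIR" --work-tree="$WORK_TREE" checkout -f
```

After this hook is in place, a plain `git push webserver master` from a developer machine updates the live files.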

You can also pull from the bare repo itself, through a cron job for instance:

*/10 * * * * user /home/usern/git-pull-requests/fetch.sh

Either pulling or pushing, however, means that Git is installed on the server.

If you don't want that, you can use git archive to create an archive (zip or tar) of the repo content, copy that over to the server, and have a custom script uncompress it there.
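That workflow might be sketched as below, run from a machine that does have Git; the host name and target path are placeholders.

```shell
# Build a tarball of the current branch's tree (no .git metadata),
# then copy it to the server and unpack it into the document root.
# "user@webserver" and the target path are placeholders in this sketch.
git archive --format=tar.gz -o site.tar.gz HEAD
scp site.tar.gz user@webserver:/tmp/
ssh user@webserver 'tar -xzf /tmp/site.tar.gz -C /path/to/website/httpdocs'
```

The server itself only needs tar, not Git, which keeps the web host's footprint minimal.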

VonC