
I'm trying to find a simple way to do a git push from all repositories in a folder simultaneously. The reason is that I have a /var/git with bare repos which I push to over SSH, and on another machine a ~/git with normal repos. Here's what I want to have happen:

  • On my laptop: duncan@laptop:~/git/someproject> git push [url is server:/var/git/someproject]
  • On my laptop: duncan@laptop:~/git/anotherthing> git push [url is server:/var/git/anotherthing]
  • On my server: duncan@server:/var/git> [...push all repos in this folder...]

And that should push all the repos in that folder to their respective remotes. The same goes for fetch, in the other direction.

Dessa Simpson
  • I haven't really. I had hoped Git had something built into it (It seems it would be a common usage) - Git's distributed layout is one of its defining concepts, so you'd think they'd have an easy way to keep a middleman automatically updated to origin. – Dessa Simpson Feb 07 '16 at 04:50

3 Answers


I had hoped Git had something built into it (It seems it would be a common usage)

Not exactly, as Git manages a repo, not repos.

There is, however, one configuration in which Git technically manages multiple repos.

On your server, add your repos as submodules in a separate folder.

# Create a superproject whose submodules are the existing bare repos
git init /var/git2
cd /var/git2
git submodule add -b master -- ../git/repo1.git
git submodule add -b master -- ../git/repo2.git
...
git commit -m "add submodules"
# Give the superproject its own bare remote so that it can be pushed
git init --bare ../git2bis.git
git remote add origin ../git2bis.git
git push -u origin master

Then, with git 2.7+ on your server (see "Git submodule push"):

cd /var/git2
git config push.recurseSubmodules on-demand

Then a git submodule update --recursive --remote followed by a git push would push all submodules with new commits.
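
For concreteness, that sequence might look like the following on the server. This is only a sketch; the intermediate commit of the updated submodule pointers is my assumption, since push.recurseSubmodules=on-demand only pushes submodules whose new commits are referenced by the superproject commits being pushed:

cd /var/git2
# Check out the latest commit of each submodule's tracked branch
git submodule update --recursive --remote
# Record the new submodule commits in the superproject (assumed step)
git commit -am "bump submodules"
# With push.recurseSubmodules=on-demand, this also pushes any submodule
# whose new commits the pushed superproject commits reference
git push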

As you can see, the native way is convoluted, and involves duplicating your bare repos on your server as non-bare clones.

It is better to have a script that loops over the bare repos and pushes when new commits are detected (as in "git ls-remote has different sha1 as the current heads").
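
A minimal sketch of such a script, assuming each bare repo under /var/git has a remote named origin and a default push refspec configured (so that a plain git push works, as in the other answers):

#!/bin/bash
cd /var/git || exit 1
for repo in */; do
    # Compare the heads on the remote with the heads in the local bare repo;
    # ls-remote separates fields with a tab, show-ref with a space.
    remote_heads=$(git -C "$repo" ls-remote --heads origin | tr '\t' ' ' | sort)
    local_heads=$(git -C "$repo" show-ref --heads | sort)
    if [ "$remote_heads" != "$local_heads" ]; then
        echo "pushing $repo"
        git -C "$repo" push
    fi
done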

VonC

I'd just use bash alongside git:

cd ~/git
for dir in *; do cd "$dir"; echo "$dir"; git push; cd -; done
werkritter
  • This belongs on the server so it would be /var/git, but that's beside the point. If I include any files in /var/git (such as the script itself), won't `in *` try to use that too? – Dessa Simpson Feb 07 '16 at 14:51
  • Well, it will. In that case you'd have to replace the asterisk with something more elaborate. If the only directories in `/var/git` will be repositories, you can use `find . -maxdepth 1 -type d`. Otherwise, you could use `find . -maxdepth 2 -name .git | cut -f1-2 -d/`. – werkritter Feb 07 '16 at 16:58
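
As a rough sketch, the second variant from that comment could drive the same kind of loop over a folder of non-bare checkouts (e.g. ~/git on the laptop, where each repository contains a .git directory; it would not match the bare repos in /var/git, which have no .git subdirectory):

cd ~/git
for dir in $(find . -maxdepth 2 -name .git | cut -f1-2 -d/); do
    # Run each push in a subshell so there is no need to cd back
    ( cd "$dir" && echo "$dir" && git push )
done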

I ended up writing a shell script, as suggested by werkritter's answer. This is my code:

#!/bin/bash
cd /var/git
for repo in $(find . -mindepth 1 -maxdepth 1 -type d); do
        cd /var/git/"$repo"
        echo "$repo"
        git fetch
        git push
done

The -mindepth 1 option is required in addition to -maxdepth 1 to prevent find from returning /var/git itself. -type d returns only directories, so that find does not return the script file itself.
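
To illustrate (a hypothetical listing, reusing the repo names from the question; find's output order may differ):

$ find /var/git -maxdepth 1 -type d
/var/git
/var/git/someproject
/var/git/anotherthing
$ find /var/git -mindepth 1 -maxdepth 1 -type d
/var/git/someproject
/var/git/anotherthing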

The reason there is no easy or efficient native way is that Git's purpose is to manage a single repo, not multiple repos. There is a native solution using submodules, explained in VonC's answer.

Dessa Simpson