
I work on several projects, with other people, that evolve according to requests from our clients. More often than not, I'm completely submerged in my current work and do not get updates about what is going on in the rest of the projects. So I do a "complete fetch" each morning so I don't get completely lost. I have worked out a shell script that does that for me: it changes to each project's directory, shows where it's at, fetches (no pull), and returns me to wherever I was (beware, it is all one line, just broken up here to help reading):

$ pushd . && cd /W/Git/project1 && pwd && git fetch 
&& cd /W/Git/project2 && pwd && git fetch 
&& cd /W/Git/project3 && pwd && git fetch && popd

I just read a bit about remote.<group>, which would be updated with git fetch:

git fetch can fetch from either a single named repository or URL, or from several repositories at once if <group> is given and there is a remotes.<group> entry in the configuration file.

So I think git fetch remote.<group> would fit the purpose of my monstrous command line very well, simplifying maintenance and memory. But I could not find a direct reference on how to define such a remote.<group>. Can anyone give me a command-line example?

manuelvigarcia
    "*I think git fetch remote. would very well fit the purpose of my monstrous command line*" I don't think so. You cannot update 3 different projects with one fetch anyway so no group will help you. – phd Dec 03 '19 at 14:17
  • @phd I keep reading and reading and trying different combinations... and I am coming to the same conclusion... I'd say, go ahead and post your comment as a solution. – manuelvigarcia Dec 04 '19 at 15:24

3 Answers


One git fetch command works on one git project. The documentation says that git fetch allows fetching from multiple remotes, but only within one project. By default it fetches the remotes sequentially (and from git 2.24 it can be done in parallel), but it still works on one git project.
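
Within a single project, though, one fetch can already cover every configured remote, and since git 2.24 those remotes can be fetched in parallel. A minimal sketch (the --jobs count is just an example, not something from the question):

git fetch --all           # fetch every remote configured in this repository
git fetch --all --jobs=4  # since git 2.24, fetch those remotes with up to 4 parallel jobs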

If you want to fetch from multiple projects, you really need to run separate git fetch commands. I suggest that instead of cd-ing into each git project you invoke git with the -C parameter; then you can run the command from any working directory.

So your example from the question could look like this:

$ git -C /W/Git/project1 fetch && git -C /W/Git/project2 fetch && git -C /W/Git/project3 fetch 

You can make it parallel, but how to do that depends on the shell you are using. I'm not that great in bash, but this is my try with bash's & to execute a command in the background and the wait built-in to wait for all of them to finish:

git -C /W/Git/project1 fetch -v & git -C /W/Git/project2 fetch -v & git -C /W/Git/project3 fetch -v & wait

Here three git fetch -v commands are executed in the background and you wait for all of them to finish. I added the -v (verbose) parameter to better see what is executed.

You can also try to do it in a loop:

for project in /W/Git/project1 /W/Git/project2 /W/Git/project3 ; do echo "fetching for $project"; git -C "$project" fetch & done ; wait
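
If the goal is to simplify the maintenance and memory of the morning script from the question, one option (just a sketch, reusing the example paths from the question) is to keep the project list in a variable, so adding a project means editing a single line:

#!/bin/bash
# Sketch of a "morning fetch" script; the paths are the example paths
# from the question -- replace them with your own projects.
projects="/W/Git/project1 /W/Git/project2 /W/Git/project3"

for project in $projects; do
    echo "fetching for $project"
    git -C "$project" fetch -v &    # run each fetch in the background
done
wait                                # wait for all background fetches to finish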

Bonus: How do I define a git 'remote.<group>' for fetching?

This answers the question in your title. You now know that it is not what you are looking for, because defining remote.<group> works only within one git project. But I actually found your question because I was searching for how to define this remote.<group> config and found nothing, so maybe someone will find it useful.

To define named remote groups, edit your git config (open .git/config or execute git config --edit) and add something like this:

[remotes]
    group1 = remote1 remote2 origin
    group2 = remote55 remote66

Remotes in a group are separated by whitespace. Now you can use these group names, like git fetch group1, or even fetch remotes in parallel from multiple groups with git fetch --multiple -j4 group1 group2.
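
If you'd rather not edit the config file by hand, the same entries can be created from the command line with git config (a sketch using the example group and remote names from above):

git config remotes.group1 "remote1 remote2 origin"   # writes the group1 line under [remotes]
git config remotes.group2 "remote55 remote66"
git fetch group1                                     # fetch every remote in group1
git fetch --multiple -j4 group1 group2               # fetch both groups with 4 parallel jobs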

Mariusz Pawelski

As of Git version 2.38.1, the git-for-each-repo command can be used to run a git command on multiple repositories that you have cloned locally. This isn't as flexible as the one-liner suggested elsewhere, but it allows you to set the list of repositories in your git configuration so that you don't need to enter the list into the terminal commands every time; if you have the commands in a shell script, the advantages of git-for-each-repo start to diminish. If configured in your global .gitconfig file (in your home directory), you don't even need your current working directory in Bash to be in a git repository when you invoke for-each-repo.

As of this writing, the documentation (https://git-scm.com/docs/git-for-each-repo) says that the git-for-each-repo command is "EXPERIMENTAL" and may change in the future.

I use this to keep track of some Perl modules I'm interested in: I cloned the repositories from GitHub to my computer, and then defined a multi-valued config variable with the paths of each repository:

git config --global repositories.perl.modules "C:/Software/Perl/Modules/p5-Git-Raw"
git config --global --add repositories.perl.modules "C:/Software/Perl/Modules/perl-Archive-Zip"

In my global .gitconfig file, that created the following:

[repositories "perl"]
    modules = C:/Software/Perl/Modules/p5-Git-Raw
    modules = C:/Software/Perl/Modules/perl-Archive-Zip

The actual name of your multi-valued variable can be whatever you want; the documentation suggests "maintenance.repo". Once configured, I can call git-for-each-repo to update these repositories:

git for-each-repo --config=repositories.perl.modules fetch --verbose --prune
git for-each-repo --config=repositories.perl.modules pull

So far I've only figured out how to run one git command at a time using this method; a second git command requires another invocation of git-for-each-repo.
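
One simple workaround (a sketch, assuming the same repositories.perl.modules config as above) is to chain the two invocations in the shell:

git for-each-repo --config=repositories.perl.modules fetch --prune && git for-each-repo --config=repositories.perl.modules pull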

Andrew M

I think git fetch remote.<group> would very well fit the purpose of my monstrous command line

I don't think so. You cannot update 3 different projects with one fetch anyway so no group will help you.

phd