
I have multiple git repos that I'd like to get info from without keeping a local copy of each one. Space is the main issue.

The repos are on a provider's server, or else I would just run the PHP script locally on the server. I'd like to be able to grab the commit history for each repo without having a clone of it and without having to run a pull or fetch each time.

Is this even possible?

Jake Sellers
  • Is this what you are looking for? http://stackoverflow.com/questions/13941976/git-log-command-to-check-for-commit-history-on-remote-server – user20232359723568423357842364 Aug 08 '13 at 16:50
  • @user20232359723568423357842364 that is not what the original poster is asking for; those answers all require a `fetch` of remote objects to be able to run `log`, and the OP explicitly states that he doesn't want to have to `clone` or `fetch`. –  Aug 08 '13 at 17:09
  • [`git ls-remote`](https://www.kernel.org/pub/software/scm/git/docs/git-ls-remote.html) can be used to show the references on the remote (without fetching them?), but I'm not sure if it's possible to view logs without the actual commit objects (see the sketch after these comments). –  Aug 08 '13 at 17:16
  • you only ever say "grab info", which is horribly unspecific. As Cupcake mentioned, you can get *some* info with `ls-remote`, but unless you specify what exactly you want to know about the repo, we can't give you a definite answer. – Nevik Rehnel Aug 08 '13 at 18:43
  • @NevikRehnel you quote "grab info" as if it is my post but it is not. I said "get info" and later clarify that I want the commit history. Please read the entire question. – Jake Sellers Aug 08 '13 at 19:20
  • welp, that's a legitimate reproach. i kinda garbled those two together :< however, in this case the answer is: no, unless you can run scripts on the remote server, you cannot get the commit history without cloning/fetching (this is a very central concept of Git as a DVCS) – Nevik Rehnel Aug 08 '13 at 19:50
  • That's what I was afraid of, guess I'll just have to suck it up and use up some more space. – Jake Sellers Aug 08 '13 at 20:16
  • @JakeSellers you have the option of using a local bare repo (without a checkout), and you can control the compression level of repos with [`core.compression`](https://www.kernel.org/pub/software/scm/git/docs/git-config.html) and other compression settings in the repo's `config` file. Also, does the remote repo provider not offer a web interface to view history, like GitHub does? –  Aug 09 '13 at 02:08
  • @JakeSellers, where are you storing your repos? Are you storing them on some commercial service (such as GitHub, BitBucket, Atlassian's stash, etc) or are you storing them on a private versioning server that you maintain? If you store it on a private server, what are you using as a hosting software? Do you have/can you acquire access to that server? – TopherGopher Aug 09 '13 at 22:17
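To illustrate the `git ls-remote` suggestion from the comments above: it only transfers the remote's ref names and the commit SHAs they point at, not any commit objects, so it can't reproduce a `git log`. A minimal sketch in PHP (since that's what the asker plans to run); the repository URL is a placeholder:

```php
<?php
// Minimal sketch: list the remote's refs (branches, tags) and the SHAs they
// point at, without fetching any objects. This does NOT give you the commit
// history, only the current tip of each ref.
$url = 'ssh://git@example.com/project-a.git'; // placeholder URL

echo shell_exec('git ls-remote ' . escapeshellarg($url));
```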

1 Answer


If space is really the issue (and not bandwidth), you can have a script which, for each repo, would:

  • clone it as a bare repo (as Cupcake suggests),
  • run a `git log` and store its output in a text file,
  • delete the bare repo completely.

You would need to clone those repos again each time to update the "git log" file, but again, if the bandwidth is adequate (and the repos aren't huge), that would minimize the disk space and leave you at the end with only a collection of "git log" files; a rough sketch of such a script follows.
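A rough PHP sketch of that clone/log/delete loop, assuming you'd drive it from PHP as the question suggests; the repository URLs and paths are placeholders, and error handling is left out:

```php
<?php
// Sketch only: for each repo, make a bare clone (no working tree), save its
// commit history to a text file, then delete the clone to reclaim the space.
$repos = [
    'project-a' => 'ssh://git@example.com/project-a.git', // placeholder URLs
    'project-b' => 'ssh://git@example.com/project-b.git',
];

foreach ($repos as $name => $url) {
    $dir = sys_get_temp_dir() . "/$name.git";

    // 1. Bare clone: only the object database and refs, no checkout.
    shell_exec('git clone --bare ' . escapeshellarg($url) . ' ' . escapeshellarg($dir));

    // 2. Capture the commit history of every branch into a text file.
    $log = shell_exec('git --git-dir=' . escapeshellarg($dir) . ' log --all --oneline --decorate');
    file_put_contents("$name-log.txt", $log);

    // 3. Remove the bare clone so only the log file keeps using disk space.
    shell_exec('rm -rf ' . escapeshellarg($dir));
}
```

Each run re-downloads the full history, so this trades bandwidth for disk space; keeping a bare clone around and fetching into it would be the opposite trade-off.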

VonC