
I'm a very new git user. I'm trying to clone a 72 GiB repository onto my local machine. My internet connection actually has much higher bandwidth, but git clone, weirdly, is only operating at 583.00 KiB/s! Is there a way to boost up this speed?

Also, which portion of the git clone function takes longer - "receiving objects" or "resolving deltas"?

  • According to [this answer](https://stackoverflow.com/a/4690690/243245), resolving deltas is done on your local disk, so it's only the 'receiving objects' that's actually the download. If you've got a slow download then resolving deltas should be relatively fast, although I can't predict how fast it would be on a 72GB repository - that's unusually large. I've found a 2GB repository painfully slow to work with in the past. – Rup Nov 27 '20 at 21:55
  • Is there no better way you can get a copy, e.g. go and physically get a copy on a memory stick rather than downloading it? – Rup Nov 27 '20 at 21:55
  • Thanks for your comments!! The repositories are SPICE files for spacecraft missions and are hence pretty huge! I can download them separately onto my hard disk, but I need to update them every month to keep track of the latest changes in satellite positions. That's why I wanted to clone the repository with git, so that I can use git pull later on for easy updates. – vidhyaganeshr Nov 27 '20 at 22:21
  • I don't know if using git to distribute these files is the right way to go. It would make sense to use git if you needed to _track_ them; otherwise you are not just getting the last state of the latest batch but all the history going back in time (assuming it's the same files changing over time). Say I'm a developer and I want to go back to using the files the way they were 2 years ago to test something - then it makes sense for me to get them through git. – eftshift0 Nov 27 '20 at 22:41
  • This makes perfect sense! I need to verify if it's the same files changing over time. If yes, it makes no sense to use git then. Thanks so much @eftshift0! – vidhyaganeshr Nov 27 '20 at 22:51

1 Answer


> git clone, weirdly, is only operating at 583.00 KiB/s! Is there a way to boost up this speed?

That depends on where you're cloning from, and the network path between them and you.

Note that some hosting companies deliberately throttle the rate at which anyone can read from or write to their sites, because they're talking to thousands (give or take some orders of magnitude) of other machines at the same time. Their bandwidth isn't free, so they limit how much of it they'll spend talking to you. Sometimes, if you fork over money to them, they'll raise the cap they apply to your connection.
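
If the bottleneck is partly the sheer volume of data rather than a server-side cap, a shallow or partial clone can reduce what has to be downloaded in the first place - assuming you can live without the full history locally. A rough sketch (the URL is a placeholder):

```sh
# Shallow clone: only the most recent commit of each branch, no history.
git clone --depth 1 https://example.com/spice-kernels.git

# Partial clone: full history, but file contents (blobs) are fetched
# on demand; the server must support partial clone for this to work.
git clone --filter=blob:none https://example.com/spice-kernels.git
```

Either variety can still be updated later with git fetch / git pull; the trade-off is that history-walking operations may need to fetch more data on demand.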

> Also, which portion of the git clone function takes longer - "receiving objects" or "resolving deltas"?

Yes. One or the other usually takes longer. :-)

Seriously, as noted in Rup's comment, resolving deltas happens locally. How long it takes depends primarily on how fast your computer is, how many delta chains there are, and how long those chains are. Only the first of those is under your control.

The nice thing is that once the clone finishes, you never¹ have to clone again. Just use the clone you have, and Git will add new commits to it without having to obtain the existing commits again.
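
For the monthly-update workflow mentioned in the comments, that incremental update looks roughly like this (the directory and branch names are just placeholders):

```sh
cd spice-kernels              # the existing clone

# Download only the objects added upstream since the last update...
git fetch origin

# ...and fast-forward the local branch to match; "git pull" does both steps.
git merge --ff-only origin/main
```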


¹Well, as long as all goes well, anyway. See also the --reference option of git clone.
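
If you ever do need another clone of the same repository on the same machine, --reference lets the new clone borrow objects from an existing local one instead of downloading them again. A minimal sketch, with placeholder paths:

```sh
# Borrow objects from an existing local clone; --dissociate then copies
# them over so the new clone no longer depends on the old one.
git clone --reference /data/spice-kernels --dissociate \
    https://example.com/spice-kernels.git spice-kernels-2
```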

torek