
When cloning a repository from GitHub I sometimes get a download rate of only 50–100 KiB/s (and it stays there), while most of the time I get about 10 MiB/s. When cloning the same repository from a different machine (i.e. a different global IP) I get full speed.

Does GitHub impose a rate limit on repository cloning? The repository in question is quite big (~100 MiB) and I clone it about twice a day.

ooxi
  • I'm curious why you would be 'cloning' a repo more than once on a single machine -- you can always check the GitHub status page to see if everything is operational: https://status.github.com/ – chrismillah Apr 14 '15 at 17:53
  • @cjm628 The checkout is done automatically by an integration server pulling all dependencies as submodules. – ooxi Apr 15 '15 at 05:32
  • You can clone the repo to another provider and do checkouts from there. And if you want to keep GitHub for some reason, you can just do a push from time to time. – Maciej Łoziński Oct 13 '17 at 17:56
  • you should use caching, and just git pull (a sketch follows these comments) – caub Oct 15 '17 at 18:38
  • @caub this does not work in an ephemeral environment, as is present on CI – ooxi Oct 15 '17 at 20:49
  • @ooxi travis can cache, for example (and I always cache node_modules, it makes builds much faster), I bet most can – caub Oct 16 '17 at 06:37
  • Just saying that I have the exact same problem. Even https from GitHub works fast. Also virtual guests work fine. But not the host OS, oh no. I constantly get 150 KiB/s. – Miha Markic Oct 09 '18 at 16:58
  • `Receiving objects: 15% (210/1368), 3.33 MiB | 5.00 KiB/s` Internet speed test: 50 MiB/s. GitHub is slow. – run_the_race Mar 13 '22 at 19:44
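
A sketch of the caching idea from the comments: keep a persistent local mirror and refresh it with a fetch on each build instead of re-cloning from GitHub (the repository URL and cache path below are placeholders, not from the question):

# one-time: create a bare mirror in a persistent cache location
git clone --mirror https://github.com/<user>/<repo>.git /var/cache/repo.git
# per build: refresh the mirror, then clone from it locally (no large GitHub transfer)
git -C /var/cache/repo.git fetch --all --prune
git clone /var/cache/repo.git workdir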

4 Answers


I found a solution that worked very well for me.

Go to GitHub and copy the repository link to the clipboard. Then open a web proxy website (https://www.proxysite.com worked for me) and paste the link (I tried with the US1 server). Instead of downloading 670 MB in over an hour, it took less than 2 minutes.

Works like a charm!!
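
A related option (not exactly what this answer describes, which uses a browser-based proxy page) is to route git itself through an HTTP proxy you have access to; the proxy address and repository URL below are placeholders:

# route a single clone through an HTTP proxy
git -c http.proxy=http://proxy.example.com:8080 clone https://github.com/<user>/<repo>.git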

Nils Tierecke

Do you have massive binaries committed in the repos? That might do it.

Otherwise, look at optimizing your CI's behavior. Instead of:

git submodule update [--recursive]

You want:

git submodule update [--recursive] --depth 1

CI doesn't need the whole repo history, just the target state. More details here: Git shallow submodules
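
For illustration, a minimal shallow checkout of both the main repository and its submodules in a CI script might look like this (repository URL and directory name are placeholders):

# shallow clone of the main repository
git clone --depth 1 https://github.com/<user>/<repo>.git
cd <repo>
# fetch only the pinned submodule commits, not their full histories
git submodule update --init --recursive --depth 1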

Joe Atzberger
  • The repository I was cloning does not contain large binaries, nor does it have submodules. But for CI a shallow clone is what you want; it reduced my download from 110 MiB to 7 MiB! – ooxi Jul 18 '15 at 15:51
  • Great suggestion, but it doesn't answer the question about why the download speed would vary between computers. (I've noticed that as of this week GitHub is slow for me at work, but fast from other IP addresses, which I can test remotely.) – geneorama Mar 17 '16 at 17:14

I was having the same issue both at the office and at home, on two different IPs. I just restarted my machine and the download speed is back to normal.

Sabir Ali

Try gitclone.com and use --depth 1 at the same time. See gitcache for the implementation.
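
If I read gitclone.com's usage correctly, the mirror is used by prefixing the GitHub path with the mirror host, combined with a shallow clone; the URL pattern below is an assumption, so check gitclone.com's own documentation before relying on it:

# assumed URL pattern for the gitclone.com mirror (verify before use)
git clone --depth 1 https://gitclone.com/github.com/<user>/<repo>.git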

simon gao
  • From Review: Hi, unfortunately your post does not seem to answer the question. Please describe why the proposed solution solves the issue, otherwise this post should be a comment and not an answer. As it is now, it looks just like a link to a third-party solution. Link-only answers should be avoided in S.O. Please check the S.O. rules: [How to Answer](https://stackoverflow.com/help/how-to-answer) – sɐunıɔןɐqɐp May 31 '20 at 09:21