
I'm running into network issues caused by a corporate proxy and some large files in my initial repo, so I'm unable to clone it:

error: RPC failed; curl 56 Failure when receiving data from the peer

All the solutions and workarounds I found are not working, but I had another thought:

I can download a zipped archive of my repo without issues. Is it possible to prepare a repo folder from the zipped contents and then tell Git to just link it to my remote repo?

Looking at How to clone git repository from its zip, their solution relies on `git clone --bare ...`, which still downloads the large contents.

Benny Bottema

2 Answers


Yes. Just do `git remote add` with a file path to the repo directory.
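If the unzipped archive actually contains the repository's `.git` data, this looks roughly like the following sketch (the `ziporigin` name and the paths are placeholders of mine, not from the answer):

```shell
# Start a fresh repository and point a remote at the unzipped
# repo directory on disk (it must contain the .git objects).
git init myrepo
cd myrepo
git remote add ziporigin /path/to/unzipped-repo
git fetch ziporigin   # reads objects from the local path, no network
```

Note this only avoids the network if the archive includes the object database; a plain source snapshot has no history for `git fetch` to read locally.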

wilx
  • A git fetch is still getting everything again from the remote. I'm doing `git init` in the unzipped folder, then `git remote add` followed by a `git fetch`. What am I missing? – Benny Bottema Dec 31 '19 at 09:47
  • Specify the remote you want to fetch from. By default it is the `origin` remote. – wilx Dec 31 '19 at 10:03
  • Yes, I did `git remote add origin ` so it fetches from origin, but it still fetches everything because it's not in my local repo and index, only in my workspace – Benny Bottema Dec 31 '19 at 10:13

One answer from How to complete a git clone for a big project on an unstable connection? solved it for me:

Use a shallow clone, i.e. `git clone --depth=1`, then deepen this clone using `git fetch --depth=N` with increasing N. You can use `git fetch --unshallow` (since 1.8.0.3) to download all remaining revisions.

I did:

git clone ... --depth=1
git fetch --depth=2
git fetch --unshallow

And now I have everything.
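On a connection that keeps dropping, the deepening can also be done in several small steps before the final `--unshallow`; the step sizes below are arbitrary, just an illustration of the "increasing N" idea:

```shell
# Deepen a shallow clone gradually so each transfer stays small;
# every fetch only downloads the commits not yet present locally.
for n in 10 50 250; do
    git fetch --depth=$n
done
git fetch --unshallow   # fetch whatever history is still missing
```

If any single step fails, rerunning it resumes from the depth already reached rather than starting over from scratch.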


As I can't delete my own question anymore since it has an answer, I voted to close it as a duplicate of the linked question.

Benny Bottema