
I've been having an issue where I'm unable to clone a git repo: the clone starts to run and then cancels halfway through. My current git repo size is 53.7 MB. The git version is 1.7.12.4 on both the server and my machine.

Error below:

MacBook-Pro:htdocs macbook$ git clone myrepo@mysite.com:~/opt/git/myrepo.git 
Cloning into 'myrepo'...
zcardepo@zcardepot.com's password:
remote: Counting objects: 8888, done.
remote: Compressing objects: 100% (7185/7185), done.
Write failed: Broken pipe267/8888), 1.03 MiB | 1001.00 KiB/s
fatal: The remote end hung up unexpectedly
fatal: early EOF
fatal: index-pack failed

I created a fresh new repo, and it clones just fine. Once I add my site files to it and push them to the remote, though, I can no longer clone from it. But I can still pull from it just fine.

I added this with no luck:

[core]
    compression = -1
[pack]
    windowMemory = 10m
    packSizeLimit = 20m

I tried raising both of these to much higher values, with no luck.

I also tried running `git gc --aggressive` and `git gc --prune` on the remote repo.
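
For reference, this is roughly the equivalent from the command line on the server (assuming the settings belong in the bare repo's config; the path is the one from the clone URL above):

cd ~/opt/git/myrepo.git
git config core.compression -1
git config pack.windowMemory 10m
git config pack.packSizeLimit 20m
git gc --aggressive
git gc --prune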

I've seen this post, but my repo is nowhere near as large (1 GB+). I've also seen people having issues with mismatched git versions, but that isn't the case here either.

– Josh
2 Answers


I think the problem you are having is that your clone breaks off in the middle every time.

So, instead of cloning the repo from scratch again and again, I would suggest doing a fetch into a freshly created repo.

Basically, initialize an empty repository:

mkdir repo_name && cd repo_name && git init

Add the original repo as a remote in this repo:

git remote add origin url/to/repo

And now do a `git fetch`.

This way, even if the transfer breaks in the middle, the next fetch only has to bring in the objects that haven't been fetched yet.
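
Putting those steps together in one sequence (the URL is the one from the question; the local directory name is just an example):

mkdir myrepo && cd myrepo
git init
git remote add origin myrepo@mysite.com:~/opt/git/myrepo.git
git fetch origin
git pull origin master

If the fetch gets interrupted, re-run `git fetch origin` before pulling; the final `git pull origin master` just checks the fetched files out into your working tree, as noted in the comments below.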


Alternatively, you can check the solutions [here](http://stackoverflow.com/a/22317479/1860929) and [here](http://stackoverflow.com/a/2505821/1860929).
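
For reference, the first of those links boils down to a shallow clone that you deepen afterwards (commands as quoted in the comments below; the URL is the one from the question):

git clone --depth 1 myrepo@mysite.com:~/opt/git/myrepo.git
cd myrepo
git fetch --depth=2147483647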

– Anshul Goyal
  • This method does work. I really want to figure out why this is happening, though; it would be easier to give someone a clone link to start working from. – Josh Jul 30 '14 at 15:09
  • @tdm Did you check the other solutions I mentioned [here](http://stackoverflow.com/a/22317479/1860929) and [here](http://stackoverflow.com/a/2505821/1860929)? The simplest explanation could be network congestion breaking off the clone operation in the middle, causing you to start over from scratch each time. – Anshul Goyal Jul 30 '14 at 17:27
  • Missed these. Answer to Link 1) `git clone --depth 1` and `git fetch --depth=2147483647` both work for the clone; when I then do a `git pull --all`, it says it's all up-to-date. I noticed only my original master branch is shown on my remote repo, not my develop branch that has nearly everything in it. It's like it can't find my develop branch, which seems to be the issue causing the network congestion. – Josh Jul 30 '14 at 19:52
  • Answer to Link 2) I went back to git 1.7.1 on the server and my machine, and the same issue persists. I also re-tried the steps from Link 1 that you supplied while running 1.7.1. – Josh Jul 30 '14 at 19:53
  • @tdm How are you checking the branches in the remote repo? Is it a github/bitbucket/gitlab repo? – Anshul Goyal Jul 30 '14 at 19:59
  • I do everything in the terminal. I needed to merge my branch `develop` into `master`. I don't know why my `develop` branch won't show in a fresh clone; I'll worry about that later. --- Now it's having the same issue as before: `git pull` in my new clone on `master` starts to pull and then stops with the same errors as before. The commit I am grabbing is from after I added my Drupal CMS files, around 45m total. – Josh Jul 30 '14 at 20:28
  • Is 45m the size of a single commit? That's pretty huge; IIRC, GitHub has a limit of a maximum 50 MB per commit. If you have a single commit that big, that should explain why the fetch/pull/clone gets interrupted every time. My suggestion would be to trim down that commit. As to how to actually bring down the size of an existing commit, I think you will have to ask another question. – Anshul Goyal Jul 30 '14 at 20:39
  • @tdm Also, now that I re-read your question, it's not about the size of the repo alone, it's also about the size of the individual commits. And [here](http://stackoverflow.com/q/22227851/1860929) is the issue I mentioned in an earlier comment. – Anshul Goyal Jul 30 '14 at 20:42
  • Yeah, I figured as much. I didn't know there was even a max size limit on a git commit. I'll look into dividing up my commit in this case. Thanks again for all the help. – Josh Jul 30 '14 at 20:43
  • After `git fetch`, using `git pull origin master` will get all the files. – X.C. Nov 25 '20 at 17:35

Also try increasing your `http.postBuffer`:

git config --global http.postBuffer 1048576000
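
To double-check that the setting took effect, this should print the value back:

git config --global --get http.postBuffer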