
I could really use some help here.

I just created a new bare repo to act as a production target for dev pushes. I also have the working web directory on the server set up as a git repo. The server is running git 1.7.4.1 on CentOS 5.5.

After creating the new repo in the web directory, I performed a git add . It tallied up some 2,300-odd files and over 230,000 insertions.

I committed the newly added file base and it went nice and clean. When I did a git push origin master, though, it kept giving me this (please note: I have 8 CPUs, hence the 8 threads; the docs say this is normal):

# git push --mirror
Counting objects: 2000, done.
Delta compression using up to 8 threads.
warning: suboptimal pack - out of memory
fatal: inflateInit: out of memory (no message)
error: failed to push some refs to '/home/ggadmin/gg-prod.git'

I have tried the following things to resolve this, but all yield the same results:

git repack -adf --window-memory=100m
(I tried running this with values up to 1024m. Same result.)

I even tried a force push, but got the same thing, only with a malloc error:

# git push -f origin master
Counting objects: 2000, done.
Delta compression using up to 8 threads.
warning: suboptimal pack - out of memory
fatal: Out of memory, malloc failed (tried to allocate 2340 bytes)
error: failed to push some refs to '/home/ggadmin/gg-prod.git'

I've been working on this for two days now and have tried just about everything I can find on Google and here on SO.

I have reached my wits' end trying to get this fixed. Please tell me that someone out there knows what can be done to make this work?

Andrew Marshall
Skittles
  • Just to be sure, this has nothing to do with the `postBuffer`? http://stackoverflow.com/questions/6842687/the-remote-end-hung-up-unexpectedly-while-git-cloning/6849424#6849424 – VonC Mar 05 '12 at 09:00
  • Please explain what you mean, VonC as that is a new term for me with respect to Git. – Skittles Mar 06 '12 at 02:20
  • I was wondering if `git config --global http.postBuffer 524288000` wouldn't be able to make your push work. – VonC Mar 06 '12 at 06:41
  • I can certainly try that. I'm currently at my office, so I'll have to wait until I get home to see if that works. Thanks, VonC! :) – Skittles Mar 06 '12 at 14:44

7 Answers

  1. Maybe git is a suboptimal tool for handling large numbers of big blobs.
  2. You can disable multi-threaded compression to save memory: git config pack.threads 1 (in addition to other memory-limiting options, like core.bigFileThreshold in newer Git).
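For example, run inside the repo that's failing to push (the config keys are real git options; the values here are illustrative guesses for a low-memory box, not tuned recommendations — the mktemp line just stands in for your own repo so the sketch is self-contained):

```shell
# Sketch: cap git's memory use during packing.
cd "$(mktemp -d)" && git init -q .   # stand-in for your web-directory repo

git config pack.threads 1            # single-threaded delta compression
git config pack.windowMemory 64m     # cap memory used for the delta window
git config pack.packSizeLimit 64m    # emit smaller split packs
```

With pack.threads at 1, the per-thread memory caps below it apply only once, which is the point on a RAM-starved host.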
Vi.
    Well Vi...git is running slower than molasses running down a pipe in an arctic summer, but it worked. Thank you! – Skittles Mar 06 '12 at 00:02
  • You can consider externalizing big things out of git repo (while still versioning them), or using some other approach for the task. Git is probably trying to find similar blocks in all your data. Try adjusting `core.bigfilethreshold` option (git >= v1.7.6) – Vi. Mar 06 '12 at 01:42
  • Thanks again, Vi! Unfortunately, I'm using v1.7.4.1. But I'll keep that at the top of my Git knowledge items. – Skittles Mar 06 '12 at 02:19

The following command fixed the issue for me:

git config --global pack.windowMemory 256m

This affects the effectiveness of delta compression, so you might want to try a bigger size first, something like 1g, depending on your hardware and bandwidth.

More details here: https://www.kernel.org/pub/software/scm/git/docs/git-pack-objects.html
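If windowMemory alone doesn't help, it can be combined with the related knobs documented on that page; the values below are only starting points, and the HOME redirection is just so this sketch doesn't touch your real ~/.gitconfig:

```shell
# Sketch: the windowMemory fix plus related pack limits.
export HOME="$(mktemp -d)"                  # sandbox ~/.gitconfig for the demo

git config --global pack.windowMemory 256m  # per-thread delta memory cap
git config --global pack.depth 20           # shallower delta chains
git config --global pack.window 10          # fewer delta candidates per object
```

After setting these, a `git repack -adf` (or simply the next push) will pack under the new limits.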

keremispirli
git config --global pack.threads 1
miguelbemartin
    This is the only thing that worked for me. Urgh, I hate still having to deal with old shared servers. –  May 14 '15 at 10:28

I had the same issue with a git clone. The repo was 25 GB. I used an alternative command; for me it required root access to the source:

rsync -avz -e ssh --progress user@computerName:repo/Directory destination/folder

After this I was able to commit and pull just like with any other repository.

Ashitakalax

None of these answers helped me. My problem was that my little server has 1 GB of RAM and no swap. I ran sudo service apache2 stop and sudo service mysql stop, and killed one unused process from htop (after all of that I had about 100 MB of RAM free), and then git push worked correctly.

Vitaly Zdanevich

In my case I had previously reduced my server's virtual memory to nothing in order to remove the paging file so that I could free up the partition and increase the size of my main partition. This had the effect of reducing my working memory, and the result was that git was unable to process large files. After increasing my virtual memory again all was sorted.

MrJedi2U

I realise this is a bit late in the game, but since some of the above helped me (thanks @Ashitakalax), here are my two cents. I hit the same problem as above (inflateInit: out of memory) when moving changes from a WordPress dev instance upstream to test: git aborts with an out-of-memory error, and this is regularly due to changes in the ../uploads/ directory holding image files. All of this is on a shared host with no access to the global git config, so we do:

0- in repo: git commit -m "some relevant details"

to record the changes

1- rsync -av --progress repo/wp-content/uploads/ test/wp-content/uploads

to move the bulk of the image fixes/changes

2- in test: git add -A

to track the new stuff on the test side of things

3- in test: git fetch origin

now fetch the rest from the repo

4- in test: git merge origin/master

and finally merge...

The rsync bit lightens git load and all's well.

cucu8