
I'm attempting to clone a large (1.4GB) Git repository to a 32-bit Debian VM with 384MB of RAM. I'm using Git 1.7.2.5 and cloning over the SSH protocol ('git clone user@host.com:/my/repo').

The clone fails with this message:

remote: Counting objects: 18797, done.
remote: warning: suboptimal pack - out of memory
remote: Compressing objects: 100% (10363/10363), done.
fatal: out of memory, malloc failed (tried to allocate 905574791 bytes)
fatal: index-pack failed

I've tried reducing the amount of memory Git uses to repack on the host repository end, and repacking:

git config pack.windowMemory 10m
git config pack.packSizeLimit 20m
git repack -a -d
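Spelled out end to end, that attempt looks like this (the /tmp path is a throwaway stand-in for the real repository, and pack.threads=1 is an extra limit beyond what I originally tried):

```shell
# Throwaway repository standing in for the one being cloned from.
REPO=/tmp/repack-demo
rm -rf "$REPO" && git init -q "$REPO" && cd "$REPO"
git -c user.email=you@example.com -c user.name=demo \
    commit -q --allow-empty -m 'demo commit'

# Persist the limits, then repack with the same limits passed
# explicitly; pack.threads=1 additionally forces single-threaded
# packing, trading speed for a smaller memory footprint.
git config pack.windowMemory 10m
git config pack.packSizeLimit 20m
git config pack.threads 1
git repack -a -d --window-memory=10m --max-pack-size=20m
```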

My questions are as follows:

  1. Is this a client-side (clone-side) problem or should it be resolved in the repo that I'm cloning from?
  2. In either case, is there anything I can do to make the clone succeed? A lot of the potential solutions online involve some/all of the following things, none of which are acceptable in this instance:

    • changing the contents of the repository substantively (i.e. deleting large files)
    • giving the VM which is doing the clone more RAM
    • giving the VM which is doing the clone a 64-bit virtual CPU
    • transferring out-of-band (e.g. using Rsync or SFTP to transfer the .git directory)

Thanks in advance.

grw
  • Have you checked you have enough disk space (on the cloning machine) to hold the entire packfile (900-something MB)? – Romain Sep 30 '11 at 08:52
  • Romain: Yes - I have at least 5.5GB available on the machine doing the cloning. – grw Sep 30 '11 at 09:18
  • Yup, this bit me once when I had been using large binary files in the repository. Have a look at git-bup for alternative approaches if that was the cause – sehe Sep 30 '11 at 12:04

4 Answers


git clone will not honor your pack.packSizeLimit setting; it transfers everything in a single pack regardless, unless that has changed since the last time I looked.
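One thing worth trying on the cloning side is capping how much packed data the receiving index-pack maps into memory at once. These are real Git configuration keys, but whether they are enough to get a 900MB pack through 384MB of RAM is speculation on my part:

```shell
# On the VM you would use `git config --global`; a demo file is used
# here so the example doesn't touch a real configuration.
CFG=/tmp/demo.gitconfig
rm -f "$CFG"

git config --file "$CFG" core.packedGitLimit 32m       # cap total mapped pack data
git config --file "$CFG" core.packedGitWindowSize 32m  # smaller mmap windows
git config --file "$CFG" pack.threads 1                # single-threaded delta resolution
git config --file "$CFG" pack.deltaCacheSize 16m       # smaller delta cache

git config --file "$CFG" --get core.packedGitLimit     # prints: 32m
```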

Using SCP or Rsync might indeed be a way to work around your issue. Removing the "useless" large files and then repacking the repository you're trying to clone could also help.

Giving the VM more RAM might also help; I don't think you need a 64-bit address space to allocate 900MB. You could also give it enough swap space to handle the 900MB pack instead of increasing the RAM.
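For the swap route, a sketch of setting up a swap file (the path and size here are scaled down for illustration; on the real VM you would use something like /swapfile and ~1 GiB, and the activation steps need root):

```shell
# Reserve a zero-filled file; on the VM, bump count to 1024 for 1 GiB.
SWAPFILE=/tmp/swapfile.demo
dd if=/dev/zero of="$SWAPFILE" bs=1M count=64 status=none
chmod 600 "$SWAPFILE"   # swap files must not be world-readable
echo "created $SWAPFILE ($(stat -c %s "$SWAPFILE") bytes)"

# Then, as root, format and activate it:
#   mkswap /swapfile
#   swapon /swapfile
#   swapon --summary    # confirm the new swap is live
```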

Romain
  • Hi Romain, thanks for your response. I ended up giving the VM 768MB of RAM, which got around the issue for now. Git does seem to eat memory for breakfast, though; according to Charon on the #git IRC channel, there may be a solution in upcoming releases of Git, which enable 'streaming decompression' rather than in-RAM decompression. – grw Sep 30 '11 at 11:24
  • @grw: it eats memory only for certain bandwidth-saving operations... That's how Git becomes fast – sehe Sep 30 '11 at 12:05
  • I concur with what @sehe says... Git is memory-hungry only where that was the only way to easily achieve the best performance. Now that it's more mature, people are working on making it behave better in more constrained environments... – Romain Sep 30 '11 at 12:35
  • @Romain, I do hope so :D Thanks for your advice, folks. – grw Sep 30 '11 at 12:49
  • Tried adding swap, but no luck. Still can't clone. =/ – Translunar Dec 14 '15 at 18:07

I had a similar issue on Windows using 32-bit MSysGit; the 64-bit Git from Cygwin did the job. Maybe you should use a 64-bit Debian VM instead of a 32-bit one.
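If it's unclear what's actually installed, the bitness of both the userland and the git binary can be checked from a Unix-like shell (the `file` output shown is illustrative):

```shell
# Userland bitness: prints 32 or 64.
getconf LONG_BIT

# Inspect the git binary itself, when the tools are present:
if command -v file >/dev/null && command -v git >/dev/null; then
    file "$(command -v git)"    # e.g. "ELF 32-bit LSB executable" on a 32-bit VM
fi
```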

My original answer is available on question Git on Windows, “Out of memory - malloc failed”.

oHo
  • Sorry, but that didn't work for me either. Tried to allocate 212211077 bytes and failed – Robbie Smith Sep 29 '15 at 16:09
  • What is your Operating System? (Do you use Windows XP?) Is your Operating System 32-bit or 64-bit? How have you installed Git? (download or self-compiled?) Is your Git 32-bit or 64-bit? Cheers ;-) – oHo Sep 30 '15 at 09:17

Today I had the same issue: the Git server ran out of memory, but GitLab reported that there was still memory available. We checked memory with htop (it reported none available), restarted GitLab, and everything went back to normal.
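In that situation it may be worth checking memory from the shell rather than trusting the web UI; on Linux:

```shell
# The kernel's own accounting; MemAvailable estimates how much new
# allocations can realistically get (kernels >= 3.14).
grep -E 'MemTotal|MemAvailable' /proc/meminfo

# The same via procps, in megabytes, when `free` is installed:
if command -v free >/dev/null; then
    free -m
fi
```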

MateuszL
sudo git pull

I faced the same error message every time I pulled, and sudo git pull actually helped me get past it; the pull was successful.

hg8
  • Quick explanation for my downvote: this answer instructs readers to go against best practices and perform an operation that is dangerous by nature. Seriously, people, don't do this. – Romain Jan 15 '16 at 14:05