I have migrated an old CVS repository (ca. 13 GB) to Git (ca. 2.7 GB). But I cannot clone the repo from my 32-bit workstation: I get an out-of-memory error (malloc failed to allocate ca. 6 MB). Is there a way to fix this?
remote: Counting objects: 1227276, done.
remote: Compressing objects: 100% (217540/217540), done.
remote: Total 1227276 (delta 787852), reused 1227276 (delta 787852)
Receiving objects: 100% (1227276/1227276), 2,66GiB | 791.00KiB/s, done.
Resolving deltas: 100% (787852/787852), done.
fatal: Out of memory, malloc failed (tried to allocate 6838754 bytes)
fatal: remote did not send all necessary objects
Server: RHEL 6.3, 64-bit. Workstation: Windows XP, 32-bit, 2 GB RAM. Git: 1.8.3.4 on both.
Update 1: I have now repacked the 2.8 GB pack file on the server into 500 MB pack files. But this has no effect on the clone on the client side: it still creates a single 2.8 GB pack file. How can I tell the git clone process to use smaller pack files, or to create the pack files as they are laid out on the server? An interesting aspect for me: the size of the received objects decreased to 1.5 GB.
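For reference, the server-side repack step can be sketched as a self-contained demo (the temp repo below is hypothetical; the relevant option is `--max-pack-size`). Note that `git clone` builds a fresh pack stream for the client on the fly, which is why the smaller packs on the server do not carry over to the clone.

```shell
set -e
# Hypothetical throwaway repo, just to demonstrate the repack invocation.
tmp=$(mktemp -d)
git init -q "$tmp/repo"
cd "$tmp/repo"
git config user.email demo@example.com
git config user.name demo
echo hello > file.txt
git add file.txt
git commit -q -m "initial commit"
# Repack all objects, capping each resulting pack file at 500 MB
# (on the real 2.8 GB repo this produces several pack files).
git repack -a -d --max-pack-size=500m
ls "$tmp/repo/.git/objects/pack/"
```

Setting `pack.packSizeLimit` in the server repo's config has the same effect for future automatic repacks.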
Update 2: After some research and analysis, I now think the main problem is that the git clone process cannot handle the one big pack file on my 32-bit workstation. But how can I configure the cloning process to produce several smaller pack files? On the server side the repack command works very well, but it does not affect the client side.
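Apart from splitting the pack, Git's memory use on the client can be capped through configuration. The keys below are real git config options; the specific values are only guesses for a 2 GB 32-bit machine, not tested recommendations.

```shell
# Run on the workstation before cloning.
# Limit how much pack data git maps into memory at once:
git config --global core.packedGitLimit 256m
git config --global core.packedGitWindowSize 32m
# Limit memory used per thread during delta resolution/packing,
# and use a single thread so the limits are not multiplied:
git config --global pack.windowMemory 256m
git config --global pack.threads 1
```

These settings trade speed for a bounded memory footprint, which is usually the right trade-off on a 32-bit process with a ~2 GB address space.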
Update 3: I now have a 1.1 GB bare Git repo (after running git gc --aggressive --prune=now). But the out-of-memory error was still there. So I tried to split the repo, using git rm followed by git commit -a. As a result, each of the two new bare repos is as big as the old central one. But the clone from the workstation now works. Memory consumption stays constant below 300 MB; before, it grew without bound.
My question is now: why does the cloning process finish without problems?