
I'm trying to clone a git repository onto my local machine, but it fails with "malloc failed" because the repository is too big.

Does anyone know the exact rsync command I could use to copy the directory over? Or is there anything else I can do to clone a big repository?

I tried rsync, but it gave me the following error:

Invalid command: 'rsync --server --sender -v . username/dir.git'
  You appear to be using ssh to clone a git:// URL.
  Make sure your core.gitProxy config option and the
  GIT_PROXY_COMMAND environment variable are NOT set.
rsync: connection unexpectedly closed (0 bytes received so far) [receiver]
rsync error: error in rsync protocol data stream (code 12) at io.c(463) [receiver=2.6.8]

I get the following error when trying to use git clone:

remote: Counting objects: 52708, done.
remote: Compressing objects: 100% (52188/52188), done.
fatal: Out of memory, malloc failed (tried to allocate 1471836719 bytes)
fatal: index-pack failed
Chris Hansen
  • Can you try your clone command again after a `git config --global http.postBuffer 524288000`, as explained in http://stackoverflow.com/questions/6842687/the-remote-end-hung-up-unexpectedly-while-git-cloning/6849424#6849424? – VonC Aug 23 '11 at 21:03
  • @VonC - shouldn't you be asking which protocol is being used in the first place? That setting wouldn't matter if ssh / git is being used? – manojlds Aug 23 '11 at 21:09
  • Do you really use `rsync` to clone it? :? – KingCrunch Aug 23 '11 at 21:41
  • Nope, `git config --global http.postBuffer 524288000` did not work. :( I have edited my question above to clarify the errors I get. Thanks! – Chris Hansen Aug 23 '11 at 22:28
  • Just out of curiosity, how big is the repo? – Nic Aug 24 '11 at 03:25
  • @manojlds: true, I assumed an http-based clone. And anyway, the malloc is about triple the `postBuffer` size I was suggesting! – VonC Aug 24 '11 at 04:05

3 Answers


Just use a newer version of git; newer versions can handle this. Or, if you are already on a new version, set `git config core.bigFileThreshold` to a smaller size.
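A minimal sketch of the second option, assuming a recent git (the `50m` value is just illustrative; the default is `512m`):

```shell
# Lower the threshold above which git streams large blobs to disk
# instead of holding them fully in memory. 50m here is an arbitrary
# example value; tune it to your machine.
git config core.bigFileThreshold 50m

# Confirm the setting took effect
git config core.bigFileThreshold
```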

J-16 SDiZ

Try using `ulimit` to allow the git process to use more memory.

  • ulimit -m XXX
  • ulimit -v YYY
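For example (the 4 GiB figure and the repository URL are placeholders, not from the question; adjust both for your setup):

```shell
# ulimit changes apply to the current shell and its children, so run
# the clone from the same shell. Values are in kilobytes.
ulimit -v 4194304   # cap virtual memory at ~4 GiB
ulimit -m 4194304   # cap resident set size (ignored by some kernels)
git clone git://example.com/big/repo.git
```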
Tom

If you have one really large file, I don't think you have any option but to try to remove it from your repo, assuming:

  • you have access to the remote server
  • nobody else has managed to clone it (since removing the file rewrites the history, which would make any future pull problematic for existing clones).

See the section "Removing a File from Every Commit" from the Pro Git book.

git filter-branch --tree-filter 'rm -f bigFile' -- --all

See also the section "Checklist for Shrinking a Repository" in the git filter-branch man page.
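After the rewrite, that checklist suggests dropping the backup refs and reflog entries that still point at the old history, then repacking, so the removed blob is actually reclaimed. A sketch, to be run inside the rewritten repository:

```shell
# filter-branch keeps backups under refs/original/ and reflog entries
# that still reference the old commits; delete them, then repack so
# the big file's objects can be pruned.
git for-each-ref --format='%(refname)' refs/original/ |
  xargs -r -n 1 git update-ref -d
git reflog expire --expire=now --all
git gc --prune=now
```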

VonC
  • I don't think you can do that with a remote repository. (Naissa's trying to create a local clone.) And what makes you think no-one was able to clone the repository? – Christopher Creutzig Aug 24 '11 at 13:04
  • 1
    @Christopher: what makes me think that no-one was able to clone that repo was the early hour of the morning when I initially answered this question ;) – VonC Aug 24 '11 at 13:19