
[This question is essentially reopening "git crash during rebase", which never had an answer.]

I'm attempting a rebase from my 'secc' branch as follows:

$ git rebase main
First, rewinding head to replay your work on top of it...
fatal: Out of memory, malloc failed (tried to allocate 553656577 bytes)         # about 0.5 GB
$ git rebase --abort
No rebase in progress?

The failure is related to the fact that both branches and their common ancestor have three .dat files each of which is 0.5 GB.

How can I do a rebase in this situation?

Additional info:

  • A 'git merge main' works just fine.
  • Augmenting .gitattributes with '*.dat merge=keepTheirs' did not prevent the fatal error (a sketch of the merge-driver definition that attribute assumes follows this list).
  • The *.dat files do differ.
  • I'm willing to remove the *.dat files to rebase everything else and then add the *.dat files back. But how?
  • I'm using git 1.7.9.4.
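
For reference, a '*.dat merge=keepTheirs' attribute only takes effect if a matching merge driver is defined in git config. Below is a minimal sketch of such a definition, assuming a simple copy-based driver (the command shown is illustrative, not taken from my actual setup; %A is the current side's file and %B is the other side's):

# in .git/config (or ~/.gitconfig)
[merge "keepTheirs"]
    name = always take the other side's version of the file
    driver = cp -f %B %A
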
GoZoner
  • Can you create a patch for the current branch, re-create the branch from where you are trying to rebase from, and apply the patch? – vcsjones Apr 15 '12 at 23:36
  • are you version controlling a video file or something? – KurzedMetal Apr 15 '12 at 23:41
  • They are flash memory images used to establish a test environment. But no matter, they are under source control now and need to be rebased with everything else under source control. – GoZoner Apr 15 '12 at 23:47
  • AFAIK (I've never needed to add binaries to a repo), `git rebase` calculates the diffs for the changes on one branch and reapplies them on top of another commit (another branch's head). It seems git bails out when diffing such big binaries. If I were you, I'd move the binaries to a `git submodule` and keep a simple history without rebasing. Take a look at http://stackoverflow.com/questions/540535/managing-large-binary-files-with-git – KurzedMetal Apr 16 '12 at 04:40
  • Another related question that might be of interest to you is [Repack of Git repository fails](http://stackoverflow.com/questions/4826639). Big binary files in Git are a pain and unfortunately many people discover this too late, when there’s no easy way out. Also, why is Git not able to get the 0.5G of memory? Should be more or less peanuts on a modern machine. – zoul Apr 16 '12 at 15:01
  • @KurzedMetal I've inherited this circumstance. The actual development is done with ~200 SVN modules in one 'src' directory. I've overlaid one GIT repository on the entire 'src' tree and thus can operate on all 200 modules at once. Big win. There are a couple hundred .dat files three or four of which are 0.5GB. – GoZoner Apr 16 '12 at 15:01
  • In practice, I've solved the problem by going to a larger (32GB) machine. – GoZoner Apr 16 '12 at 16:55
  • You should have a look into git annex. Git annex is an extension to git for dealing with big files. It stores only the control data within the git repository and keeps the big files themselves outside. – bjhend Apr 17 '12 at 21:34
  • I was just looking over some of what's gone into git since 1.7.10, and there are some changes to reduce the memory usage for very large files. So maybe you'd have some luck with the very latest version. – torek Apr 18 '12 at 04:47
  • "In practice, I've solved the problem by going to a larger (32GB) machine." - so the actual problem was lack of memory on the computer, right? – eis Apr 26 '12 at 13:56
  • Is .gitattributes tracked? If so, then by the time rebase is looking at things it's seeing the uncorrected version – jthill Apr 26 '12 at 19:51

3 Answers


You won't know whether your machine is big enough until 'git rebase' fails, and by that point your working directory is in a munged state. During the rebase the other branch (main) was checked out so that the 'secc' changes could be applied on top of it. Recover, and proceed with:

git checkout secc
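
If you want to be sure no stale rebase state was left behind (the 'No rebase in progress?' message suggests there isn't any), you can check for the directories git uses to track an in-progress rebase; this is general git behaviour rather than anything specific to this failure:

$ ls -d .git/rebase-apply .git/rebase-merge 2>/dev/null   # no output means no leftover rebase state
$ git status                                              # confirm the working tree looks sane again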

Having failed on the rebase, as you've noted, you've got two options:

  1. If rebase is not required, go with 'git merge main'
  2. Get a bigger machine and retry 'git rebase main'

You don't have a practical option to ignore the 0.5 GB files, do the rebase, and then get those large files back.

GoZoner

In the comments you've said that doing the same on a computer with more memory (32 GB in this case) resolved the issue. Based on that, I would conclude that you simply had too little memory available to do this on the machine you first tried it with.

eis
  • When a program crashes and leaves your working directory in a trashed state (but recoverable), after which you can say 'Oh, I need a bigger machine' - well, that is not a solution. – GoZoner Apr 26 '12 at 21:42
  • The problem seems to be a) too little memory available and b) git handling it improperly. a) we can do something about, b) is probably something that should be reported to Git developers – eis Apr 26 '12 at 21:55
  • ... unless, of course, this is already fixed in the latest version of Git. However, even if it would fail gracefully, you'd still have too little memory to do this. – eis Apr 26 '12 at 22:02
  • Too little memory to do what exactly - isn't a rebase just grafting one revision on to another? I guess it'd need to compute a new hash but that's about it - git doesn't load the whole WC into memory to compute the hash does it? How much memory does git actually need as a multiple of the WC size - is it really him having too little memory or git needing an unreasonable amount to do something simple? – Rup Apr 27 '12 at 08:10
  • What's the difference from user perspective? It is needing more memory than what is available on the computer. – eis Apr 27 '12 at 09:39

Try putting these in .git/info/attributes:

# whatever gets them...
yourfile binary -delta merge=binary
*.yourext binary -delta merge=binary

That'll cause merge conflicts if those files change, and you'll have to figure out what to do with them. Check the gitattributes and rebase docs for other merge strategies; I'm not even going to name the dangerous ones here.

I'm not sure this will stop the actual merge (the part that tries to get the whole thing in core) from running, but it seems worth a try.
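
One way to confirm the attributes are actually being picked up (general git, not specific to this setup; 'big.dat' is just a stand-in file name) is git check-attr, which should report something like:

$ git check-attr merge delta -- big.dat
big.dat: merge: binary
big.dat: delta: unset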

jthill