
I'm getting this error while doing a git svn rebase in Cygwin:

Out of memory during "large" request for 268439552 bytes, total sbrk() is 140652544 bytes at /usr/lib/perl5/site_perl/Git.pm line 898, <GEN1> line 3.

268439552 is 256 MB. Cygwin's maximum memory size is set to 1024 MB, so I'm guessing that it has a different maximum memory size for Perl?

How can I increase the maximum memory size that perl programs can use?

update: This is where the error occurs (in Git.pm):

 while (1) {
    my $bytesLeft = $size - $bytesRead;
    last unless $bytesLeft;

    my $bytesToRead = $bytesLeft < 1024 ? $bytesLeft : 1024;
    my $read = read($in, $blob, $bytesToRead, $bytesRead); # line 898
    unless (defined($read)) {
       $self->_close_cat_blob();
       throw Error::Simple("in pipe went bad");
    }

    $bytesRead += $read;
 }

I've added a print before line 898 to print out $bytesToRead and $bytesRead; the result was 1024 for $bytesToRead and 134220800 for $bytesRead, so it's reading 1024 bytes at a time and has already read 128 MB. Perl's read function must be running out of memory and trying to request double its current buffer size... is there a way to specify how much memory to request, or is that implementation-dependent?

UPDATE2: While testing memory allocation in Cygwin, this C program's output was 1536 MB:

#include <stdio.h>
#include <stdlib.h>

int main() {
   unsigned int bit=0x40000000, sum=0;
   char *x;

   while (bit > 4096) {
      x = malloc(bit);
      if (x)
         sum += bit;
      bit >>= 1;
   }
   printf("%08x bytes (%.1fMb)\n", sum, sum/1024.0/1024.0);
   return 0;
}

Meanwhile, this Perl program crashed if the file size was greater than 384 MB (but succeeded if it was less):

open(F, "<400") or die("can't read\n");
$size = -s "400";

$read = read(F, $s, $size);

The error is similar:

Out of memory during "large" request for 536875008 bytes, total sbrk() is 217088 bytes at mem.pl line 6.
Matthew Farwell
Charles Ma
  • Are you sure that Cygwin configuration is the issue here? Msys git comes with its own msys perl (typically `C:\Program Files\Git\bin\perl.exe`). I'm not sure what happens under Cygwin, but under win32 console use, msysgit uses its perl instead of the other perls on my system. – daotoad Dec 17 '09 at 01:26
  • Ah yes you're right, but my perl memory test uses cygwin's version of perl and it has this problem as well – Charles Ma Dec 17 '09 at 01:41

4 Answers


This is a problem that has been solved in the latest version of msysgit by Gregor Uhlenheuer. There is a patch available. The problem is that in Git.pm, the file is read in one go. The solution is to read it in small chunks. I'm not sure if the fix has made it into any released versions, but the fix is easy to apply locally.

You need to change C:\Program Files\Git\lib\perl5\site_perl\Git.pm (about 8 lines change). Make sure you back it up first.

For the details of what to do, see Git.pm: Use stream-like writing in cat_blob().

The original discussion is Problems with larger files "Out of memory".

Matthew Farwell
  • This error appears to still exist for `/Git/SVN.pm` in msysgit version 1.8.3, I just got this error while doing an SVN fetch into a Git repo: "Out of memory during 'large' request for 69632 bytes, total `sbrk()` is 219133952 bytes at `/usr/lib/perl5/site_perl/Git/SVN.pm` line 1292." –  Jul 01 '13 at 14:19
  • I am still getting this error while trying to clone an SVN repo using git-svn. There seems to be some memory limitation in Perl. Whenever the perl.exe process gets to ~256mb memory usage, the fetch dies with Out of memory during request for X bytes, total sbrk() is 253132800 bytes! (git version 1.9.0.msysgit.0) – Jimmy Bosse May 29 '14 at 13:55

Have you tried increasing Cygwin's overall usable memory?

That message shows Perl had already allocated about 134 MiB (total sbrk()) and then tried to request a further 256 MiB, which failed.

From http://www.perlmonks.org/?node_id=541750

By default no Cygwin program can allocate more than 384 MB of memory 
(program+data). You should not need to change this default in most 
circumstances. However, if you need to use more real or virtual 
memory in your machine you may add an entry in either the 
HKEY_LOCAL_MACHINE (to change the limit for all users) or
HKEY_CURRENT_USER (for just the current user) section of the registry.

Add the DWORD value heap_chunk_in_mb and set it to the desired 
memory limit in decimal MB. It is preferred to do this in Cygwin 
using the regtool program included in the Cygwin package. (For 
more information about regtool or the other Cygwin utilities, 
see the Section called Cygwin Utilities in Chapter 3 or use 
the --help option of each util.) You should always be 
careful when using regtool since damaging your system registry
can result in an unusable system. 
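Following those instructions, the registry change can be made from a Cygwin shell with regtool; this is a sketch of the commands (the 1024 value is just an example limit in decimal MB, and you should back up your registry first):

```shell
# Raise the Cygwin heap limit to 1024 MB for all users (needs admin rights)
regtool -i set /HKLM/Software/Cygwin/heap_chunk_in_mb 1024

# Or for the current user only
regtool -i set /HKCU/Software/Cygwin/heap_chunk_in_mb 1024

# Verify the value that was written
regtool -v list /HKLM/Software/Cygwin
```

A reboot (or at least restarting all Cygwin processes) is a sensible precaution after the change.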
Vinko Vrsalovic
  • I've followed these instructions to change the memory size http://www.cygwin.com/cygwin-ug-net/setup-maxmem.html using the sample program to test for memory allocation shows 1536MB so it should be enough memory. That makes me think that the issue is with perl :S – Charles Ma Dec 17 '09 at 00:38
  • The error shows it's failing to allocate more than 384MiB, so it seems your change has not been correctly done. Have you a) verified you can actually allocate the 1.5GiB with the sample program in that manual page?, b) verified that Perl still fails when allocating 384MiB (and not when allocating more than 1.5GiB)? and c) Restarted the machine after the change (even if the instructions do not require it)? – Vinko Vrsalovic Dec 17 '09 at 00:44
  • Interesting...I was able to malloc 1.5GB of memory in C, but not malloc over 384MB of memory in perl – Charles Ma Dec 17 '09 at 01:05
  • Hmmm, maybe your test is trying to allocate contiguous memory (which is scarcer)? Anyways, there are some Google hits for Perl 384 MB and some of them refer to this problem, Perl can only allocate this amount of memory. Sadly, I haven't found a solution. – Vinko Vrsalovic Dec 17 '09 at 18:48
  • Thanks for your help anyway, I'm still looking for a solution for this issue, but for now I'm temporarily switching back to svn on windows – Charles Ma Dec 18 '09 at 21:56
  • There is now a patch available for this problem. See my answer below. – Matthew Farwell Sep 28 '11 at 12:36

This is not a Perl-specific issue, but rather one related to Cygwin. You can raise the memory allocation limit with ulimit.

What version of git are you using? If you're not on the latest version, this might be an inefficiency that has been fixed in a newer release (e.g. looping through a very large file with foreach rather than while, as a quick Google search suggests).

Ether
  • git --version gives me 1.6.5.1.1367.gcd48 and I'm using the latest version of msysgit http://code.google.com/p/msysgit/ ulimit output is already 'unlimited' :S – Charles Ma Dec 17 '09 at 00:19
  • Yeah, I'm using msysgit too (version 1.8.3), not Cygwin, and I get a similar error, but in `/usr/lib/perl5/site_perl/Git/SVN.pm` during a `git svn fetch`. –  Jul 01 '13 at 14:30

Maximizing Cygwin's memory does not actually solve the problem.

Currently there are three problems with Git on Windows:

  1. Packs larger than 2 GB are poorly supported by both msysgit and Cygwin git
  2. Cygwin's default memory limit is too small
  3. 32-bit Git is problematic

What I did, step by step:

I moved my git repo to a Unix machine and set the following config:

[pack]
        threads = 2
        packSizeLimit = 2G
        windowMemory = 512M

After that I ran git gc and all packs were rebuilt into 2 GB ones.

I double-checked that msysgit was not installed on the Windows machine; otherwise its perl might be used instead.

Then I moved the repo back to the Windows machine and raised the Cygwin memory limit:

regtool -i set /HKLM/Software/Cygwin/heap_chunk_in_mb 1536

It was important to set the Cygwin limit higher than pack.windowMemory × pack.threads, but not higher than 1.5 GB.

So the first two problems are now solved, but the third is not.

Unfortunately it still does not work on Windows: some repacks crash with out-of-memory errors, even with threads = 1, pack.windowMemory = 16M, and max depth and delta set to 250.

Timothy Basanov