
I understand that my question is too vague, but here is the problem. I parse a very large file, creating a linked list of my own objects (by the way, they are not related to bitmaps). At some point I get the infamous `OutOfMemoryError`. Right before that, logcat shows my own message:

    available 478.0M, 47.5%, low=false

which I produce with:
        ActivityManager.MemoryInfo mi = new ActivityManager.MemoryInfo();
        ActivityManager activityManager = (ActivityManager) getSystemService(ACTIVITY_SERVICE);
        activityManager.getMemoryInfo(mi);
        double availableMegs = mi.availMem / (double) 0x100000L; // bytes -> MiB
        double percentAvail = mi.availMem / (double) mi.totalMem * 100.0;
        Log.d(TAG, String.format("available %.1fM, %.1f%%, low=%b",
                availableMegs, percentAvail, mi.lowMemory)); // TAG defined elsewhere

Note the large amount of free memory.
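(As a comment below points out, `availMem` is system-wide and says nothing about this process's Java heap. A minimal sketch of probing the process's own heap via `Runtime` instead — the `MIN_HEADROOM_BYTES` threshold is an illustrative safety margin, not an Android constant:)

```java
// Sketch: estimate remaining Java-heap headroom with Runtime.
// Unlike ActivityManager.MemoryInfo.availMem (system-wide), these
// figures describe this process's own heap.
public final class HeapHeadroom {
    // Assumed safety margin; tune for your app.
    static final long MIN_HEADROOM_BYTES = 16L * 1024 * 1024;

    /** Bytes the heap can still grow by before hitting the hard limit. */
    static long headroomBytes() {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory(); // currently allocated
        return rt.maxMemory() - used;                   // room left to grow
    }

    /** A conservative "stop taking on more work" signal. */
    static boolean shouldStopParsing() {
        return headroomBytes() < MIN_HEADROOM_BYTES;
    }

    public static void main(String[] args) {
        System.out.println("headroom=" + headroomBytes()
                + "B, stop=" + shouldStopParsing());
    }
}
```

Note this is a racy measurement, it ignores objects the next GC would reclaim, and (as the error below shows) it says nothing about fragmentation.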

After that I see a bunch of GC messages, the last of which are:

I/art: Forcing collection of SoftReferences for 184B allocation
I/art: Clamp target GC heap from 256MB to 256MB
I/art: Alloc concurrent mark sweep GC freed 3(96B) AllocSpace objects, 0(0B) LOS objects, 1% free, 252MB/256MB, paused 539us total 41.854ms
I/System.out: java.lang.OutOfMemoryError: Failed to allocate a 184 byte allocation with 3403124 free bytes and 3MB until OOM; failed due to fragmentation (required contiguous free 16384 bytes for a new buffer where largest contiguous free 4096 bytes)

Obviously I catch this error, even though I was never able to hit a breakpoint on it. The discrepancy between the GC and ActivityManager figures is also striking. The problem is that afterwards I cannot continue, because the GUI cannot allocate memory for its data. In other words, catching this error is too late. So I need recommendations on how to determine when to stop, so that the program can still continue. Please note that I do not need recommendations on how to optimize my data: however good the optimization is, I can always crash the program with a larger file.

  • *"I do not need recommendations how to optimize my data, because however good it will be, I can always crash the program with a larger file"* Then perhaps you should be asking for recommendations on a better **algorithm**, instead of better data-optimization, so you don't need to load the entire file into memory, and hence run out of memory. Similar to how XML stream-parsing can parse unlimited XML, while parsing XML into DOM will run out of memory. – Andreas Jan 29 '20 at 02:24
  • `availMem` refers to system-wide memory. It does not represent the amount of free Java heap space available to your app. Also, the JVM uses a compacting heap; fragmentation may occur, but it will be eliminated when necessary. Consider how epic your fragmentation would have to be, if you could not allocate 184 bytes. Your app is simply running out of Java heap space. See [here](https://stackoverflow.com/questions/2630158/detect-application-heap-size-in-android). – greeble31 Jan 29 '20 at 14:35
  • @greeble31 good point about availMem, will use JVM API instead. I think I did not make myself clear. – Alex B Jan 29 '20 at 16:39
  • At the moment I parse a 35 MB file, creating 155K objects, and after some modifications my program does not crash. The problem is still there: I can have a 70 MB file, 100 MB, etc. However good the algorithm and data are, at some moment I get OutOfMemoryError. I guess the right question would be: assuming some MIN_MEMORY_SIZE_LIMIT, I want to find out when this chunk of memory becomes unavailable. Then I can abort processing the file and still let the program continue. – Alex B Jan 29 '20 at 16:46
  • The typical approach is to figure out how much memory you have to work with, then only take on the amount of work you can do without exceeding that. Some estimation is required, b/c the amount of memory any particular Java Object takes up is not, in general, precisely knowable. You can measure the amount of remaining heap at any time using the linked question, but keep in mind this is a racy measurement, and it excludes objects that will be collected during the next GC. It's best to be conservative. – greeble31 Jan 29 '20 at 17:09
  • Figuring out how much memory I can use is the same as figuring out how much is left. Unfortunately, it does not help, b/c this estimate does not take fragmentation into account. Note that the program crashes not b/c there is no memory (and, BTW, my onTrimMemory() is not called). Again, is there a method to estimate the largest available contiguous chunk of memory, or to query whether a chunk of a specific size is still available? – Alex B Jan 29 '20 at 22:24
  • How can I know if there is still enough? Look at the logs: – Alex B Jan 30 '20 at 03:40
  • Offset 1%, free 3616512, 1.3%; Offset 2%, free 3142576, 1.2%; Offset 3%, free 3052576, 1.1%; Offset 4%, free 3898928, 1.5%; Offset 5%, free 0, 0.0%; Offset 6%, free 0, 0.0%; Offset 7%, free 402608, 0.1%; Offset 8%, free 0, 0.0%; Offset 9%, free 0, 0.0%; Offset 10%, free 0, 0.0%; Offset 11%, free 1580608, 0.6% ... Offset 35%, free 0, 0.0% ... GC and then OutOfMemoryError "Failed to allocate a 184 byte allocation with 4194304 free bytes and 5MB until OOM; failed due to fragmentation (required contiguous free 16384 bytes for a new buffer where largest contiguous free 8192 bytes)" – Alex B Jan 30 '20 at 03:48
  • Note that the JVM reports about 0 free memory at 5% of the file; then the GC periodically frees memory, and it goes on until 35% of the file is processed. I am using Runtime for probing. – Alex B Jan 30 '20 at 03:51
  • Ah, I see, I missed the part where it said error was due to fragmentation. Apparently not all systems will compact memory in the foreground, see [here](https://stackoverflow.com/a/50624536/6759241). "is there a method to... query if a specific size chunk is still available"? What you're looking for is `new`. If it throws an OOM, then that chunk size is not available. Anyway, this is a dangerous condition to be running a program in. What if Android needs to allocate 184 bytes to update a view, or handle a keypress? Best to increase your safety margin. – greeble31 Jan 30 '20 at 14:26
  • If you don't want to be lied to about the amount of available memory, e.g., `Offset 5%, free 0, 0.0%`, then it becomes time to talk about your algorithm. – greeble31 Jan 30 '20 at 14:27
  • Also, a dangerous technique I used once, maybe you are desperate enough to use it: Allocate a big `byte` array, and don't do anything with it. If you get an OOM, free the array, and do a `System.gc()`. Voilà; you now have a little chunk of contiguous emergency memory. – greeble31 Jan 30 '20 at 14:30
  • Interesting! It looks exactly what I need. Why is it dangerous? – Alex B Jan 30 '20 at 15:31
  • 1.) You have to ensure the array code isn't optimized out by the JVM, 2.) there's no guarantee that the memory will actually be freed by the time the `System.gc()` returns, and 3.) if it is freed, there's no guarantee that something else won't get it first. – greeble31 Jan 30 '20 at 18:12
  • It sounded like a great idea, but somehow it did not work. I instantiate a 5 MB buffer before starting, end processing on OOM, free the buffer, wait for 500 msec and continue. Unfortunately, it seems that once the system gets an OOM, it marks the program as bad and closes the JVM regardless of whether I free some memory. And the messages sound suspicious. Despite my freeing 5 MB, processing ends at practically the same file offset, and it claims "Failed to allocate a 390 byte allocation with 2367640 free bytes and 2MB until OOM; ... required contiguous 32768 bytes ... where largest contiguous free 0(!!) bytes" – Alex B Jan 31 '20 at 01:56
  • I have no way of knowing if you did everything correctly, particularly in regard to point #1, but you might be able to debug that yourself by the use of a `WeakReference`; it should start returning `null` at some point after the reference to the array is cleared. But that gives me another idea, that I've never used personally: You can use a `SoftReference` to hold on to the array, and when it starts returning `null`, you know the garbage collector is starting to panic. (You would do this instead of waiting for an OOM). Check out the Android `SoftReference` docs; very interesting. – greeble31 Jan 31 '20 at 15:33
  • I definitely will. Meanwhile, I found out that the sources of fragmentation are Pattern.matcher() and String.replaceAll(). Once I removed them, instantiating the 5 MB buffer started working. – Alex B Jan 31 '20 at 20:55
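The emergency-reserve idea discussed in the comments above could be sketched as follows, combined with the `SoftReference` canary suggestion. All sizes and names are illustrative, and the caveats listed in the thread still apply (the array might be optimized away if never touched, `System.gc()` is only a hint, and another thread may grab the freed space first):

```java
import java.lang.ref.SoftReference;

// Sketch of the "emergency reserve" technique from the comments.
// A reserve byte array is held via a SoftReference; the collector
// clears soft references before throwing OutOfMemoryError, so a
// cleared reference is an early-warning signal to stop taking on
// more work, *before* an OOM is actually thrown.
public final class MemoryCanary {
    private static final int RESERVE_BYTES = 5 * 1024 * 1024; // illustrative

    private static SoftReference<byte[]> reserve =
            new SoftReference<>(new byte[RESERVE_BYTES]);

    /** True once the GC has reclaimed the reserve, i.e. memory is tight. */
    static boolean memoryIsTight() {
        return reserve.get() == null;
    }

    /** Give the reserve back explicitly before attempting recovery. */
    static void releaseReserve() {
        reserve.clear();
        System.gc(); // a hint only; reclamation is not guaranteed
    }

    public static void main(String[] args) {
        System.out.println("tight=" + memoryIsTight());
    }
}
```

A parsing loop would then check `memoryIsTight()` periodically and abort the file cleanly while the GUI can still allocate. Note this still does not measure fragmentation directly; it only detects general heap pressure.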
