I understand that my question is somewhat vague, but here is the problem. I parse a very large file, building a linked list of my objects (they are not related to bitmaps, by the way). At some point I get the infamous 'OutOfMemoryError'. Right before that I see my own log line:
available 478.0M, 47.5%, low=false
in logcat, which I produce with:
ActivityManager.MemoryInfo mi = new ActivityManager.MemoryInfo();
ActivityManager activityManager = (ActivityManager) getSystemService(ACTIVITY_SERVICE);
activityManager.getMemoryInfo(mi);
// Divide by a double, otherwise the integer division truncates the result
double availableMegs = mi.availMem / (1024.0 * 1024.0);
double percentAvail = mi.availMem / (double) mi.totalMem * 100.0;
// TAG is whatever log tag the app uses
Log.d(TAG, String.format("available %.1fM, %.1f%%, low=%b",
        availableMegs, percentAvail, mi.lowMemory));
Note the large amount of free memory.
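For comparison, the per-app heap, which is what the ART GC messages actually refer to (the "252MB/256MB" below), can be read via Runtime. This is a sketch I use for cross-checking; the class and method names are my own:

```java
// Sketch: per-app heap figures, which are what the ART GC log reports.
// Device-wide ActivityManager numbers can look healthy while the app heap is nearly full.
public final class HeapStats {
    // Hard cap on this app's heap (the "256MB" in the GC log).
    public static long maxBytes() {
        return Runtime.getRuntime().maxMemory();
    }

    // Bytes currently allocated on the app heap.
    public static long usedBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    // Remaining headroom before an OutOfMemoryError becomes likely.
    public static long headroomBytes() {
        return maxBytes() - usedBytes();
    }
}
```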
After that I see a bunch of GC messages, the last of which are:
I/art: Forcing collection of SoftReferences for 184B allocation
I/art: Clamp target GC heap from 256MB to 256MB
I/art: Alloc concurrent mark sweep GC freed 3(96B) AllocSpace objects, 0(0B) LOS objects, 1% free, 252MB/256MB, paused 539us total 41.854ms
I/System.out: java.lang.OutOfMemoryError: Failed to allocate a 184 byte allocation with 3403124 free bytes and 3MB until OOM; failed due to fragmentation (required contiguous free 16384 bytes for a new buffer where largest contiguous free 4096 bytes)
Obviously I can catch this error, although I have never managed to hit it with a breakpoint. The discrepancy between the GC numbers and the ActivityManager numbers is also striking. The real problem is that after the error I cannot continue, because the GUI can no longer allocate memory for its own data. In other words, catching the error is already too late. So I need recommendations on how to determine when to stop parsing, so that the program can still continue running. Please note that I am not looking for advice on optimizing my data structures: however compact they become, a large enough file will always crash the program.
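To make the question concrete, this is the kind of guard I have in mind, checked before parsing each chunk. It is only a sketch: SAFETY_MARGIN is an arbitrary placeholder, and as the log above shows, fragmentation can still fail a small allocation even while such a check passes:

```java
// Sketch of a stop condition checked before allocating more list nodes.
// SAFETY_MARGIN is a guessed placeholder, not a tuned value.
public final class ParseGuard {
    private static final long SAFETY_MARGIN = 10L * 1024 * 1024; // 10 MB, arbitrary

    // True while it still looks safe to keep allocating on the app heap.
    public static boolean canContinue() {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();
        long headroom = rt.maxMemory() - used;
        return headroom > SAFETY_MARGIN;
    }
}
```

The open question is what a reliable condition looks like, given that total headroom does not capture fragmentation.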