
I'm working on an app which needs to keep many bitmap images in memory. Not surprisingly, an OutOfMemoryError is often thrown on some devices when the bitmap collection grows too large, and I need to avoid this somehow.

Briefly, I need a kind of collection that behaves like "add items one by one until adding the next item would cause an OutOfMemoryError", but I'm not experienced enough to find a proper approach. I believe some sort of weak-reference collection should be used.

I like WeakHashMap, but there is one crucial problem with it: I cannot control how items are discarded. In my app, bitmaps are added in order of priority: the most important bitmaps, which should be kept as long as possible, go first. WeakHashMap, as I understand it, doesn't provide such prioritization.

Any working approaches or ideas?

P. S. This question is not about bitmap optimization. Imagine there are some big objects instead of bitmaps that cannot be compressed or optimized. The question is about storing items in memory while directly controlling their priority, so that objects with low priority can be GC'd quickly or not added at all.


P. P. S. So far, I have come up with two possible solutions:

1) dividing the collection into two parts, where the first, higher-priority part holds the items themselves (i.e. strong references) and the second holds soft references. Whether adding is allowed can be controlled using `Runtime.getRuntime().maxMemory()` and `.totalMemory()`: when the memory occupied by the heap exceeds some percentage of maxMemory, adding new items to the collection should be prohibited (a rough sketch of this is shown after the list);

2) using a collection of soft references and tracking items' finalize(): when it is invoked (i.e. the corresponding object is about to be collected by the GC), put the item back into the collection as a soft reference and downgrade another item, the one with the lowest priority, to a phantom reference. In theory this would give stricter priority control, but I'm not sure how it would behave in practice.
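
To make approach 1) more concrete, here is a minimal sketch of what I have in mind; the class name, the 75% threshold and the exact memory check are my own illustrative choices, not a standard API:

```java
import java.lang.ref.SoftReference;
import java.util.ArrayList;
import java.util.List;

// Sketch of approach 1): high-priority items are held strongly, lower-priority
// items only via SoftReference; new strong items are refused once the heap
// passes a chosen fraction of maxMemory().
public class TwoTierStore<T> {

    private static final double THRESHOLD = 0.75; // arbitrary fraction of the max heap

    private final List<T> strong = new ArrayList<>();              // high priority
    private final List<SoftReference<T>> soft = new ArrayList<>(); // low priority

    /** Returns true if the item was stored strongly, false if only softly. */
    public boolean add(T item) {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory(); // currently occupied heap
        if (used < THRESHOLD * rt.maxMemory()) {
            strong.add(item);
            return true;
        }
        soft.add(new SoftReference<>(item));
        return false;
    }

    /** May return null if a softly held item has already been collected. */
    public T get(int index) {
        if (index < strong.size()) {
            return strong.get(index);
        }
        return soft.get(index - strong.size()).get();
    }
}
```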

Alex Salauyou
  • http://developer.android.com/training/displaying-bitmaps/index.html – nKn Apr 24 '14 at 16:22
  • @nKn did you read the question? – Alex Salauyou Apr 24 '14 at 16:23
  • http://stackoverflow.com/questions/18385362/high-resolution-image-outofmemoryerror/18385448#18385448 – Adam Stelmaszczyk Apr 24 '14 at 16:23
  • Yes. Have *you* read that link? – nKn Apr 24 '14 at 16:24
  • @nKn Of course I did. My question is not about bitmap optimization... imagine there are not bitmaps but some abstract objects that should be stored in priority order, allowing less prioritized objects to be GC'd quickly. – Alex Salauyou Apr 24 '14 at 16:26
  • You have no way to accurately predict an `OutOfMemoryError` ahead of time. Hence, your "until adding next item would cause OutOfMemoryError" desire is not possible. "My question is not about bitmap optimization" -- it should be, if the images are all the same size, as you should be using `inBitmap` in that case to reuse `Bitmap` objects. – CommonsWare Apr 24 '14 at 16:28
  • As @CommonsWare stated. "Until you would be out of memory" is not doable. The way to avoid the error is not to store that much in memory or add more memory to your app. – Tony Hopkinson Apr 24 '14 at 16:34
  • @CSmith thanks, but `LruCache` is not a solution because there I cannot control priority directly... – Alex Salauyou Apr 24 '14 at 16:44
  • "but LruCache is not a solution because there I cannot control priority directly" -- then use it as a basis for your own cache that takes priority into account. – CommonsWare Apr 24 '14 at 16:45
  • @CommonsWare Yes, I think after all I will make some sort of LruCache of my own... I assumed there were standard solutions for this purpose. – Alex Salauyou Apr 24 '14 at 16:48

1 Answer


The LruCache class appears to be a good candidate for this.

http://developer.android.com/reference/android/util/LruCache.html

A cache that holds strong references to a limited number of values. Each time a value is accessed, it is moved to the head of a queue. When a value is added to a full cache, the value at the end of that queue is evicted and may become eligible for garbage collection.

This appears to provide the lifetime control you desire. Further, the cache size is under your control, so you can create a cache that uses a percentage of available memory, i.e. one that adjusts automatically to the amount of memory available to your app.
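
As an illustration, a cache sized to a fraction of the per-app heap limit could be set up along these lines (the 1/8 fraction and the String key type are just illustrative choices):

```java
import android.graphics.Bitmap;
import android.util.LruCache;

// Sketch: size the cache in kilobytes as a fraction of the per-app heap limit,
// and charge each entry by its byte size rather than by entry count.
public class BitmapCacheHolder {

    private final LruCache<String, Bitmap> cache;

    public BitmapCacheHolder() {
        int maxMemoryKb = (int) (Runtime.getRuntime().maxMemory() / 1024);
        int cacheSizeKb = maxMemoryKb / 8; // illustrative fraction

        cache = new LruCache<String, Bitmap>(cacheSizeKb) {
            @Override
            protected int sizeOf(String key, Bitmap bitmap) {
                return bitmap.getByteCount() / 1024; // entry size in KB
            }
        };
    }

    public void put(String key, Bitmap bitmap) {
        cache.put(key, bitmap);
    }

    public Bitmap get(String key) {
        // Accessing an entry moves it to the most-recently-used position.
        return cache.get(key);
    }
}
```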

A related article at http://developer.android.com/training/displaying-bitmaps/cache-bitmap.html gives examples of its use, including a backing disk cache for objects that are not stored in RAM.

CSmith
  • It would be a great solution if priority could be controlled directly, not only through accessing items... – Alex Salauyou Apr 24 '14 at 16:45
  • Dig into the LruCache source to get better insight into how it does this; the underpinnings seem right for your needs, and you could perhaps roll your own LruCache-style implementation. – CSmith Apr 24 '14 at 16:47
  • I think I will have to... I assumed there was some standard solution when I asked my question. – Alex Salauyou Apr 24 '14 at 16:50
  • @CSmith unfortunately, Android's LruCache is just an ordinary fixed-size LRU cache implementation. No GC control is implemented there... It is helpless in situations where we don't even know the approximate size of the objects. – Alex Salauyou Apr 24 '14 at 20:16