
I just found Guava while searching for a cache API (it fits my needs perfectly). But one question arose while reading the wiki and Javadoc: what are the default values for the settings CacheBuilder can take? The Javadoc states "These features are all optional" and "Constructs a new CacheBuilder instance with default settings, including strong keys, strong values, and no automatic eviction of any kind."

In my opinion, a good default for maximumSize would be relative to Runtime.getRuntime().freeMemory().

In the end I want a cache that uses the memory available on a given system, so I need an eviction strategy that asks how much freeMemory() is available (probably relative to Runtime.getRuntime().maxMemory()).
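A rough sketch of what I mean, assuming Guava's CacheBuilder (the per-entry byte cost and the heap fraction are made-up numbers, not measurements):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class MemoryAwareCacheSketch {
    public static void main(String[] args) {
        // Hypothetical per-entry cost; I don't actually know this number.
        long assumedBytesPerEntry = 1024;
        // Give the cache a quarter of the maximum heap (arbitrary budget).
        long cacheBudgetBytes = Runtime.getRuntime().maxMemory() / 4;
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .maximumSize(cacheBudgetBytes / assumedBytesPerEntry)
                .build();
        System.out.println("maximumSize = " + cacheBudgetBytes / assumedBytesPerEntry);
    }
}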

asked by dermoritz (edited by Ray)
  • "I want a cache that uses the memory available..." you have to realize that this is not well-defined. Two cache entries might each reference objects that take up 1K of memory. But for one, those referenced objects are also referenced by other things outside the cache, and for the other they aren't. Now what? – Kevin Bourrillion Feb 10 '12 at 20:26
  • Let the VM decide what to do with it if softValues() is used. And if I made many references to the stuff in the cache (probably to each entry), I wouldn't cry about an out-of-heap-space error. So with softValues and careful treatment of cached objects this could work, couldn't it? – dermoritz Feb 13 '12 at 08:00

3 Answers


Actually, free memory isn't all that great a metric for cache eviction, because of garbage collection. Running out of free memory may just mean that it is time for the garbage collector to run, after which you'll suddenly have lots of free memory again. So you don't want to drop entries from the cache just because a lot of garbage has accumulated.

One option is to use softValues(), but I would strongly recommend against that, as soft references can really hurt production performance.
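For reference, this is what enabling soft values looks like; a minimal sketch, shown only to illustrate the option being advised against:

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class SoftValuesExample {
    public static void main(String[] args) {
        // Values are held via SoftReferences, so the garbage collector
        // may discard them under memory pressure.
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .softValues()
                .build();
        cache.put("key", "value");
    }
}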

The right thing to do is to carefully select a maximumSize that, in essence, bounds the total amount of memory your cache will consume. If entries take up variable amounts of space, you can use maximumWeight instead to model that.
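A minimal sketch of both approaches (the weigher below approximates an entry's cost by the value's length in characters, which is an assumption for illustration, not a real measurement):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.Weigher;

public class BoundedCacheSketch {
    public static void main(String[] args) {
        // Entries of roughly equal size: bound the entry count directly.
        Cache<String, String> byCount = CacheBuilder.newBuilder()
                .maximumSize(10000)
                .build();

        // Entries of variable size: bound the total weight instead.
        Cache<String, String> byWeight = CacheBuilder.newBuilder()
                .maximumWeight(1000000)
                .weigher(new Weigher<String, String>() {
                    @Override
                    public int weigh(String key, String value) {
                        // Approximate the entry's cost by the value's length.
                        return value.length();
                    }
                })
                .build();

        byCount.put("k", "v");
        byWeight.put("k", "v");
    }
}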

answered by fry
  • Thanks for the clarification. So the only way to make the cache environment-aware / scale with available memory is softValues(), but this is not recommended? In my case all objects will take nearly the same amount of memory, but I don't know how much. Ah, and what about my other question: what is the default for maximumSize? – dermoritz Feb 10 '12 at 14:37
  • @dermoritz - And you might also want to factor in the cost of the Cache structure itself - Dimitris Andreou recently posted this article on calculating that: http://code-o-matic.blogspot.com/2012/02/updated-memory-cost-per-javaguava.html – Paul Bellora Feb 10 '12 at 15:52
  • 1
    In other words, we emphasize the classic "tune, monitor, tune" cycle as the way to get good cache performance. Anything else would take magic we don't have. – Kevin Bourrillion Feb 10 '12 at 20:28
  • So I have to profile my app, see what my cache entries weigh, and map this to the available memory (given via maxMemory())? And with this data I set the maximumSize at runtime? Does anybody know the default values for the optional settings (i.e. maximumSize)? – dermoritz Feb 13 '12 at 07:56
  • And there is really no production use case for soft values? The Javadoc at Sun, oh, excuse me, Oracle, sounds promising?! – dermoritz Feb 13 '12 at 08:47

I found myself asking the same thing and could not find anything on the web about it, so I made this very primitive test. I wrote a piece of code that creates a LocalCache (via CacheBuilder) with the most basic setup (no maximum size, no eviction policy, nothing) and puts entries into it in an infinite loop, while monitoring heap usage through VisualVM.

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class CacheTest {
    public static void main(String[] args) {
        // Most basic setup: no maximum size, no eviction policy, nothing.
        Cache<String, String> cache = CacheBuilder.newBuilder().build();
        int counter = 0;
        // Keep inserting unique keys forever and watch the heap in VisualVM.
        while (true) {
            cache.put("key" + counter++, "value");
            System.out.println("size:" + cache.size());
        }
    }
}

As you can see from the image below, memory usage grows to the maximum available heap and then stays constant. I waited a few minutes and no OutOfMemoryError occurred; new entries were still being added to the map every few seconds, so an error would probably occur eventually.

[Screenshot: heap usage graph in VisualVM]

Conclusion: you don't have to set maximumSize, but I suggest you use some kind of eviction policy (expireAfterAccess or expireAfterWrite) to clean up the cache, both to avoid an OutOfMemoryError and to avoid degrading your cache's performance.
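For example, a minimal sketch of the suggested setup (the 10-minute window is an arbitrary value I picked, not a recommendation):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

import java.util.concurrent.TimeUnit;

public class ExpiringCacheSketch {
    public static void main(String[] args) {
        // Entries are evicted 10 minutes after they were last written.
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .expireAfterWrite(10, TimeUnit.MINUTES)
                .build();
        cache.put("key", "value");
        System.out.println("size:" + cache.size());
    }
}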

answered by de.la.ru

The code below is taken from Guava's LocalCache class (abridged from the decompiled source):

package com.google.common.cache;

@GwtCompatible(emulated = true)
class LocalCache<K, V> extends AbstractMap<K, V> implements ConcurrentMap<K, V> {

    // From the constructor (abridged): the requested initial capacity is
    // capped at 2^30 (1073741824), and, when the cache evicts by size with
    // the default weigher, also at maxWeight.
    int initialCapacity = Math.min(builder.getInitialCapacity(), 1073741824);
    if (this.evictsBySize() && !this.customWeigher()) {
        initialCapacity = Math.min(initialCapacity, (int) this.maxWeight);
    }

    this.initTable(this.newEntryArray(initialCapacity));
}

Hope this answers the initial-capacity question.
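For illustration, the initial capacity can also be set explicitly on the builder; a minimal sketch (256 is an arbitrary example value):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class InitialCapacitySketch {
    public static void main(String[] args) {
        // initialCapacity sizes the internal hash table up front;
        // 256 here is just an example, not a recommendation.
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .initialCapacity(256)
                .build();
        cache.put("key", "value");
    }
}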

answered by Arnav Karforma