20

I'm going to use a SoftReference-based cache (a pretty simple thing by itself). However, I've come across a problem when writing a test for it.

The objective of the test is to check that the cache requests the previously cached object from the server again after a memory cleanup occurs.

Here I run into the problem of how to make the system release softly referenced objects. Calling System.gc() is not enough, because soft references will not be released until memory is low. I'm running this unit test on a PC, so the VM's memory budget could be pretty large.
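For context, the cache under test looks roughly like this minimal sketch (the name `SoftCache` and the loader function are illustrative stand-ins, not my actual implementation):

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal sketch of a SoftReference-based cache. The loader function
// stands in for the real server call.
class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new HashMap<K, SoftReference<V>>();
    private final Function<K, V> loader;

    SoftCache(Function<K, V> loader) {
        this.loader = loader;
    }

    V get(K key) {
        SoftReference<V> ref = map.get(key);
        V value = (ref == null) ? null : ref.get();
        if (value == null) {                         // never cached, or cleared by the GC
            value = loader.apply(key);               // re-request from the "server"
            map.put(key, new SoftReference<V>(value));
        }
        return value;
    }
}
```

The contract the test must verify is exactly this: once the GC has cleared the soft references, `get` must call the loader again.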

================== Added later ==============================

Thank you all who took care to answer!

After considering all the pros and cons, I've decided to go the brute-force way as advised by nanda and jarnbjo. It appeared, however, that the JVM is not that dumb: it won't even attempt garbage collection if you ask for a block which alone is bigger than the VM's memory budget. So I've modified the code like this:

    /* Force releasing SoftReferences */
    try {
        final List<long[]> memhog = new LinkedList<long[]>();
        while(true) {
            memhog.add(new long[102400]);
        }
    }
    catch(final OutOfMemoryError e) {
        /* At this point all SoftReferences have been released - GUARANTEED. */
    }

    /* continue the test here */
JBM
  • IMHO you are using a very brittle (and non deterministic?) test setup to test the Java VM soft reference functionality, instead of testing your own application logic. – Ruben Sep 25 '10 at 09:39
  • Yes, I agree. It was one of the cons for this approach. However, with this approach you do exactly what the test must do - test the unit's contract. If I took the other way it would be testing the unit's internals - which is undesirable as it violates encapsulation, and introduces an unneeded dependency too. But you're right - in my test it depends on the VM's behavior in case of a memory crisis. The Java standard says that the VM *will* release soft references before throwing OutOfMemoryError - well, let's hope that it does indeed. – JBM Sep 27 '10 at 08:22
  • 3
    +1, this helped me debug an issue where I thought Swing was leaking memory but it was really just holding soft references to some application objects. – casablanca Dec 27 '11 at 09:44
  • possible duplicate of [How to cause soft references to be cleared in Java?](http://stackoverflow.com/questions/457229/how-to-cause-soft-references-to-be-cleared-in-java) – Raedwald Oct 05 '13 at 10:04
  • 1
    When using this answer with parallel unit tests, be aware that it could cause OOM in the other tests. When running in parallel, combine it with some kind of lock to make this test run by itself, e.g. see JUnit 5.7's @Isolated annotation. – Chris K Sep 06 '20 at 07:35

5 Answers

17

This piece of code forces the JVM to flush all SoftReferences, and it's very fast.

It works better than the Integer.MAX_VALUE approach, since here the JVM really tries to allocate that much memory.

try {
    Object[] ignored = new Object[(int) Runtime.getRuntime().maxMemory()];
} catch (OutOfMemoryError e) {
    // Ignore
}

I now use this bit of code everywhere I need to unit test code using SoftReferences.

Update: This approach will indeed work only with less than 2G of max memory.

Also, one needs to be very careful with SoftReferences. It's easy to keep a hard reference by mistake, which will negate the effect of the SoftReference.
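As an illustration of that pitfall, here's a sketch (using the memory-hogging loop from the question) showing that as long as a strong reference to the referent is alive, no amount of memory pressure will clear the SoftReference:

```java
import java.lang.ref.SoftReference;
import java.util.LinkedList;
import java.util.List;

public class HardReferencePitfall {
    public static void main(String[] args) {
        Object payload = new Object();                       // strong reference, kept alive below
        SoftReference<Object> ref = new SoftReference<Object>(payload);

        try {
            final List<long[]> memhog = new LinkedList<long[]>();
            while (true) {
                memhog.add(new long[102400]);                // force an OutOfMemoryError
            }
        } catch (OutOfMemoryError e) {
            // all *softly reachable* objects have been cleared by now...
        }

        // ...but 'payload' is still strongly reachable, so 'ref' was NOT cleared
        System.out.println(ref.get() == payload);            // prints "true"
    }
}
```

Using `payload` after the loop keeps it strongly reachable for the whole run, which is exactly the kind of accidental hard reference that defeats a SoftReference-based cache.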

Here is a simple test that shows it working every time on OSX. I'd be interested to know whether the JVM's behavior is the same on Linux and Windows.


for (int i = 0; i < 1000; i++) {
    SoftReference<Object> softReference = new SoftReference<Object>(new Object());
    if (null == softReference.get()) {
        throw new IllegalStateException("Reference should NOT be null");
    }

    try {
        Object[] ignored = new Object[(int) Runtime.getRuntime().maxMemory()];
    } catch (OutOfMemoryError e) {
        // Ignore
    }

    if (null != softReference.get()) {
        throw new IllegalStateException("Reference should be null");
    }

    System.out.println("It worked!");
}
TWiStErRob
David Gageot
  • That's of course not fool proof either if Runtime.getRuntime().maxMemory() returns a value outside the positive range of an int. E.g. if maxMemory() returns a value between 2GB and 4GB, the cast will result in an attempt to create an array with a negative size. – jarnbjo Sep 29 '10 at 07:27
  • This doesn't work - at least because maxMemory() returns an amount of **bytes** but you're trying to allocate **Objects**. I've tried it in my test; though the OOME is thrown, the garbage collection does not occur prior to that. But this approach will be more efficient than loop-allocating small arrays if you allocate bytes instead of Objects and calculate the proper size given the maxMemory value. – JBM Sep 29 '10 at 07:53
  • 1
    Hmm, after trying this a few times I get inconsistent results - sometimes it works and sometimes it doesn't... Maybe http://stackoverflow.com/questions/457229/how-to-cause-soft-references-to-be-cleared-in-java/458224#458224 could be the answer why. If it's true then it's better to use the loop-based allocation. – JBM Sep 29 '10 at 08:28
5

An improvement that will work for more than 2 GB of max memory: it loops until an OutOfMemoryError occurs.

@Test
public void shouldNotHoldReferencesToObject() {
    final SoftReference<T> reference = new SoftReference<T>( ... );

    // Sanity check
    assertThat(reference.get(), not(equalTo(null)));

    // Force an OoM
    try {
        final ArrayList<Object[]> allocations = new ArrayList<Object[]>();
        int size;
        while( (size = Math.min(Math.abs((int)Runtime.getRuntime().freeMemory()),Integer.MAX_VALUE))>0 )
            allocations.add( new Object[size] );
    } catch( OutOfMemoryError e ) {
        // great!
    }

    // Verify object has been garbage collected
    assertThat(reference.get(), equalTo(null));
}
emmby
1

  1. Set the parameter -Xmx to a very small value.
  2. Prepare your soft reference
  3. Create as many objects as possible. Ask for the object each time, until the cache requests it from the server again.

This is my small test. Modify it to your needs.

@Test
public void testSoftReference() throws Exception {
    Set<Object[]> s = new HashSet<Object[]>();

    SoftReference<Object> sr = new SoftReference<Object>(new Object());

    int i = 0;

    while (true) {
        try {
            s.add(new Object[1000]);
        } catch (OutOfMemoryError e) {
            // ignore
        }
        if (sr.get() == null) {
            System.out.println("Soft reference is cleared. Success!");
            break;
        }
        i++;
        System.out.println("Soft reference is not yet cleared. Iteration " + i);
    }
}
nanda
  • Is it possible to set -Xmx from within the test? I don't have any configuration server for running tests... – JBM Sep 24 '10 at 09:57
  • In theory -Xmx is not really needed; I suggested it so that your test doesn't run for a long time. – nanda Sep 24 '10 at 09:58
0

You could explicitly set the soft reference to null in your test, and thus simulate that the soft reference has been released.

This avoids any complicated test setup that is memory and garbage collection dependent.
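One way to implement that idea, sketched under the assumption that you can add a test-only seam to the cache (the class and interface names here are hypothetical): production code wraps values in real SoftReferences, while the test keeps a handle to the created reference and calls `Reference.clear()` to simulate the GC deterministically.

```java
import java.lang.ref.Reference;

// Hypothetical cache entry with an injectable reference factory as a test seam.
class CacheEntry<V> {
    interface Loader<V> { V load(); }                  // stands in for the server call
    interface RefFactory<V> { Reference<V> wrap(V v); }

    private final Loader<V> loader;
    private final RefFactory<V> factory;
    private Reference<V> ref;

    CacheEntry(Loader<V> loader, RefFactory<V> factory) {
        this.loader = loader;
        this.factory = factory;
    }

    V get() {
        V v = (ref == null) ? null : ref.get();
        if (v == null) {                               // cleared, or never loaded
            v = loader.load();                         // re-request from the server
            ref = factory.wrap(v);
        }
        return v;
    }
}
```

Production code passes a factory that creates real SoftReferences; the test passes one that records the reference, clears it after the first `get()`, and then verifies that the next `get()` hits the loader again - no GC involved.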

Ruben
-1

Instead of a long running loop (as suggested by nanda), it's probably faster and easier to simply create a huge primitive array to allocate more memory than available to the VM, then catch and ignore the OutOfMemoryError:

    try {
        long[] foo = new long[Integer.MAX_VALUE];
    }
    catch(OutOfMemoryError e) {
        // ignore
    }

This will clear all weak and soft references, unless your VM has more than 16GB heap available.
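For heaps where a single array can't cover the budget (the ~2 GB array-size limit, or more than 16 GB of heap), a chunked variant, sketched here as an assumption-laden alternative, sizes the hog from `Runtime.maxMemory()` instead:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedHog {
    // Allocate slightly more than maxMemory() bytes in 8 MB chunks of long[]
    // (8 bytes per element), guaranteeing an OutOfMemoryError on any heap size.
    static void forceOutOfMemory() {
        try {
            long bytes = Runtime.getRuntime().maxMemory();
            List<long[]> hog = new ArrayList<long[]>();
            while (bytes > 0) {
                int chunk = (int) Math.min(bytes / 8 + 1, 1 << 20); // at most 1M longs = 8 MB
                hog.add(new long[chunk]);
                bytes -= (long) chunk * 8;
            }
        } catch (OutOfMemoryError e) {
            // softly reachable objects are guaranteed to be cleared before this is thrown
        }
    }
}
```

Since the total requested exceeds the heap, the OutOfMemoryError is certain, and per the SoftReference contract all softly reachable objects are cleared before it is thrown.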

jarnbjo
  • that doesn't work: Integer.MAX_VALUE is outside the possible array size on the VM. – nanda Sep 24 '10 at 10:05
  • @nanda, no, it may very well be `Integer.MAX_VALUE`. The only restriction of the size is that it should be an integer, and that it should be non-negative. – aioobe Nov 18 '10 at 09:57
  • There are two problems with this approach: 1. the max. array size is JVM dependent and is usually a few bytes less than `Integer.MAX_VALUE`, see http://stackoverflow.com/q/3038392/2513200 2. Available memory can be larger than `Integer.MAX_VALUE` – Hulk Feb 16 '17 at 14:16
  • Just tried this - it seems that on my JVM the max array size is `Integer.MAX_VALUE - 2` - the test works with this array size. If I allocate a larger array, e.g. `Integer.MAX_VALUE - 1`, an `OutOfMemoryError` gets thrown but the soft references are not cleared. – Hulk Feb 16 '17 at 14:30