
We have an application where we cache a huge volume of data. The cache is maintained as static maps.

Since the data is modified daily by batch cycles, we refresh the cache after each run. Refreshing is done by creating new map objects and pointing the static variables at them. So each day new objects are created and the old ones are dereferenced.

But the problem is that the server's heap usage keeps increasing, until one day it crashed with an OutOfMemoryError.

I really doubt whether the dereferenced objects are actually being garbage collected.

This is my class.

class CacheService {
    public static Map<String, Article> articleCache = null;

    public void docache() {
        Map<String, Article> tempArticleCache = new HashMap<String, Article>();

        // caching stuff
        // finally:
        articleCache = tempArticleCache; // I hope dereferencing takes place here.
    }
}

The method docache() is called daily to update the cache. Could anyone help me achieve caching without this problem?
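For reference, the swap-on-refresh pattern I describe can be sketched as below. This is only a minimal sketch: the Article stub is a placeholder, and the volatile modifier is an addition (my real field is not volatile) to guarantee that reader threads see the fully populated replacement map after the swap.

```java
import java.util.HashMap;
import java.util.Map;

class Article {} // placeholder for the real Article class

class CacheService {
    // volatile (an assumption, not in the original) ensures readers
    // observe the fully built replacement map after the swap.
    public static volatile Map<String, Article> articleCache =
            new HashMap<String, Article>();

    public static void docache() {
        Map<String, Article> tempArticleCache = new HashMap<String, Article>();
        // ... populate tempArticleCache from the daily batch output ...
        // Swap the reference; the old map becomes unreachable,
        // provided nothing else still points at it.
        articleCache = tempArticleCache;
    }
}
```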

Peter O.
karthik2146
  • I think this previous question might be of interest: http://stackoverflow.com/questions/1481178/forcing-garbage-collection-in-java –  Nov 22 '12 at 23:23
  • When you install the new tables you need to make sure that ALL references to the old tables are either zeroed or overwritten with the new reference value. – Hot Licks Nov 22 '12 at 23:24
  • (There are Java tools to help you solve this sort of thing. You're somehow retaining references to the old tables. I'd suggest you look at places where the contents of your cache are referenced.) – Hot Licks Nov 22 '12 at 23:25

2 Answers


I suspect the old maps are still referenced somewhere. I would advise you to try the following (don't create a new map every time; simply clear the existing one and repopulate it):

      public void docache() {
           if (articleCache != null) {
                 // clear the elements of the existing map
                 articleCache.clear();
           } else {
                 articleCache = new HashMap<String, Article>();
           }
           // do the map population
      }

If this doesn't work either, take a memory snapshot before the crash and check which objects are consuming your heap. That will give a better idea of the issue.
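To capture that snapshot, the standard JDK tools can produce a heap dump; the pid and file paths below are placeholders for your own:

```shell
# Write a heap dump automatically the moment the JVM throws OutOfMemoryError:
java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/heap.hprof -jar app.jar

# Or capture one on demand from the running server (get the pid from `jps -l`):
jmap -dump:live,format=b,file=heap.hprof <pid>
```

Open the resulting .hprof file in a heap analyzer such as Eclipse MAT or VisualVM and look at the dominator tree / retained sizes to see exactly what is keeping the old maps alive.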

Yogendra Singh
  • Should not make a difference. – Hot Licks Nov 22 '12 at 23:27
  • @HotLicks: It makes difference when my starting statement is true(i.e. reference elsewhere hence map is not getting released properly). – Yogendra Singh Nov 22 '12 at 23:28
  • Should not make a difference. The internal objects that implement Map cannot be "leaked", so it makes no difference whether you clear them or create anew. It's the CONTENTS that are leaking somehow, and how you reset the Map will have no effect on that. – Hot Licks Nov 22 '12 at 23:32
  • @HotLicks You cannot possibly know whether it is the contents or the map itself that is leaking, without access to the application – user207421 Nov 22 '12 at 23:49
  • I suppose you're right, that somewhere else there could be code that repeatedly makes copies of the Map pointer. Your change would conceal that bug, but not actually fix it. If repeated copies are being made (and retained indefinitely) then there is necessarily some data structure that is growing without bound somewhere. – Hot Licks Nov 23 '12 at 00:23

Try using java.util.WeakHashMap. An entry in a WeakHashMap is removed automatically once its key is no longer strongly referenced elsewhere.
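A minimal sketch of that behavior is below. Note two caveats: System.gc() is only a request, so the removal timing is not guaranteed, and it is the keys (not the values) that are weakly referenced, so this only helps if the cache keys stop being referenced elsewhere.

```java
import java.util.Map;
import java.util.WeakHashMap;

public class WeakCacheDemo {
    public static void main(String[] args) throws InterruptedException {
        Map<String, String> cache = new WeakHashMap<String, String>();

        // Use `new String(...)` so the key is not an interned literal;
        // interned strings are never garbage collected.
        String key = new String("article-1");
        cache.put(key, "cached article body");
        System.out.println("entries before GC: " + cache.size()); // 1

        key = null;      // drop the only strong reference to the key
        System.gc();     // request a collection (not guaranteed to run)
        Thread.sleep(200);

        // The entry may now have been expunged; exact timing depends on the GC.
        System.out.println("entries after GC: " + cache.size());
    }
}
```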