
This is a follow-up to my previous question HERE. I was witnessing memory leaks in my Java application. Initially I thought the leak was coming from the server component of my application, but as others suggested, it wasn't.

I used a tool to dump the heap memory and visualized it with JProfiler. Apparently the leak is due to my suspected HashMaps, but I'm not sure, as I'm not familiar with how to interpret the dump.

[screenshot: JProfiler heap dump showing several large HashMap instances]

Here is a brief snippet of my application's structure (it caches some text data every 15 minutes for fast retrieval by a server thread).

What is causing the leak, and how can I identify it from the dump below? Is the way I do new Object() and HashMap.put() leaking somehow?!

First, the starter class/main. Here I initialize 7 main HashMaps, each mapping a key (right now only one; eventually there will be 16 keys) to a time-series NavigableMap of around 4,000 one-liner JSON Strings.

public class MyCache {
    static HashMap<String, NavigableMap<Long, String>> map1 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map2 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map3 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map4 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map5 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map6 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map7 = new HashMap<String, NavigableMap<Long, String>>();

    public static void main(String[] args) throws Exception {
        new Server();
        new Aggregation();
    }
}

Then, in Aggregation(), I get some text from an HTTP resource, convert it to JSON strings, cache those in some temporary NavigableMaps, and then put them into the main HashMaps (so refreshing won't affect the server much).

public class Aggregation {
    static NavigableMap<Long, String> map1Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map2Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map3Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map4Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map5Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map6Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map7Temp = new ConcurrentSkipListMap<Long, String>();

    public Aggregation() throws InterruptedException {

        // loop to cache the last 15 mins
        while (true) {
            logger.info("START REFRESHING ...");
            for (int i = 0; i < mylist.size(); i++) {
                long startepoch = getTime(mylist.get(i).time);
                MyItem m = mylist.get(i);
                String index = (i + 1) + "";

                process1(index, m.name, startepoch); // adds to map1Temp
                process2(index, m.name, startepoch); // adds to map2Temp
                process3(index, m.name, startepoch); // adds to map3Temp
                process4(index, m.name, startepoch); // adds to map4Temp
                process5(index, m.name, startepoch); // adds to map5Temp
                process6(index, m.name, startepoch); // adds to map6Temp
                process7(index, m.name, startepoch); // adds to map7Temp
            }

            // then `put` them into the main `HashMap`s all at once:
            MyCache.map1.put(channel, new ConcurrentSkipListMap<Long, String>(map1Temp));
            MyCache.map2.put(channel, new ConcurrentSkipListMap<Long, String>(map2Temp));
            MyCache.map3.put(channel, new ConcurrentSkipListMap<Long, String>(map3Temp));
            MyCache.map4.put(channel, new ConcurrentSkipListMap<Long, String>(map4Temp));
            MyCache.map5.put(channel, new ConcurrentSkipListMap<Long, String>(map5Temp));
            MyCache.map6.put(channel, new ConcurrentSkipListMap<Long, String>(map6Temp));
            MyCache.map7.put(channel, new ConcurrentSkipListMap<Long, String>(map7Temp));

            // printing the size of all HashMap entries; they don't grow :-/
            logger.info("\t" + "map1.size(): " + MyCache.map1.get(key).size());
            logger.info("\t" + "map2.size(): " + MyCache.map2.get(key).size());
            // and the other 5...

            // then clear the temp maps so they don't grow over and over
            map1Temp.clear();
            map2Temp.clear();
            map3Temp.clear();
            map4Temp.clear();
            map5Temp.clear();
            map6Temp.clear();
            map7Temp.clear();

            // sleep for 15 min until the next caching cycle
            Thread.sleep(cacheEvery * 1000 * 60);
        }
    }
}
Tina J

1 Answer


The memory analyser is telling you that you have 3 hulking great HashMap data structures occupying about 8GB of RAM, including the closure of key and value objects that they refer to. It looks like they might be maps of maps.

That is probably evidence of your memory leak. Your application is adding more and more entries to the map data structures and (presumably) not removing them. That is a form of memory leak.

(Note this is in a part of the code that you didn't show us in your previous question ...)
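To make that concrete, here is a minimal sketch of the failure mode (hypothetical names, not the question's actual code): if the outer key differs between refresh cycles, each put() adds another inner map instead of replacing the previous one, and the outer map grows without bound.

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentSkipListMap;

public class LeakSketch {
    static final Map<String, ConcurrentSkipListMap<Long, String>> cache = new HashMap<>();

    public static void main(String[] args) {
        for (int cycle = 0; cycle < 1000; cycle++) {
            // hypothetical bug: the key changes every cycle, so put() never
            // replaces an existing entry; with a stable key, the old inner
            // map would be replaced and become eligible for GC
            String key = "channel-" + cycle;

            ConcurrentSkipListMap<Long, String> copy = new ConcurrentSkipListMap<>();
            copy.put(System.nanoTime(), "{\"json\":\"...\"}");
            cache.put(key, copy);
        }
        // all 1000 inner maps are still strongly reachable from the cache
        System.out.println("outer map size: " + cache.size());
    }
}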

Stephen C
  • Added all snippets of my classes. I would think my `HashMaps` shouldn't grow in size. Can you please take a look? – Tina J Mar 30 '19 at 14:11
  • Clearly, they do grow big. If the maps are being cleared every 15 minutes as you think, then that means the ~8GB is 15 minutes of data. But either way, you should assume that the profiler is actually telling you the truth about the memory utilization of those maps. – Stephen C Mar 30 '19 at 14:53
  • How did you find the 8GB? I don't see a clue there! – Tina J Apr 01 '19 at 18:06
  • I miscounted. It is more than that: 3,135 + 2,627 + 2,570 + 2,378 + 2,125 + ... over 10GB. Now, I'm not an expert with that UI, and it could be saying that there is some sharing between the maps. But it is still saying "a lot of memory". – Stephen C Apr 01 '19 at 22:43
  • Ok. I have no clue what those other `1670` HashMap items are. I create them during every iteration, but I was assuming the GC would clear those, which apparently it didn't. – Tina J Apr 02 '19 at 14:33
  • The GC doesn't remove entries from a `HashMap` or a `ConcurrentSkipListMap`. – Stephen C Apr 02 '19 at 14:54
  • I mean the map itself. I create a temp one every time, so it is garbage. – Tina J Apr 02 '19 at 20:25
  • According to my reading of your code, that may not be correct. You seem to be creating *copies* of the `tempMap` instances but clearing the originals. Then you put the copies into the `MyCache` maps. If the `channel` reference changes, or if the value it points to is mutable, you are liable to build up multiple `ConcurrentSkipListMap` instances in the `MyCache` maps. Indeed, that's what the profiler seems to be saying is happening. Believe the evidence! – Stephen C Apr 02 '19 at 23:40
  • Oh I see. That `new ConcurrentSkipListMap` is causing some problems. Aren't those stale instances garbage? Do you know an alternative solution? – Tina J Apr 03 '19 at 12:55
  • It shouldn't be necessary to ditch `ConcurrentSkipListMap`. The problem is most likely something about the way you are using it/them. Unfortunately, the problem isn't obvious in the *subset* of your code that you have shown us ... and (to be honest) I don't really want to be debugging your code for you. – Stephen C Apr 03 '19 at 13:16
  • Yeah, in each of those `process()` functions I keep the parsed JSON info in a `new HashMap()` before putting it in the temp `ConcurrentSkipListMap`. So maybe those are the other 1870 instances! – Tina J Apr 03 '19 at 17:44
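For reference, here is a minimal sketch of the kind of fix the discussion above points toward. It assumes a stable channel key and a single refresher thread; the class and field names are hypothetical. Instead of copying the temp map and clearing the original, build a fresh map each cycle and put it under the same key, so the previous cycle's map becomes unreachable and can be collected.

import java.util.NavigableMap;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentSkipListMap;

public class RefreshSketch {
    // ConcurrentHashMap so the server thread can read while the refresher writes
    static final ConcurrentHashMap<String, NavigableMap<Long, String>> map1 = new ConcurrentHashMap<>();

    static void refresh(String channel) {
        // build a brand-new map each cycle instead of copy-then-clear
        NavigableMap<Long, String> fresh = new ConcurrentSkipListMap<>();
        fresh.put(System.currentTimeMillis(), "{\"json\":\"...\"}"); // fill from the HTTP source
        // putting under the SAME key replaces the old map, which the GC can then reclaim
        map1.put(channel, fresh);
    }

    public static void main(String[] args) {
        for (int cycle = 0; cycle < 3; cycle++) {
            refresh("channel-1"); // stable key every cycle
        }
        System.out.println("outer map size: " + map1.size()); // stays at 1
    }
}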