
I have noticed that my application running on Tomcat 5 starts with about 1 GB of free memory, and as soon as it starts receiving requests from clients the free memory keeps dropping until it is down to roughly 100 MB, and the trouble starts from there. I am looking at Tomcat's /manager/status page, under the JVM section where "Free Memory", "Total Memory" and "Max Memory" are listed.

Is this an indicator of a memory leak? The memory does not seem to be freed up automatically even when there are no requests coming from the client machines.
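For reference, my understanding is that the figures on the manager page correspond to the java.lang.Runtime memory values. A minimal sketch of reading the same numbers directly (assuming that correspondence holds):

    // Minimal sketch: the Runtime values that (I assume) back the
    // manager page's "Free/Total/Max Memory" figures.
    public class MemorySnapshot {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long free  = rt.freeMemory();   // unused space within the committed heap
            long total = rt.totalMemory();  // heap currently committed by the JVM
            long max   = rt.maxMemory();    // upper bound, e.g. set by -Xmx
            System.out.println("Free:  " + (free  / (1024 * 1024)) + " MB");
            System.out.println("Total: " + (total / (1024 * 1024)) + " MB");
            System.out.println("Max:   " + (max   / (1024 * 1024)) + " MB");
        }
    }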

fglez
user305210
    What do you mean, "the memory starts dropping"? – skaffman Jan 07 '11 at 19:58
  • What memory-related options do you start Tomcat with? – hvgotcodes Jan 07 '11 at 20:20
  • What exactly does 'troubles start' mean? What are you observing? – fglez Jan 10 '11 at 14:11
  • @antispam: by "troubles" I mean that Tomcat takes an extremely long time to serve data. I think this is related to memory: a thread waits until other threads free up some resources, including memory – user305210 Jan 10 '11 at 15:13
  • It may be due to GC activity blocking the rest of the JVM. Read my answer and post some GC data if you need help with the analysis. – fglez Jan 10 '11 at 15:40

4 Answers


First you should analyze your garbage collection activity and understand normal GC behaviour (the sawtooth pattern), including how to read the GC log statements.

If you get undesired long GC pauses, you should try GC tuning.

If you are getting OutOfMemoryError, then you should go on to detect the memory leak.
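As a rough sketch, on a Sun/HotSpot JVM of that era you can turn on GC logging with the standard flags below, typically appended to JAVA_OPTS or CATALINA_OPTS in Tomcat's startup script (the variable and log path are just examples):

    # Assumption: Sun/HotSpot JVM; log path is an example
    CATALINA_OPTS="$CATALINA_OPTS -verbose:gc -XX:+PrintGCDetails \
        -XX:+PrintGCTimeStamps -Xloggc:/var/log/tomcat/gc.log"

The resulting log shows each collection, its pause time and how much heap was reclaimed, which is enough to see the sawtooth and to spot long pauses or full GCs that reclaim almost nothing.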

fglez

Enable JMX on your Tomcat and monitor it with a profiler or monitoring tool such as JConsole or VisualVM. If the trend shows heap usage increasing steadily over time, there may be a memory leak. Inspect the loaded classes and instances and you may be able to figure out what is causing the issue.
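As a sketch, remote JMX is usually exposed through the standard com.sun.management.jmxremote system properties, again set in CATALINA_OPTS or JAVA_OPTS (the port number is an example; authentication and SSL are disabled here only for a quick local test):

    # Assumption: local/test setup only -- no auth, no SSL; port is an example
    CATALINA_OPTS="$CATALINA_OPTS \
        -Dcom.sun.management.jmxremote \
        -Dcom.sun.management.jmxremote.port=9010 \
        -Dcom.sun.management.jmxremote.authenticate=false \
        -Dcom.sun.management.jmxremote.ssl=false"

JConsole or VisualVM can then connect to host:9010 and chart the heap over time.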

Also, your question is not clear enough. What exactly do you mean by "memory drop"? Do you mean that free memory decreases?

ring bearer

Are you doing something in a static block and holding on to objects you are not using? What happens if you just start the server, keep it running for some time, and only then launch the browser? If there are unused objects and the GC has run, the available memory should have increased. This may not fix your problem, but it can help you narrow it down.
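For illustration, a hypothetical example of what I mean: anything reachable from a static field stays reachable for as long as the class is loaded, so it is never collected even after the request that created it has finished.

    // Hypothetical example: a static reference that keeps objects alive forever
    import java.util.ArrayList;
    import java.util.List;

    public class RequestCache {
        // Static field lives until the webapp's class loader is unloaded,
        // so everything added here can never be garbage collected.
        private static final List<byte[]> CACHE = new ArrayList<byte[]>();

        public static void handleRequest() {
            byte[] buffer = new byte[1024 * 1024]; // ~1 MB per request
            CACHE.add(buffer); // the local variable goes out of scope, the object does not
        }
    }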

srikanth
  • I don't have any static blocks in the application, and my understanding is that once objects go out of scope they should be garbage collected and the memory should eventually be freed up. But that is not what I am noticing. Even more than an hour after the last activity, the memory is never freed up – user305210 Jan 07 '11 at 21:40

Rather than examining your code looking for the standard leak patterns, your best bet is to turn on GC logging and then run a tool like HPjmeter against those logs.

You can find instructions on how to use HPjmeter here: http://www.javaperformancetuning.com/tools/hpjmeter/index.shtml#howto

One common pattern to look for when analyzing the GC logs is an object that lives across multiple generations. In a normal (non-leaky) Java program you will find that objects are either short lived or long lived, meaning they are created and destroyed quickly or they exist for the duration of the app. If an object is short lived, it will only exist for a small, stable number of generations. If an object is long lived, its age will increase as the program runs, but it will still belong to a limited number of generations. If you find an object that has multiple instances with different ages and increasing generation counts, then that object is likely being leaked. For a slightly better explanation, take a look at this presentation: http://www.hjug.org/present/Sporar-MemoryLeaks.pdf

Once you've identified your leaky object, the next step is to analyze the heap to see who holds references to the leaked object. From there the problem should be relatively easy to identify.
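As a sketch of how to capture that heap snapshot (assuming a Sun JDK whose jmap supports binary dumps; the PID and paths are examples), you can either dump a running Tomcat on demand or have the JVM write a dump automatically on OutOfMemoryError, and then open the file in a heap analyzer:

    # Assumption: Sun JDK with jmap binary-dump support; PID and paths are examples
    jmap -dump:format=b,file=/tmp/tomcat-heap.hprof 12345

    # Or let the JVM dump the heap the moment it runs out of memory:
    CATALINA_OPTS="$CATALINA_OPTS -XX:+HeapDumpOnOutOfMemoryError \
        -XX:HeapDumpPath=/tmp"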

Karthik Ramachandran