
I've got an Axis2 web service that just fell over on a client site; it's throwing the following exception: java.lang.OutOfMemoryError: unable to create new native thread

I'm just pulling the logs off the site, but in the interim I was wondering if anyone knew of any monitoring tools I could use to find a memory leak in a web application running in Tomcat.

Omar Kooheji

3 Answers


Make a heap dump, and analyse it with Eclipse Memory Analyzer.

Daniel
  • Forgive my naivety, but how would I dump the JVM's heap in Apache Tomcat? – Omar Kooheji Jun 14 '11 at 13:07
  • +1 for Eclipse MAT - it is very good. You could also try YourKit (but that's not free). – Paul Cager Jun 14 '11 at 13:07
  • @Omar - use the jmap command: jmap -dump:live,format=b,file=xxx.hprof pid. – Paul Cager Jun 14 '11 at 13:08
  • @Omar On the side of caution: this will pause the JVM while it takes the snapshot, so try to perform this operation at non-critical times if you must take the heap dump in a production environment – Sean Jun 14 '11 at 13:25
  • @Omar or start Tomcat with -XX:+HeapDumpOnOutOfMemoryError (and optionally -XX:HeapDumpPath=...) to get a dump just as Tomcat is going down – sfussenegger Jun 14 '11 at 13:25
  • Can I add those lines to the Java parameters when I look at the service using tomcatw.exe //MS//ServiceName? Is this something I should do as standard when I deploy Tomcat instances, or is there an inherent performance hit? – Omar Kooheji Jun 14 '11 at 15:38
  • We have -XX:+HeapDumpOnOutOfMemoryError in the live system, together with the path, and even -XX:OnOutOfMemoryError=&lt;restart command&gt; to be sure we recover (which doesn't work when PermGen errors occur) – Daniel Jun 14 '11 at 19:27
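
On the service question in the comments above: a minimal sketch of adding these flags to a Tomcat Windows service, assuming it was installed with Apache's procrun wrapper (the service name, executable name and dump path below are placeholders; the same values can also be typed into the Java Options box of the tomcatw.exe configuration dialog):

    rem //US// updates the service; ++JvmOptions appends to the existing Java options.
    rem Restart the service afterwards for the new options to take effect.
    tomcat6.exe //US//Tomcat6 ++JvmOptions=-XX:+HeapDumpOnOutOfMemoryError
    tomcat6.exe //US//Tomcat6 ++JvmOptions=-XX:HeapDumpPath=C:\tomcat\logs

The heap-dump flag itself has no performance hit during normal operation; it only does work when an OutOfMemoryError actually occurs.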

Try VisualVM.
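
If the JVM has to be inspected on the client's machine rather than locally, VisualVM can also connect over JMX. A minimal sketch of the extra start-up flags, assuming port 9010 is free (the port is a placeholder, and these settings disable authentication and SSL, so only use them on a trusted network):

    -Dcom.sun.management.jmxremote.port=9010
    -Dcom.sun.management.jmxremote.authenticate=false
    -Dcom.sun.management.jmxremote.ssl=false

VisualVM itself is the jvisualvm binary in the JDK's bin directory; from there you can add a JMX connection to host:9010.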

johnstok

There are a few steps you can take to identify the memory leak.

Start by changing the start-up parameters of the web service. Add the flag -XX:+HeapDumpOnOutOfMemoryError, which will capture a heap dump for you whenever the JVM encounters an OOM error. You can use that dump to get a good picture of which objects in memory were taking up all of the available memory. While waiting for the OOM to be replicated, you can add a second set of start-up parameters to log the GC activity: -XX:+PrintGCDetails -verbose:gc -Xloggc:/log/path/gc.log. With this data you can see whether the OOM builds up gradually or happens quickly.
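
A minimal sketch of wiring those parameters into Tomcat on a Unix install, assuming a CATALINA_BASE/bin/setenv.sh file (create it if it doesn't exist; the dump and log paths are placeholders):

    # CATALINA_BASE/bin/setenv.sh -- sourced by catalina.sh when Tomcat starts
    CATALINA_OPTS="$CATALINA_OPTS -XX:+HeapDumpOnOutOfMemoryError"
    CATALINA_OPTS="$CATALINA_OPTS -XX:HeapDumpPath=/var/log/tomcat/dumps"
    CATALINA_OPTS="$CATALINA_OPTS -XX:+PrintGCDetails -verbose:gc -Xloggc:/var/log/tomcat/gc.log"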

Another path is to use a program like VisualVM to profile the web service. It attaches to your running JVM (preferably in a development environment), and you can then stress-test the service to find where the problem lies. Try JMeter to help with the stress test. VisualVM is found in your JAVA_HOME/bin folder (JDK 6 and above).
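
Once a test plan has been built in the JMeter GUI, the stress test itself can be run from the command line; a small sketch, where webservice-plan.jmx is a hypothetical test plan:

    # -n = non-GUI mode, -t = test plan to run, -l = file to log the results to
    jmeter -n -t webservice-plan.jmx -l results.jtl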

This could also be a case where it is not a memory leak, but simply more load on the client's side than expected. Look at tweaking the start-up parameters to provide more memory (-Xms and -Xmx).
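
For example (the sizes below are only placeholders; choose them based on the machine's RAM and the usage seen in the GC log):

    # start with a 512 MB heap and let it grow to a 1 GB maximum
    CATALINA_OPTS="$CATALINA_OPTS -Xms512m -Xmx1024m"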

Unless your client can tell you the parameters they passed before the problems occurred, you will have to do a bit of investigation yourself until you find more information.

Daniel already covered jmap in his answer, so I won't go into that in detail.

Sean