0

I want to fix this issue, but I don't have a clear idea of what is happening. My application runs in a CentOS environment, and after some days I start getting the following exceptions:

2011-07-12 21:58:03,598 12155907 ERROR [org.jboss.naming.Naming] (JBoss System Threads(1)-2:) Naming accept handler stopping
java.net.SocketException: Too many open files
    at java.net.PlainSocketImpl.socketAccept(Native Method)
    at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:408)
    at java.net.ServerSocket.implAccept(ServerSocket.java:462)
    at java.net.ServerSocket.accept(ServerSocket.java:430)
    at org.jnp.server.Main$AcceptHandler.run(Main.java:481)
    at org.jboss.util.threadpool.RunnableTaskWrapper.run(RunnableTaskWrapper.java:148)
    at EDU.oswego.cs.dl.util.concurrent.PooledExecutor$Worker.run(PooledExecutor.java:756)
    at java.lang.Thread.run(Thread.java:662)
... (the same exception and stack trace repeat, milliseconds apart)

The log starts growing quickly after that. I am using some JBoss queues in my application, as well as some JMX connections. I want to know how to detect the source of the problem quickly, or whether it is caused by some other component on the machine. Any suggestions are welcome; I am quite worried about this issue.

skaffman (398,947)
MadMad666 (955)

2 Answers

5

Here are some tips on debugging a Too Many Open Files situation. Two Unix commands that can be of help are ulimit and lsof.

I suggest you read man ulimit first to understand that you can alter the maximum number of open files for a process. Just typing ulimit on the command line will give you its default value. For example, for me:

$ ulimit
unlimited

I'm running on a vanilla Ubuntu 11.04 distro, so it looks like unlimited is the default there. On most of my production boxes, the default is 1024.
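Note that ulimit with no arguments reports only one of several limits; the per-process open file descriptor limit specifically is queried and raised with ulimit -n. A minimal sketch (the value 4096 below is just an example, not a recommendation):

# Show the per-process open-file-descriptor limit (commonly 1024 by default)
$ ulimit -n
# Raise it for the current shell, then start JBoss from that same shell
# (4096 is an example value; pick one appropriate for your load)
$ ulimit -n 4096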

Next, lsof -p <pid> which will list all open files for process with id <pid>.

Perhaps you will find that you are either not closing files when you thought you were, or you are simply opening too many files (relative to the limit set by ulimit).
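If you don't know the process id, pgrep can find it, and counting the lsof output (or the entries under /proc/&lt;pid&gt;/fd) gives a quick running total to compare against the limit. A rough sketch, assuming a JBoss started with the usual org.jboss.Main class (the pattern is an assumption; adjust it for your install):

# Find the JBoss java process (adjust the pattern to match your setup)
$ PID=$(pgrep -f org.jboss.Main)
# Count everything the process currently has open: files, jars, sockets, pipes, ...
$ lsof -p $PID | wc -l
# The same count taken from /proc on Linux
$ ls /proc/$PID/fd | wc -l
# Group by descriptor type to see what dominates (many IPv4/IPv6 entries point at leaked sockets)
$ lsof -p $PID | awk '{print $5}' | sort | uniq -c | sort -rn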

Next step: since you are running a Java process, you can get a thread dump of your process while the Too Many Open Files situation is occurring. To get a thread dump, either send the process a kill -3 <pid>, or, if you started the Java process in your current shell, type CTRL-Break. It's especially helpful to gather three or more thread dumps within, say, a minute or two; the threads that show up in every one of those dumps are the ones worth taking a look at.
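For example, using the process id found earlier (note that kill -3 makes the JVM write the dump to its own standard output, typically the console log, not to the shell you run kill from; the jstack tool that ships with the JDK lets you redirect the dump to a file of your choice instead):

# Take three dumps a minute apart so you can compare which threads persist
$ for i in 1 2 3; do kill -3 $PID; sleep 60; done
# Alternatively, capture a dump to a file with jstack
$ jstack $PID > /tmp/threads.$(date +%s).txt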

If none of this helps, search SO for "too many open files". I did and found this link, whose accepted answer may help you:

Java Too Many Open Files

mrk (4,999)
  • Thanks for the information. Sorry, I didn't understand what I can do with the kill -3 command. – MadMad666 Jul 13 '11 at 15:46
  • Also, I have installed one of the latest JVMs: java version "1.6.0_26", Java(TM) SE Runtime Environment (build 1.6.0_26-b03), Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02, mixed mode). – MadMad666 Jul 13 '11 at 15:47
  • Also remember that network connections count as open files on Linux. I've run into the "too many open files" problem with both Tomcat and ActiveMQ because of excessive open connections. – Ryan Stewart Jul 13 '11 at 15:48
  • Find the process ID of your Java process, then "kill -3 <pid>". It will give you a thread dump. I think it writes it to a file, but I don't recall exactly. Experiment with lsof first. That will probably show you your problem. – Ryan Stewart Jul 13 '11 at 15:50
  • +1 File descriptor limitations are definitely the culprit here. lsof will show which resources are open and not being closed properly. – Sean Jul 13 '11 at 16:36
  • Thank you very much! I will check that. – MadMad666 Jul 13 '11 at 18:14
  • We have seen exactly this behavior with JBoss running a Mobile Device Management server (DME by Excitor). All the things mentioned earlier applied to our case (too many sockets/files open). We increased our file limit but also added memory to the host system. This made our problems go away. – Joost Evertse Jul 13 '11 at 18:45
0

I think you are hitting this issue - AJP connection is left as CLOSE_WAIT
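A quick way to check whether this is your situation is to count sockets stuck in CLOSE_WAIT; 8009 below is the default AJP port, so adjust it if your connector is configured differently:

# Count AJP connections stuck in CLOSE_WAIT (8009 is the default AJP port)
$ netstat -an | grep ':8009' | grep CLOSE_WAIT | wc -l
# Or list CLOSE_WAIT sockets together with the owning process
$ lsof -i -n -P | grep CLOSE_WAIT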

Jean (7,623)
  • While the link you have provided may be the answer to the question, you should post the relevant information here. – Linger Oct 21 '12 at 04:10