
I am developing a rather complex web application that invokes several externally executed processes and a Grails background process, and reads from and writes to several files - all in one controller. All was fine until I tested it with many requests in close succession. When I do, I get the following Java error message in my Tomcat catalina log file:

WARNING: Exception executing accept
java.net.SocketException: Too many open files
    at java.net.PlainSocketImpl.socketAccept(Native Method)
    at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:408)
    at java.net.ServerSocket.implAccept(ServerSocket.java:462)
    at java.net.ServerSocket.accept(ServerSocket.java:430)
    at org.apache.jk.common.ChannelSocket.accept(ChannelSocket.java:312)
    at org.apache.jk.common.ChannelSocket.acceptConnections(ChannelSocket.java:666)
    at org.apache.jk.common.ChannelSocket$SocketAcceptor.runIt(ChannelSocket.java:877)
    at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:690)
    at java.lang.Thread.run(Thread.java:662)

At first, after some happy googling and reading, I suspected a 'system problem', i.e. that I would have to raise the open-file limit for the user that runs Tomcat - but that is not the case.

Then I started to read a lot of advice on Java rather than on Grails & Groovy, because googling for this problem with Grails turns up very little. I now suspect that my problem is caused by "too many open streams" (or something like that) rather than too many actually open files, since the number of open files is really not THAT big.

I have a lot of operations of the following four types in one closure:

1) Opening files and writing to them:

def someFile = new File("/some/file.txt")
someFile << "Some content\n"
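For what it is worth, Groovy's File.leftShift (the << operator) opens and closes the underlying writer on each call, so this pattern by itself should not leak descriptors. For several writes in a row, withWriterAppend opens the file once and still guarantees the writer is closed - a small sketch using a temporary file for illustration:

```groovy
// withWriterAppend opens the file once, runs the closure, and closes
// the writer even if the closure throws - no descriptor is left open.
def someFile = File.createTempFile("demo", ".txt")
someFile.withWriterAppend { w ->
    w << "Some content\n"
    w << "More content\n"
}
assert someFile.readLines() == ["Some content", "More content"]
```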

2) Executing commands:

def cmd = "bash some-cmd".execute()
cmd.waitFor()
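One plausible culprit here (an assumption on my part, but a common one with String.execute()): each spawned Process holds descriptors for its stdin, stdout and stderr pipes, and waitFor() alone does not release them. Draining and closing the streams frees the descriptors promptly instead of leaving them to the garbage collector - a sketch using echo as a stand-in for the real command:

```groovy
def cmd = "echo hello".execute()
def out = new StringBuffer()
def err = new StringBuffer()
// waitForProcessOutput blocks until the process exits AND both output
// pipes are fully drained (plain waitFor can even hang if the child
// fills its pipe buffer before exiting).
cmd.waitForProcessOutput(out, err)
// belt and braces: release the child's stdin descriptor too
cmd.outputStream.close()
assert out.toString().trim() == "hello"
```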

3) Reading content from files:

def fileContent = new File("/some/file.txt").text
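Note that File.getText() opens and closes its own reader internally, so this form should not leak on its own. If you only need the head of a (possibly large) file, withReader reads just what you ask for and guarantees the handle is closed - a sketch with a generated temporary file:

```groovy
def f = File.createTempFile("big", ".txt")
f.text = (1..1000).collect { "line $it" }.join("\n")

// withReader closes the reader when the closure returns, even if it
// throws; only the first three lines are actually read here.
def head = f.withReader { r ->
    (1..3).collect { r.readLine() }
}
assert head == ["line 1", "line 2", "line 3"]
```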

4) Reading content from files on the web:

def url = new URL("http://www.some.link")
def uc = url.openConnection()
def br = new BufferedReader(new InputStreamReader(uc.getInputStream()))
...
br.close()

As you can see, the only thing that I explicitly close is the BufferedReader wrapping the InputStream; I believe br.close() closes them both.

Do I have to close any of the other opened connections, or better: can I do that? What would be the command to do this? Or do you think my problem is really not caused by a "forgotten, open stream"?

My question mainly originates in the answers to Why do I get "Too many open files" errors? and IOException: Too many open files.

I am using Grails 1.1.1 (I KNOW it is outdated, but I had serious problems migrating my application to the current version and gave up after many hours of work), Groovy 1.8.0, Tomcat 6.0.28 and Apache 2.2.16 on Ubuntu 10.10.

The answer that solved my "Too many open files" problem is closely related to Stephen C's answer. The first cause of my error was indeed the unclosed BufferedReader stream. I basically transferred his Java code suggestion directly to Grails:

def url = new URL("http://www.some.link")
def uc = url.openConnection()
def br = new BufferedReader(new InputStreamReader(uc.getInputStream()))
try {
    ...
} finally {
    br.close()
}

This definitely solved the "Too many open files" problem, and I am now even able to see what the real source of the problem was.
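For completeness, the same guarantee can be written more idiomatically in Groovy: withReader on the connection's stream closes both the reader and the underlying stream when the closure exits, even on an exception. A file:// URL is used below only so the sketch is self-contained; an http:// URL behaves the same way:

```groovy
def tmp = File.createTempFile("page", ".html")
tmp.text = "<html>head</html>\nrest of a large file..."
def url = tmp.toURI().toURL()

// withReader returns the closure's value and closes reader + stream
// on exit, replacing the explicit try/finally.
def firstLine = url.openConnection().inputStream.withReader { br ->
    br.readLine()
}
assert firstLine == "<html>head</html>"
```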

funnypixy
  • I had a similar problem with UrlConnection some days ago. I solved it by reading the whole content of the stream. – Fox32 Jan 05 '12 at 11:40
  • Hhm... could you elaborate on that, i.e. give me a short code example? I am not sure whether that will really solve my problem, though. My aim with the BufferedReader/URLConnection is to check only the head of a very large file (up to 1 GB) for correct file format before transferring it to my server. – funnypixy Jan 05 '12 at 11:46
  • I don't have access to the code at the moment, but I can post it later... – Fox32 Jan 05 '12 at 11:51

3 Answers


As you can see, the only thing that I explicitly close is the BufferedReader with the InputStream, I believe br.close() closes them both.

It does ... but only if it is executed.

I'm not a groovy / grails programmer, but in Java there is a common mistake that people make that can result in file descriptor leaks. For example,

InputStream is = new FileInputStream(someFile);

// do some work
...

is.close();

The problem is that the statements indicated by the ellipsis (...) may throw an exception. If they do that, the is.close() call doesn't happen, and a file descriptor is leaked. The solution (in Java) is to write the code like this:

InputStream is = new FileInputStream(someFile);
try {
   // do some work
   ...
} finally {
   is.close();
}

or (in Java 7) as:

try (InputStream is = new FileInputStream(someFile)) {
   // do some work
   ...
}
Stephen C
  • Yes, I have read similar solutions in the answers to java-questions. This is why I am asking how this is supposed to be done in Groovy. Maybe some groovy-programmer could answer to that? – funnypixy Jan 05 '12 at 11:57
  • I basically copied your code into Groovy (the try and finally statement), and it worked. Thanks so much! – funnypixy Jan 05 '12 at 12:51

The answers from both @Sunil Kumar Sahoo and @Stephen C apply.

Typically you take the following steps, in this order:

  • Ensure that your try/catch/finally blocks are in place to close IO resources
  • Check your ulimit settings when dealing with UNIX/Linux boxes; the right value depends on the application's needs.
rimero

It seems that your application has created too many files or sockets; you need to increase the open-file limit.

If you are using Linux, use ulimit to set the maximum number of open files. On Linux the default limit is 1024:

ulimit -n 2048   # set the open-file limit to 2048
Claes Mogren
Sunil Kumar Sahoo
  • I have globally increased the file limit on my system in /etc/security/limits.conf (entries for soft and hard nofile), so my tomcat user now has permission to open a ridiculously high number of files; I verified with ulimit -n that the change took effect. But this does not solve my problem. Moreover, when I check the number of open files at the moment the error is thrown, e.g. with lsof, it is rather low. So I think this is not the solution to my problem? – funnypixy Jan 05 '12 at 11:52
  • @funnypixy - you are right. This is a bandaid fix, and not a real solution. – Stephen C Jan 05 '12 at 13:06