
I'm running my application on Amazon. I deployed a version that separates the log files by tenant and date, and at some point my environment stopped accepting new requests, throwing an IOException.

29-Mar-2016 11:56:07.939 SEVERE [http-nio-8080-Acceptor-0] org.apache.tomcat.util.net.NioEndpoint$Acceptor.run Socket accept failed
 java.io.IOException: Too many open files
    at sun.nio.ch.ServerSocketChannelImpl.accept0(Native Method)
    at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:241)
    at org.apache.tomcat.util.net.NioEndpoint$Acceptor.run(NioEndpoint.java:688)
    at java.lang.Thread.run(Thread.java:745)

How do I fix this?

Luan Kevin Ferreira
  • Post your `<Connector>` configuration from Tomcat's `conf/server.xml`. Also, you might want to phrase your question in the form of a question, rather than "anyone seen this?" Perhaps "how do I fix this?" – Christopher Schultz Mar 30 '16 at 14:36
  • You are asking specifically about AWS, but this is really a general Linux question. If you search for "Linux too many files open" you will find plenty of answers. For example: http://stackoverflow.com/questions/20901518/ubuntu-too-many-open-files – Mark B Mar 30 '16 at 14:45
  • Thanks @ChristopherSchultz – Luan Kevin Ferreira Mar 30 '16 at 16:32

2 Answers


I bumped into this problem today. It turns out there is a soft limit on the number of open file descriptors (a file descriptor is essentially a number that uniquely identifies an open file), and these limits are set per process.

$ ulimit -S -n
256
$ ulimit -S -n 1024
1024
Syntax
      ulimit [-abcdefHilmnpqrsStTuvx] [limit]

Key
   -S   Set a soft limit for the given resource.
   -n   The maximum number of open file descriptors. 

See the docs: https://ss64.com/bash/ulimit.html
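
Note that `ulimit -S -n 1024` only raises the soft limit for the current shell session and the processes started from it. Before (or instead of) raising the limit, it can help to confirm that the Tomcat process is actually running out of descriptors. A rough sketch of the checks on Linux (the `pgrep` pattern and the target of 4096 are assumptions; adjust them for your setup):

# find the Tomcat process ID (assumes "tomcat" appears in its command line)
$ pgrep -f tomcat

# show the open-files limits that apply to that running process (replace <pid>)
$ grep "open files" /proc/<pid>/limits

# count the file descriptors the process currently holds (run as the same user or root)
$ ls /proc/<pid>/fd | wc -l

# raise the soft limit for this shell, then restart Tomcat from the same shell
$ ulimit -S -n 4096

If the per-process count keeps climbing towards the limit, the application is probably leaking file handles (for example, per-tenant log writers that are never closed), and raising the limit only buys time.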

Julian Tellez

I solved my problem by following this tutorial:

https://easyengine.io/tutorials/linux/increase-open-files-limit/
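
For reference, the usual way to make a higher limit persistent on Linux is through /etc/security/limits.conf. A minimal sketch, assuming Tomcat runs as a user named `tomcat` and that 65535 is an acceptable limit for your load (both values are assumptions; adjust them for your environment):

# /etc/security/limits.conf
tomcat  soft  nofile  65535
tomcat  hard  nofile  65535

The new limits only apply to sessions started after the change, so Tomcat has to be restarted from a fresh login for them to take effect. If Tomcat runs as a systemd service, the limit is usually set with `LimitNOFILE=` in the service unit instead, since services started by systemd do not normally read limits.conf.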

Luan Kevin Ferreira