
I am very frequently getting this error in MySQL:

OS errno 24 - Too many open files

What's the cause and what are the solutions?


3 Answers


I was getting the errno: 24 - Too many open files error frequently when I was using many databases at the same time.

Solution

  • ensure that connections to the DB server are closed properly
  • edit /etc/systemd/system.conf. Uncomment and set:

     DefaultLimitNOFILE=infinity
     DefaultLimitMEMLOCK=infinity
    

    then run systemctl daemon-reload and service mysql restart (see the snippet below).
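For reference (assuming the daemon process is named mysqld and a single instance is running, as on Ubuntu with MySQL 5.7):

    sudo systemctl daemon-reload
    sudo service mysql restart

    # Verify the limit the running daemon actually received (OS-side view)
    grep "open files" /proc/$(pidof mysqld)/limits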

You can check the result with the query SHOW GLOBAL VARIABLES LIKE 'open_files_limit'; you should notice that the value has changed, and you should not get any errno 24 now.
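For example, from the shell (assuming a configured client login; the output value is illustrative):

    $ mysql -e "SHOW GLOBAL VARIABLES LIKE 'open_files_limit';"
    +------------------+---------+
    | Variable_name    | Value   |
    +------------------+---------+
    | open_files_limit | 1048576 |
    +------------------+---------+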

Please note that the solution may differ on other OSes/versions; try to locate the corresponding variables first. Tested with Ubuntu 16.04.3 and MySQL 5.7.19.

In my case it was useless to set the open_files_limit variable in the MySQL configuration files, as the variable is flagged as read-only.

I hope it helped!

  • Hi, I've had enough issues with MySQL and memory leaks not to allow LimitMEMLOCK=infinity. Any clues on how to work out that value? – Antony Gibbs Aug 02 '18 at 08:04
  • Open connections have nothing to do with this error. A table can be closed as soon as it is no longer in use. – LinAlg Nov 14 '18 at 00:58

You probably have a connection leak in your application; that is why open connections are not closed once the function completes its execution.

I would look into the application code and find where the Connection/PreparedStatement objects (if it's Java) are not closed, and fix it; in Java, a try-with-resources block guarantees they are closed.

A quick workaround is to increase the ulimit of the server (explained here), which increases the number of open file descriptors (i.e. connections). However, if you have a connection leak, you will encounter this error again at a later stage.
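As a quick way to confirm a leak (a sketch, assuming shell access to the database host, a configured mysql client login, and a single mysqld process), watch the server-side counters while the application runs:

    # Connections currently open on the server
    mysql -e "SHOW GLOBAL STATUS LIKE 'Threads_connected';"

    # File descriptors currently held by the MySQL daemon
    ls /proc/$(pidof mysqld)/fd | wc -l

If both numbers climb steadily and never drop even while the application is idle, connections are not being closed or returned to the pool.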


I faced the same problem and found a solution in another Stack Overflow question.

The solution is to run the following command in Bash:

ulimit -n 30000
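Note that ulimit -n only affects the current shell session and the processes started from it. To make a higher limit persist across logins (a sketch, assuming a system using pam_limits; on systemd-based distributions the service limit comes from the unit file or system.conf instead, as in the accepted answer):

    # Persist a higher open-files limit for the mysql user
    echo "mysql soft nofile 30000" | sudo tee -a /etc/security/limits.conf
    echo "mysql hard nofile 30000" | sudo tee -a /etc/security/limits.conf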