
I have a website that periodically accumulates a large number of sleeping PHP processes. My hosting service sets a limit of 20 concurrent running processes; if the site goes over that limit, it goes down with a 503 error.

It is a rare occurrence and doesn't seem to have any correlation to the number of people visiting my site.

As a safeguard, I would like to have a cron job run a PHP script that kills PHP processes that have been sleeping for more than 10 minutes.

I already have a PHP function that kills all MySQL processes that have been sleeping for more than 10 minutes:

    public function kill_sleeping_mysql_processes()
    {
        $result = $this->db->query("SHOW FULL PROCESSLIST");

        foreach ($result->result_array() as $row)
        {
            // Kill connections that have been idle for more than 10 minutes
            if ($row['Command'] == "Sleep" && $row['Time'] > 600)
            {
                $this->db->query("KILL {$row['Id']}");
            }
        }
    }

The question is, how do I do the same with PHP processes?

I can get a readout of PHP processes with this code:

exec("ps aux", $output);

and I can kill a specific PHP process with this code if I have the PID:

$pid = 11054;
exec("kill -9 $pid");

But how can I selectively kill php processes that have been sleeping more than 10 min?

Rob Fenwick
  • Just curious if you've checked your logs and tried to resolve the root issue? – ficuscr Aug 21 '15 at 19:32
  • yes I've tried to resolve the root issue to no avail ...I have also contacted my host's support several times for help. – Rob Fenwick Aug 21 '15 at 19:42
  • It goes weeks or months without a problem then all of a sudden I get a slew of stalled scripts that takes my site down. – Rob Fenwick Aug 21 '15 at 19:43
  • That stinks. Consider writing this as a bash script and invoking it with cron. see: http://stackoverflow.com/questions/5161193/bash-script-that-kills-a-child-process-after-a-given-timeout Good luck. – ficuscr Aug 21 '15 at 19:46
  • What are your configuration options for PHP? Can you limit the number of workers before you hit the 20 limit? – sbrattla Aug 21 '15 at 20:59
  • I don't know but just off hand wouldn't that just cause the site to shut down before I hit 20? I need the processes that haven't been sleeping for over 10 min to keep working. I know of a way to kill ALL php processes but I don't want to do that, some are actually doing something. – Rob Fenwick Aug 21 '15 at 21:36
  • I was just thinking that PHP (php-fpm specifically) can be configured with different worker types (static, dynamic, on demand). Depending on how workers are spawned, you have different options. I know there is a setting which controls how many requests a worker may handle before it gets killed (to prevent memory leaks accumulating). There might be other options as well? – sbrattla Aug 22 '15 at 07:23

2 Answers


I cobbled something together. It is not elegant and is a bit of a hack, but it seems to work, although I am going to test it further before putting it in a cron job.

    public function kill_dormant_php_processes()
    {
        $output_array = array();

        // Ask ps for elapsed wall-clock seconds ("etimes", available in
        // procps-style ps) instead of parsing "ps aux": the TIME column of
        // ps aux is accumulated CPU time, not how long the process has existed.
        // grep also filters out the header line and non-PHP processes.
        exec("ps -eo pid,stat,etimes,comm | grep php | grep -v grep", $ps_output);

        foreach ($ps_output as $ps)
        {
            $ps = preg_split('/ +/', trim($ps));

            $process         = new stdClass();
            $process->pid    = $ps[0];
            $process->stat   = $ps[1];
            $process->etimes = (int) $ps[2];
            $output_array[]  = $process;
        }

        foreach ($output_array as $row)
        {
            // STAT beginning with 'S' means interruptible sleep; kill anything
            // that is asleep and has existed for more than 10 minutes.
            if (strpos($row->stat, 'S') === 0 && $row->etimes > 600)
            {
                exec("kill -9 " . (int) $row->pid);
            }
        }
    }

I am sure there must be a better way to do it.

Could someone explain why 0:01 in the readout seems to translate to 6 minutes?

freedom   6933  6.0  0.1  57040 13040 ?        S    16:55   0:01 /usr/local/bin/php53.cgi -c .:/home/freedom/:/etc index.php
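A likely explanation: the TIME column in ps aux is accumulated CPU time, not wall-clock time, so 0:01 means one minute of CPU use since the process started at 16:55. Elapsed time can be requested explicitly; here is a sketch using a procps-style ps (the awk filter and the `php` pattern are illustrative assumptions, not tested on the host in question):

```shell
# List PIDs of sleeping php processes that started more than 600 s ago.
# etimes = elapsed seconds since start; STAT beginning with 'S' means
# interruptible sleep.
ps -eo pid,stat,etimes,comm --no-headers \
  | awk '$2 ~ /^S/ && $3 > 600 && $4 ~ /php/ {print $1}'
```

Piping the result through `xargs -r kill` would then do the killing in one line.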
Rob Fenwick

As an alternative to the PHP script shared here, you can use the killall command with an "older than" time filter (using the -o option) to kill all those processes.

For example, this command will kill all php-cgi processes that have been running for more than 30 minutes:

killall -o 30m /usr/bin/php-cgi
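A cron entry could run this periodically; the schedule and binary path below are assumptions to adapt to your host. Note that the path must match the running binary exactly (e.g. /usr/local/bin/php53.cgi in the question's ps output), and -q suppresses killall's error message when no matching process exists:

```
# m h dom mon dow  command
*/10 * * * * killall -q -o 10m /usr/bin/php-cgi
```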
Bramus