
I have a web application that runs a job to convert videos into HLS using the aminyazdanpanah/php-ffmpeg-video-streaming package. However, after about 2 minutes the job fails and throws this error:

Symfony\Component\Process\Exception\ProcessTimedOutException: The process '/usr/bin/ffmpeg -y -i...'
exceeded the timeout of 300 seconds. in /var/www/vendor/symfony/process/Process.php:1206

The Laravel job has its timeout set to 7200s, and my Supervisor setup also specifies a timeout of 7200s:

[program:app_worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/artisan queue:work --tries=1 --timeout=7200 --memory=2000
autostart=true
autorestart=true

I have also set max_execution_time to 7200 in my php.ini, and in the job's handle() method I call set_time_limit(7200); as well.
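
Put together, the job looks roughly like this (a minimal sketch; the class name ConvertVideoToHls is illustrative, not my actual class):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ConvertVideoToHls implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Laravel's per-job timeout, mirroring the worker's --timeout=7200 flag.
    public $timeout = 7200;

    public function handle()
    {
        // Raise PHP's own execution limit from inside the job as well.
        set_time_limit(7200);

        // ... HLS conversion via aminyazdanpanah/php-ffmpeg-video-streaming ...
    }
}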

I have restarted the queue worker and cleared my cache but that doesn't seem to solve the issue.

It seems like the Symfony Process component just ignores the timeout values I set through Laravel.

gp_sflover

1 Answer


I noticed that it failed after about 2 minutes because retry_after for the redis connection in my config/queue.php was set to 90 seconds:

'redis' => [
    'driver' => 'redis',
    'connection' => 'default',
    'queue' => env('REDIS_QUEUE', 'default'),
    'retry_after' => 90,
    'block_for' => null,
    'after_commit' => false,
],

I increased that to 3600, which stopped the job from failing after 2 minutes, but it still failed after 300 seconds.
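
For clarity, here is the updated entry (a sketch; only retry_after changes, and 3600 is simply a value that comfortably exceeds my encode times):

'redis' => [
    'driver' => 'redis',
    'connection' => 'default',
    'queue' => env('REDIS_QUEUE', 'default'),
    'retry_after' => 3600, // was 90; must outlast the longest-running job
    'block_for' => null,
    'after_commit' => false,
],

Note that the Laravel docs recommend keeping the worker's --timeout at least several seconds shorter than retry_after, so with a 7200s job timeout you may want to raise retry_after above 7200 as well.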

I then traced the remaining timeout to FFMpeg::create() in aminyazdanpanah/php-ffmpeg-video-streaming. By default it sets a process timeout of 300 seconds, so I had to pass a config array to increase it:

use Streaming\FFMpeg;

$ffmpeg = FFMpeg::create([
    'timeout' => 3600,
]);

And this solved the timeout issue.
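
For context, a fuller sketch of the conversion with the increased timeout (the input/output paths are illustrative; the HLS calls follow the package's documented API):

use Streaming\FFMpeg;

// Raise the process timeout so long encodes are not killed at 300s.
$ffmpeg = FFMpeg::create([
    'timeout' => 3600,
]);

$video = $ffmpeg->open('/var/www/storage/app/videos/input.mp4');

$video->hls()
    ->x264()                         // H.264 video codec
    ->autoGenerateRepresentations()  // build the bitrate ladder from the source
    ->save('/var/www/storage/app/streams/output.m3u8');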

Fanan Dala