
I want to run ffmpeg processes from Laravel jobs, but not too many at once. I just can't seem to get it right: no matter what I set for $process_limit, it only runs one at a time, with long delays in between. Perhaps I'm using public $timeout wrong. Perhaps it's retryUntil(). I don't know.

<?php

namespace FuquIo\LaravelFfmpeg;

use Cocur\BackgroundProcess\BackgroundProcess;
use Illuminate\Bus\Queueable;
use Illuminate\Queue\SerializesModels;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Redis;

class RenderMpeg4ToWebmJob implements ShouldQueue{
use Dispatchable, InteractsWithQueue, Queueable;

public $timeout = 3600;

/**
 * @var string
 */
private $input_file;
/**
 * @var array
 */
private $map;

/**
 * Create a new job instance.
 *
 * @param array  $map
 * @param string $input_file
 */
public function __construct(array $map, string $input_file){
    $this->map        = $map;
    $this->input_file = $input_file;
}

/**
 * Execute the job.
 *
 * @return void
 * @throws \Exception
 */
public function handle(){

    $almost_timeout = $this->timeout - 100;
    $map            = $this->map;
    $input_file     = $this->input_file;

    $cmds = '(' . implode('; ', config('fuqu-ffmpeg.command')) . ')';
    $cmds = str_replace(array_keys($map), array_values($map), $cmds);

    Log::debug($cmds);

    $process_limit = config(ServiceProvider::SHORT_NAME .'.process_limit');
    Redis::funnel('ffmpeg')->limit($process_limit)->then(
        function () use ($cmds, $input_file, $almost_timeout){

            $process = new BackgroundProcess($cmds);
            $process->run();

            if(!$process->isRunning()){
                throw new \Exception('Unable to execute file processing command on ' . $input_file);
            }

            /**
             * This doesn't prevent an additional
             * background process from spawning
             * but it does give a head start
             */
            $slept = 0;
            do{
                sleep(10);
                $slept += 10;
            }while($process->isRunning() and ($slept < $almost_timeout));

        },
        function (){
            // Could not obtain a funnel slot; release the job
            // back onto the queue with a 100-second delay.
            return $this->release(100);
        }
    );


}

/**
 * Rather than capping the number of attempts,
 * keep retrying until this time.
 *
 * @return \DateTime
 */
public function retryUntil(){
    return now()->addDays(1);
}

}
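
For reference, handle() builds its shell command from the package config: the 'command' array is joined with '; ', wrapped in parentheses, and the $map keys are substituted into it. A minimal sketch of that config and a dispatch call; the command template, placeholder names, and file paths here are assumptions, not the package's actual defaults:

<?php

// config/fuqu-ffmpeg.php (sketch; ServiceProvider::SHORT_NAME presumably
// resolves to 'fuqu-ffmpeg', matching the literal key used in handle())
return [
    // Each entry is one shell command; placeholders are replaced via $map.
    'command'       => [
        'ffmpeg -i INPUT -c:v libvpx-vp9 -b:v 1M -c:a libopus OUTPUT',
    ],
    // Maximum number of jobs allowed through Redis::funnel('ffmpeg') at once.
    'process_limit' => 4,
];

Dispatching then pairs the placeholder map with the input file:

RenderMpeg4ToWebmJob::dispatch(
    ['INPUT' => '/videos/in.mp4', 'OUTPUT' => '/videos/out.webm'],
    '/videos/in.mp4'
);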

  • ARG!! Probably release 100 is my problem with the long delays. Still not getting X simultaneously through the funnel – Tarek Adam Jun 12 '19 at 20:23
  • If you're trying to do this within the structure of the Laravel framework, I don't think you want to attempt to manage the number of processes running from within the job. You configure your queue and then run the desired number of "queue workers" to handle jobs for that queue. – wheelmaker Jun 12 '19 at 20:41
  • @wheelmaker agreed, that would normally be the case. However, ffmpeg can run for ages, so the PHP max execution time becomes an issue. Also, I'm not really trying to "manage" the proc, just get the most effective funnel I can given the PHP max execution time issue. – Tarek Adam Jun 12 '19 at 20:44
  • As @wheelmaker told you, it will be easier if you manage them with Supervisor; check the docs: https://laravel.com/docs/5.8/queues#supervisor-configuration – Thamer Jun 12 '19 at 20:44
  • @Thamerbelfkih If you have any thoughts on my reply to wheelmaker, I'd really appreciate it. – Tarek Adam Jun 12 '19 at 20:46
  • This code doesn't funnel ffmpeg... just the spawning of ffmpeg, if that makes sense. ffmpeg will run too long to be managed by PHP. – Tarek Adam Jun 12 '19 at 20:53
  • Roughly how long are you talking? Here's an example configuration for someone whose jobs run about 15 minutes (with a 20-minute cut-off): https://medium.com/@williamvicary/long-running-jobs-with-laravel-horizon-7655e34752f7 – wheelmaker Jun 12 '19 at 20:58
  • @wheelmaker Hey bud. Could be 2 or 3 minutes, or 2 or 3 hours, or longer. All depends on the mp4 being rendered to webm. And the background process command actually comes from the config, so someone could do two-pass encoding. So I just have to avoid the PHP max execution limit. Anyway, as it turns out... this thing actually works. lol. It was just having problems with file names that end in numbers. So... my bad. – Tarek Adam Jun 12 '19 at 21:04
  • Glad to hear it's working. Supervisor can handle jobs that last for hours, if you ever want to switch over to handling jobs in queues through the Laravel framework (see the sample Supervisor config after these comments). – wheelmaker Jun 12 '19 at 21:08
  • @wheelmaker Interesting. So I would create and destroy a Supervisor daemon from Laravel? I typically just use Supervisor on queue:work or Horizon. I've never created and destroyed supervisors on the fly. Also, I've only used daemons to keep things alive forever rather than temporarily. Never thought about it, I guess. Thanks! – Tarek Adam Jun 12 '19 at 23:23
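
For anyone following the Supervisor route from these comments: the usual approach is not to spin Supervisor daemons up and down on the fly, but to run a fixed pool of long-lived workers and let that pool cap the concurrency. A minimal sketch along the lines of the Laravel docs linked above; the program name, paths, and worker count are assumptions:

[program:ffmpeg-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/artisan queue:work redis --timeout=3600 --tries=1
autostart=true
autorestart=true
numprocs=4
redirect_stderr=true
stdout_logfile=/var/log/ffmpeg-worker.log

Here numprocs plays the role of $process_limit (at most four workers, hence at most four renders, run at once), and --timeout matches the job's $timeout of 3600 seconds.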

1 Answer


It turned out the code in the question actually worked. My problem was within the background process: it was having trouble with files whose names ended in a number. I'll leave the code up; it could be useful for someone.
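
The exact failure mode isn't stated beyond "file names that end in numbers", but when splicing paths into a shell command line, one precaution worth considering is escaping every substituted value first. A sketch, assuming all $map values are meant as single shell arguments:

// Hypothetical hardening inside handle(), before the substitution:
// escapeshellarg() wraps each mapped value in single quotes so the
// shell treats it as one argument (array_map preserves the keys here).
$map  = array_map('escapeshellarg', $map);
$cmds = str_replace(array_keys($map), array_values($map), $cmds);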
