I'm coding up a website back-end that will include user-uploaded video. To ensure maximum accessibility, I'm compressing the uploaded videos and re-saving them in .mp4 and .webm formats to cover all browsers (or as many as possible, anyway). To do this, I'm running avconv commands through PHP's exec() function.

I don't want to make the user wait for the script to finish before the page loads, so I'm running the code asynchronously. My code so far is below.

exec('bash -c "exec nohup setsid avconv -i ' . $tempPath . ' -c:v libx264 ' . $transpose . ' ' . $newPath . 'mp4 > /dev/null 2>/dev/null &"');
exec('bash -c "exec nohup setsid avconv -i ' . $tempPath . ' -c:v libvpx ' . $transpose . ' ' . $newPath . 'webm > /dev/null 2>/dev/null &"');

In addition to running the exec functions, I also save the video to a database and send the user an email thanking them for uploading their video.

Here's the rub: I want the server to WAIT until the video conversion is finished, and THEN add it to the database and send the user an email. Basically, the program flow would be:

1. User uploads video. Video is placed in a temp folder.
2. User is taken to a thank-you page indicating their video will be up shortly.
3. The server executes two avconv commands to convert and compress the video for web use.
4. Once BOTH conversions are finished, the video info is added to a MySQL database, an email is sent to the user, and the original uploaded video is deleted.

It may just be my ignorance of the command line (in fact it almost definitely is), but how could I 'queue up' these commands? First do both conversions, then call a PHP script to add the video to the database, then delete the original video, all while staying asynchronous with respect to the original PHP script?

EDIT: I've tried queuing them up with an '&&' operator, like below:

exec('bash -c "exec nohup setsid avconv -i ' . $tempPath . ' -c:v libx264 ' . $transpose . ' ' . $newPath . 'mp4 && avconv -i ' . $tempPath . ' -c:v libvpx ' . $transpose . ' ' . $newPath . 'webm > /dev/null 2>/dev/null &"');

However, that seems to cancel out the fact that I'm running it asynchronously, since the page now seems to wait for the command to finish.
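
From what I can tell, the redirection in that line only applies to the second avconv, so exec() keeps waiting on the first command's output. Grouping the whole chain inside one backgrounded shell, as in the untested sketch below, might fix that (postprocess.php is a hypothetical script that would handle the database insert, email and cleanup):

// Untested sketch: run the whole chain inside one backgrounded shell so the
// redirection and the trailing '&' cover every command, not just the last one.
// postprocess.php is hypothetical.
$chain = 'avconv -i ' . $tempPath . ' -c:v libx264 ' . $transpose . ' ' . $newPath . 'mp4'
       . ' && avconv -i ' . $tempPath . ' -c:v libvpx ' . $transpose . ' ' . $newPath . 'webm'
       . ' && php /path/to/postprocess.php';
exec('nohup setsid bash -c ' . escapeshellarg($chain) . ' > /dev/null 2>&1 &');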

CGriffin

3 Answers

You should start an asynchronous command-line PHP script that encodes both videos and then sends an email:

upload.php:

exec('/usr/bin/php -f encode_files.php > /dev/null 2>/dev/null &');
echo "Files will be encoded, come back later!";

encode_files.php:

exec('avconv ...'); // runs synchronously: without '> /dev/null ... &', exec() waits for avconv to finish
exec('avconv ...'); // same for the webm version

mail('user@user.com', 'Encoding complete!', 'Great!');

You could keep the 'bash -c "exec ..."' wrapper from your question, but I think there are shorter ways to call PHP scripts asynchronously: see Asynchronous shell exec in PHP. You can even pass params (like the user/video id, ...):

$cmd = 'nohup /usr/bin/php -f /path/to/php/file.php action=generate var1_id=23 var2_id=35 gen_id=535 > /path/to/log/file.log & printf "%u" $!';
$pid = shell_exec($cmd); // printf "%u" $! prints the background job's PID, which shell_exec() returns
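
For completeness, a rough sketch of what encode_files.php could look like, assuming the temp path, target path and user email are passed as arguments (the table and column names here are made up):

<?php
// encode_files.php -- rough sketch; paths, table and column names are made up
$tempPath = $argv[1];
$newPath  = $argv[2]; // e.g. '/var/www/videos/123.' (extension appended below)
$email    = $argv[3];

// Without '> /dev/null ... &', each exec() blocks until avconv has finished
exec('avconv -i ' . escapeshellarg($tempPath) . ' -c:v libx264 ' . escapeshellarg($newPath . 'mp4'));
exec('avconv -i ' . escapeshellarg($tempPath) . ' -c:v libvpx '  . escapeshellarg($newPath . 'webm'));

// Both conversions are done: record the video, notify the user, clean up
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO videos (path) VALUES (?)');
$stmt->execute(array($newPath));

mail($email, 'Encoding complete!', 'Your video is now live.');
unlink($tempPath);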
Flunch
    Brilliant! Simple and easy to execute, I like it. The other answers were correct as well, but this one works best for my specific situation, I think. Honestly, can't believe I didn't think of it myself. – CGriffin Sep 01 '15 at 16:25
  • Just what I was looking for. If the calling script is running off PHP 5.6 and the /usr/bin/php version being run is 7.4, would this still work? That's my scenario. – user1729972 Sep 07 '20 at 06:05

You can disconnect the PHP script from the client but leave it running to complete your tasks.

// Your preliminary stuff here ...
// ... then send the user elsewhere but carry on in the background
ignore_user_abort(true);
set_time_limit(0); // i.e. forever

header("Location: thankyoubutwait.php", true);
header("Connection: close", true);
header("Content-Encoding: none");
header("Content-Length: 0", true);

ob_flush(); // send PHP's output buffer ...
flush();    // ... and flush it to the client, closing the connection

session_write_close();
// more of your video stuff here, including database writes
// and clean-up bits
// (you may end up with zombie processes though, so check your logs
// or write statuses to files etc.)
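
As an aside, if your PHP runs under PHP-FPM, fastcgi_finish_request() achieves the same disconnect in a single call; a minimal sketch, assuming FPM is available:

ignore_user_abort(true);
set_time_limit(0);
header("Location: thankyoubutwait.php", true);

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // response is sent and the client released here
}

// ... avconv calls, database writes and clean-up continue below ...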
Dean Jenkins
  • seems like a good idea, but wouldn't it lock out further requests if the session was started in the script? – andrew Sep 01 '15 at 16:04
  • It seems like really clever way of going about it, but I try to avoid using header redirects unless absolutely necessary (since sometimes they don't seem to work), and, as commented above, this seems like it would lock out further requests. There must be a way to do it from the server side with command line. – CGriffin Sep 01 '15 at 16:09
  • I use the same technique for something similar. It shouldn't lock anything unless you want it to. The only thing it will 'lock' will be the files on the file system that it is converting and moving. In production code the redirect should be to an absolute URL of course ... e.g. "Location: http://yourdomain/thankyoubutwait.php" otherwise they will sometimes not work. – Dean Jenkins Sep 01 '15 at 16:50

It's easy: you just have to check that your command executed successfully, like this:

// Your code before...
$command = 'bash -c "exec nohup setsid avconv -i ' . $tempPath . ' -c:v libx264 ' . $transpose . ' ' . $newPath . 'mp4 > /dev/null 2>/dev/null &"';
exec($command, $return, $status);
if ($status == 0) {
    $command2 = 'bash -c "exec nohup setsid avconv -i ' . $tempPath . ' -c:v libvpx ' . $transpose . ' ' . $newPath . 'webm > /dev/null 2>/dev/null &"';
    exec($command2, $return2, $status2);

    if ($status2 == 0) {
        // both commands launched successfully (with the trailing '&' the
        // status reflects the launch, not the conversion itself)
        // let your user know the video processing has started
        // launch a new function to alert them
    }
}

// Kill your process at the end
die();
slig36