
My list of videos has grown too large. I run checks on it weekly to make sure the videos are not dead, and now the script stops in the browser after a period of time. I run a dedicated server, but I don't want to keep raising my execution time limit. I would like it to show all the results on one page, as I have it set to scroll to the bottom of the page (Matrix style).

I'm just trying to get some guidance on the right direction.

<?php
// Includes
include('db.php');
// Variables
$liveTotal = 0;
$deadTotal = 0;
// Query
$sql = "SELECT * FROM videos WHERE youtube <> '' ORDER BY id DESC";
$deadvideo = mysqli_query ($conn, $sql);
// Flush Buffer
ob_implicit_flush(true);
ob_end_flush();
while ($row = mysqli_fetch_assoc($deadvideo))
    {
        $video_url = @file_get_contents('https://www.youtube.com/oembed?format=json&url=http://www.youtube.com/watch?v=' . $row["youtube"]);
        if(!$video_url)
        {
            // The code below will print all of the dead videos directly to your screen if uploaded in a public directory
            $dead = '<font color="red"><strong>Dead video</strong></font> <a href="https://www.youtube.com/watch?v=' . $row["youtube"] . '">' . $row["title"] . '</a>';
            echo $dead . '<br>';
            $deadTotal++;
        }
        else
        {
            // The code below will print all of the live videos directly to your screen if uploaded in a public directory
            $live = '<font color="darkgreen"><strong>Live video</strong></font> <a href="https://www.youtube.com/watch?v=' . $row["youtube"] . '">' . $row["title"] . '</a>';
            echo $live . '<br>';
            $liveTotal++;
        }
    }
    echo '<br><strong>Scan Finished</strong><br>';
    $total = $liveTotal + $deadTotal;
    echo '<br><strong>Total Videos: ' . $total . '</strong>';
    echo '<br><strong><font color="darkgreen">Live Videos: ' . $liveTotal . '</font></strong>';
    echo '<br><strong><font color="red">Dead Videos: ' . $deadTotal . '</font></strong>';
?>
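A lighter-weight alternative to `file_get_contents()` is a curl HEAD-style request against the same oEmbed endpoint, which skips downloading the response body. This is only a sketch, not the original script: the helper names `oembedUrl()`/`videoIsAlive()` and the 5-second timeouts are my own choices, and it assumes the endpoint returns a non-200 status for unavailable videos.

```php
<?php
// Sketch: check a single video via YouTube's oEmbed endpoint with curl.
// Helper names and timeout values are assumptions, not from the original script.
function oembedUrl($youtubeId)
{
    return 'https://www.youtube.com/oembed?format=json&url=https://www.youtube.com/watch?v=' . urlencode($youtubeId);
}

function videoIsAlive($youtubeId)
{
    $ch = curl_init(oembedUrl($youtubeId));
    curl_setopt_array($ch, array(
        CURLOPT_NOBODY         => true,  // HEAD-style: only the status code is needed
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_CONNECTTIMEOUT => 5,
        CURLOPT_TIMEOUT        => 5,
    ));
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // oEmbed answers 200 for a live, embeddable video and a 4xx code otherwise.
    return $status === 200;
}
```

Inside the loop, `videoIsAlive($row["youtube"])` would then replace the `file_get_contents()` call.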
  • `file_get_contents` consumes a lot of time; if you do multiple asynchronous requests all at once, that shortens the time heavily. You can use curl for this: http://php.net/manual/en/function.curl-multi-init.php – Kazz Oct 28 '18 at 08:07
  • As mentioned, `file_get_contents()` is a bad way of doing this, perhaps https://stackoverflow.com/questions/981954/how-can-one-check-to-see-if-a-remote-file-exists-using-php will help. – Nigel Ren Oct 28 '18 at 08:10
  • Awesome! Thank you, I adopted curl into my code and now it actually errors with the max execution time of 30 seconds. I have 10,000 videos to check, so I understand my timeout, and that I could raise that limit. However, I would rather not just raise the limit over and over again. What would be the proper way to handle this issue? – vigeos.net Oct 28 '18 at 09:14
  • PHP is not ideal for this kind of work. Maybe unlimited time? Or the CLI, where execution time is unlimited by default? – Kazz Oct 28 '18 at 10:42
  • Thank you. I also have a version of this to run as a cron job with no output, but then I just have this intensive process running, which also times out. What would be the best way to break it up and maybe queue it up so only 5000 videos are checked at a time? – vigeos.net Oct 28 '18 at 19:50
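Following Kazz's `curl_multi_init()` suggestion and the batching question above, the checks can be run concurrently in fixed-size batches so no single pass has to cover all 10,000 rows at once. This is a sketch under assumptions: the batch size of 50 is arbitrary, and `$youtubeIds` stands in for the IDs fetched from the `videos` table.

```php
<?php
// Sketch: check many video IDs concurrently with curl_multi, in batches.
// $youtubeIds stands in for IDs pulled from the videos table; the batch
// size of 50 is an arbitrary choice, not from the original script.
function checkBatch(array $youtubeIds)
{
    $mh = curl_multi_init();
    $handles = array();
    foreach ($youtubeIds as $id) {
        $url = 'https://www.youtube.com/oembed?format=json&url=https://www.youtube.com/watch?v=' . urlencode($id);
        $ch = curl_init($url);
        curl_setopt_array($ch, array(
            CURLOPT_NOBODY         => true,  // status code is enough to tell live from dead
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 10,
        ));
        curl_multi_add_handle($mh, $ch);
        $handles[$id] = $ch;
    }

    // Drive all transfers until every handle in the batch has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    $alive = array();
    foreach ($handles as $id => $ch) {
        $alive[$id] = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $alive;   // youtube id => bool
}

// Split the full ID list into small batches so each burst stays bounded:
// foreach (array_chunk($allIds, 50) as $batch) {
//     $results = checkBatch($batch);
//     // echo / update the database per batch, then flush
// }
```

For the cron variant, one common pattern is to store the last processed `id` and select the next slice with `WHERE id < ? ORDER BY id DESC LIMIT 5000`, so each run resumes where the previous one stopped rather than rescanning everything.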
