22

I'm making a large request to the Brightcove servers to batch-change the metadata on my videos. It seems to have made it through only about 1,000 iterations and then stopped. Can anyone help me adjust this code to prevent a timeout from happening? It needs to make about 7,000-8,000 iterations.

<?php
include 'echove.php';

$e = new Echove(
    'xxxxx',
    'xxxxx'
);

// Read Video IDs
# Define our parameters
$params = array(
    'fields'         => 'id,referenceId'
);

# Make our API call
$videos = $e->findAll('video', $params);


//print_r($videos);
foreach ($videos as $video) {

    //print_r($video);
    $ref_id = $video->referenceId;
    $vid_id = $video->id;

    switch ($ref_id) {
        case "":
            $metaData = array(
                'id' => $vid_id,
                'referenceId' => $vid_id
            );

            # Update the video with the new metadata
            $e->update('video', $metaData);
            echo "$vid_id updated successfully!<br />";
            break;
        default:
            echo "$ref_id was not updated.<br />";
            break;
    }
}
?>

Thanks!

Dave Kiss
  • Just as a word of warning: if you ever decide to use `set_time_limit` on a browser-based app, your browser will probably time out before you receive any response. Not really an answer, more just helpful information ;) – Catharsis Oct 11 '10 at 21:50

3 Answers

42

Try the set_time_limit() function. Calling set_time_limit(0) will remove any time limits for execution of the script.
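
For example, a minimal sketch (assuming the rest of the script from the question stays the same) just adds the call before the batch loop starts:

<?php
// Lift PHP's max_execution_time for this request only; the limit
// reverts to the php.ini value on the next request.
set_time_limit(0);

include 'echove.php';

// ... the rest of the batch-update script from the question ...
?>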

Luke
bobdiaes
  • Watch out, Firefox "overwrites" this with the `network.http.keep-alive.timeout` option – Baronth Apr 03 '13 at 08:15
  • Logically, calling this with 0 should make the script time out instantly, but it is an exception, and I don't see that noted in the documentation. – Darius.V May 12 '23 at 11:25
  • But I got a timeout even when calling this function with 0. Oh, actually, maybe it was the other project that provides the API response that timed out. – Darius.V May 12 '23 at 11:37
5

Also use ignore_user_abort() to bypass browser abort. The script will keep running even if you close the browser (use with caution).
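
A minimal sketch, assuming it is combined with `set_time_limit(0)` from the answer above and placed before the batch loop:

<?php
// Keep the script running even if the user closes the browser tab
// (use with caution), and remove the execution time limit.
ignore_user_abort(true);
set_time_limit(0);

include 'echove.php';

// ... the batch-update loop from the question ...
?>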

Dharman
Billy
0

Try sending a 'Status: 102 Processing' header every now and then to prevent the browser from timing out (your best bet is about 15 to 30 seconds in between). After the request has been processed, you can send the final response.

The browser shouldn't time out any more this way.
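
Note that `header()` in PHP can only set the status line once, and not after output has started, so one way to approximate the keep-alive idea (as a commenter below also suggests) is to flush a small piece of output every so often instead. A rough sketch, assuming the same `$e` and `$videos` setup as in the question:

<?php
// Keep-alive via periodic output flushing rather than interim status lines.
// Assumes $e and $videos are set up as in the question.
$count = 0;
foreach ($videos as $video) {
    // ... update the video's metadata as in the question ...

    if (++$count % 100 === 0) {
        echo "Processed $count videos...<br />\n";
        @ob_flush();  // flush PHP's output buffer if one is active
        flush();      // push buffered output to the browser
    }
}
?>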

Robidu
  • Does this work? I'd love to use it, but as a CGI header you are not permitted to send `Status:` more than once. And how will the HTTP version of this CGI header (e.g. "HTTP/1.2 102 Processing") reach the browser? Will the first one be sent immediately? Are subsequent ones even possible/legal? – Lightness Races in Orbit Jul 08 '15 at 16:54
  • -1. Assuming the user did not close the browser and the connection was kept alive, there is no reason why the browser would be at fault for the script timing out. Instead, this is related to the script execution time limits of PHP itself, so this answer is 100% incorrect and should be removed so it doesn't confuse anyone. – hiburn8 Jan 19 '19 at 04:20
  • Some browsers do timeout connections: https://support.mozilla.org/en-US/questions/1042479 – Traumflug Aug 16 '19 at 17:55
  • If there is no response from the script, the browser will time out. I don't know if 102 is proper; I usually output some HTML, even if it just says 'processing'. Then just keep passing data every few seconds: if you're running a loop, just add an echo with blank lines to each iteration. – JpaytonWPD Mar 11 '20 at 02:34
  • For anyone interested this does NOT prevent Cloudflare's custom 100 seconds timeout – QuantumBlack Jun 23 '20 at 14:56