I have a script that takes very long to execute, so when I run it, it hits the max execution time on my web server and ends up timing out.

To illustrate, imagine I have a for loop that performs some pretty intensive manipulation one million times. How could I spread this loop's execution across several parts so that I don't hit the max execution time of my web server?

Many thanks,

silkAdmin

6 Answers

If you have an application that is going to loop a known number of times (i.e. you are sure that it's going to finish at some point), you can reset the time limit inside the loop:

foreach ($data as $row) {
    set_time_limit(10);
    // do your stuff here
}

This approach protects you from a single runaway iteration, but lets the script as a whole run undisturbed for as long as it needs.

Artem Goutsoul

The best solution is to use http://php.net/manual/en/function.set-time-limit.php to change the timeout. Otherwise, you can redirect to an updated URL just before the timeout hits:

$threshold = 10; // seconds before we bail out and redirect
$start = microtime(true);
$i = isset( $_GET['i'] ) ? (int) $_GET['i'] : 0;

for( ; $i < 10000000; $i++ )
{
    if( microtime(true) - $start > $threshold )
    {
        // Resume from the current index on the next request
        header('Location: http://www.example.com/?i='.$i);
        exit;
    }

    // Your code
}

The browser will only follow a limited number of redirects before it gives up, so you're better off using JavaScript to force a page reload.

Alex M
    Or better yet, skip the redirects and use AJAX – Leigh Jan 26 '12 at 06:57
  • Thanks amccausl, it's what I had in mind, but I was hoping to investigate a more elegant solution, though I think running such a script in PHP is just not elegant to start with.. Cheers – silkAdmin Jan 26 '12 at 07:59
I once used a technique where I split the work from one file into three parts. It was just an array of 120,000 elements requiring an intensive operation. I created a splitter script which stored the arrays in a database, 40,000 elements each. Then I created an HTML file with a redirect to the first PHP file to compute the first 40,000 elements. After computing the first 40,000 elements, I had another HTML forward to the next PHP file, and so on.

Not very elegant, but it worked :-)

Andre

If you have the right permissions on your hosting server, you could use the PHP CLI to execute a script and have it run in the background.

See Asynchronous shell exec in PHP.
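A minimal sketch of that idea, assuming a Unix-like host where shell commands are allowed; `worker.php` is a hypothetical name for the long-running script:

```php
<?php
// Launch a long-running worker in the background and return immediately.
// Redirecting output and appending '&' keeps exec() from blocking
// until the worker finishes.
$cmd = 'php ' . escapeshellarg(__DIR__ . '/worker.php')
     . ' > /dev/null 2>&1 &';
exec($cmd);

echo "Job started; check back later for results.\n";
```

The web request finishes right away, while the worker runs under the CLI, which has no `max_execution_time` by default.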

Wes Pearce

If you are running a script that needs to execute for an unknown length of time, you can remove the limit entirely:

set_time_limit(0);

Zul

If possible, you can structure the script so that it handles a portion of the wanted operations. Once it completes, say, 10%, you call the script again via AJAX to execute the next 10%. But there are circumstances where this is not an ideal solution; it really depends on what you are doing.

I used this method to create a web-based crawler which ran only on my computer, for instance. If it had to do all the operations at once, it would time out as well. So it was split into 200 "tasks", each called via AJAX once the previous one completed. It works perfectly, and it's been over a year since it started running (crawling?).
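A minimal sketch of the server side of this pattern; the chunk size, total workload, and `process_item()` helper are all hypothetical placeholders:

```php
<?php
// process_chunk.php -- handles one slice of the work per request.
// The client (e.g. an AJAX loop) calls this repeatedly with an
// increasing 'offset' until 'done' comes back true.

const CHUNK_SIZE  = 500;     // items handled per request (assumption)
const TOTAL_ITEMS = 100000;  // total workload size (assumption)

function process_item(int $i): void {
    // Placeholder for the real intensive work on item $i.
}

$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$end = min($offset + CHUNK_SIZE, TOTAL_ITEMS);

for ($i = $offset; $i < $end; $i++) {
    process_item($i);
}

header('Content-Type: application/json');
echo json_encode([
    'next' => $end,                // offset for the next request
    'done' => $end >= TOTAL_ITEMS, // true once everything is processed
]);
```

The client-side loop then re-requests `process_chunk.php?offset=<next>` until `done` is true, so no single request ever approaches the execution limit.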

Jens