1

I have a database with 40,000+ rows. I am trying to run a script for each row in the table as such:

$query = 'SELECT * FROM table';

$result = mysql_query($query);

while ($row = mysql_fetch_assoc($result)) {

//run PHP script here

}

The script may take about 10 seconds to complete per row, as it makes some curl calls that take time to load.

My question is: how can I best run the script for every row in this table without it timing out or being slow in the browser? I would also like to run the script as a cron job, but I'm not sure if this is possible. Quite new to all this, apologies if this seems trivial.

Matt

4 Answers

0

If PHP is not running in safe mode, you can use the function

set_time_limit(int $seconds)

Each call resets the timeout counter, so you could call it inside your while loop. If this doesn't work, check your server configuration; Apache, for example, has additional timeout settings.

Have a look at http://www.php.net/manual/en/function.set-time-limit.php

chresse
  • Thanks. If I set the time limit and run the script, do I need to keep my browser open until it completes? Is there a way to run the script on the server with no client interaction? – Matt Apr 13 '14 at 17:16
  • If you close your browser the script will be aborted. If you have server access you can run it from the command line (http://www.php.net/manual/en/features.commandline.php) – chresse Apr 14 '14 at 14:51
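A minimal sketch of the reset-inside-the-loop idea. The database loop is simulated with an array here, since the legacy mysql_* calls from the question need a live server; in the real script the rows would come from mysql_fetch_assoc() and the per-row work would be your curl calls.

```php
<?php
// Stand-in for the result set; in practice: while ($row = mysql_fetch_assoc($result))
$rows = [['id' => 1], ['id' => 2], ['id' => 3]];

$processed = 0;
foreach ($rows as $row) {
    set_time_limit(30);  // grants a fresh 30-second budget for this row
    // ... per-row curl work would go here ...
    $processed++;
}
echo $processed; // 3
```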
0

You can set the maximum execution time for your script with:

ini_set('max_execution_time', 'time in seconds');

This will increase the script's timeout. You could also use while ($row = mysql_fetch_array($result)) { } instead of while ($row = mysql_fetch_assoc($result)) { }

shashank
0

Run the script from the command line:

$ php myneverendingscript.php

Or use a cron job if it needs to run on a regular interval.
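For the cron-job route, a crontab entry along these lines would work; the PHP binary location, script path, and log path here are illustrative placeholders, not values from the question.

```
# m h dom mon dow  command
# Run the script every night at 02:00 via the PHP CLI, logging output
0 2 * * * /usr/bin/php /path/to/myneverendingscript.php >> /tmp/myscript.log 2>&1
```

Running through the CLI rather than the web server means the web-server timeout and the browser no longer matter.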

CodeZombie
0

If you are loading 40,000+ rows, the issues you face are likely a timeout or hitting the maximum allowed memory, depending on the configuration of your server. Here is what you can do:

if you have access to the config file php.ini and you can change properties:

if you don't have access to php.ini and you are not running in safe mode, you can try using this command in the beginning of the script:

  • ini_set('memory_limit','64M');
  • ini_set('max_execution_time ','0'); or set_time_limit(0); (0 means unlimited and 0 is the default value if you are running your script from command line)

Finally, since you are using MySQL you can consider using pagination and processing N rows at a time; this avoids reaching the memory limit. Something like:

SELECT * FROM tbl LIMIT 0,1000; # then advance the offset -> LIMIT 1000,1000, LIMIT 2000,1000, etc.
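The chunked loop can be sketched like this. fetch_chunk() is a hypothetical helper standing in for a query like "SELECT * FROM tbl LIMIT $offset, $size"; it returns a simulated table of 2,500 rows here so the sketch is self-contained.

```php
<?php
// Hypothetical helper: in the real script this would run the LIMIT query
// and return the fetched rows. Here it simulates a 2500-row table.
function fetch_chunk(int $offset, int $size): array {
    $total = 2500;
    $count = max(0, min($size, $total - $offset));
    return array_fill(0, $count, ['id' => $offset]);
}

$offset = 0;
$size   = 1000;
$seen   = 0;
while ($rows = fetch_chunk($offset, $size)) {  // empty array ends the loop
    foreach ($rows as $row) {
        // per-row work here
        $seen++;
    }
    $offset += $size;  // next page: LIMIT 1000,1000 then LIMIT 2000,1000
}
echo $seen; // 2500
```

Only one chunk of rows is held in memory at a time, which is what keeps the script under the memory limit.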

Regarding running this as a cron job: this can easily be done; take a look at PHP - good cronjob/crontab/cron tutorial or book

Mehdi Karamosly