
I'm a bit confused. I have a cron file (PHP) that runs various select/update/delete queries on a database with around 1 million rows. What is an efficient way to make sure all the queries run without failing? The server is a shared server with a 30-second max PHP execution time.

  • `set_time_limit(0);` Unless the function is disabled, then you can't. – Charlotte Dunois Aug 14 '16 at 17:14
  • Possible duplicate of [Prevent nginx 504 Gateway timeout using PHP set\_time\_limit()](http://stackoverflow.com/questions/16002268/prevent-nginx-504-gateway-timeout-using-php-set-time-limit) – Martin Tournoij Aug 14 '16 at 23:09

1 Answer


Divide the work into chunks and call the cron script with a parameter that tells it which chunk to process:

task.php?f=0
task.php?f=1
task.php?f=2
[...]

The `f` parameter sets the offset/limit for the queries, so each call stays well under the execution time limit. Write to a log file too, so you can see which chunks completed and rerun the ones that failed, as in the sketch below.
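A minimal sketch of what `task.php` could look like, assuming PDO with MySQL; the table name `items`, the `id` column, the credentials, and the chunk size are placeholders to adapt to the real schema and to whatever fits inside the 30-second limit:

```php
<?php
// task.php -- process one chunk of rows per cron call.
// Placeholder values: table `items`, column `id`, DSN/credentials, chunk size.

$chunkSize = 10000;                                  // rows handled per call
$f = isset($_GET['f']) ? (int) $_GET['f'] : 0;       // chunk index from the URL
$offset = $f * $chunkSize;

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$log = fopen(__DIR__ . '/task.log', 'a');
fwrite($log, date('c') . " chunk $f (offset $offset) started\n");

try {
    // Fetch one chunk; ORDER BY keeps the chunks stable between calls.
    $stmt = $pdo->prepare('SELECT id FROM items ORDER BY id LIMIT :limit OFFSET :offset');
    $stmt->bindValue(':limit', $chunkSize, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->execute();
    $ids = $stmt->fetchAll(PDO::FETCH_COLUMN);

    foreach ($ids as $id) {
        // ... run the real select/update/delete work for this row here ...
    }

    fwrite($log, date('c') . " chunk $f done, " . count($ids) . " rows\n");
} catch (Exception $e) {
    // Log failures so the chunk can be rerun later.
    fwrite($log, date('c') . " chunk $f FAILED: " . $e->getMessage() . "\n");
}

fclose($log);
```

On shared hosting the cron entries would typically invoke each URL with wget or curl (or `php-cli` with the chunk index as an argument), so every chunk is a separate request that starts with a fresh 30-second budget.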