This question has been asked a few times, but I have not found a satisfying answer for my specific scenario. I am executing a long-running PHP script (from a Symfony2 controller) that calls a web service, fetches data, and pushes it into a database. Sometimes, at random, I get this fatal error irrespective of what execution time limit I set:
Fatal Error: Maximum execution time of 120 seconds exceeded
Code
public function webCallAction()
{
    set_time_limit(120);
    $count = 0;
    while ($count < 100) {
        $count = $count + 1;
        try {
            $webAPIUtil = new WebAPIHelper();
            // fatal timeout error here on random occasions
            $dataList = $webAPIUtil->getData();
        } catch (\FatalErrorException $e1) {
            // Not working
        }
        foreach ($dataList as $dataItem) {
            // Insert into database using Doctrine
        }
    }
}
These seem to be temporary errors, such as cURL timeouts, coming from within third-party libraries. I would like to be able to retry the call by handling the fatal error exception. Is there a right way to do this in Symfony2?
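For illustration, this is roughly the kind of retry I have in mind. It is only a minimal sketch: it assumes that WebAPIHelper::getData() can be made to throw an ordinary, catchable exception on a timeout (for example by setting CURLOPT_TIMEOUT inside the helper so the request gives up well before PHP's own execution limit), and fetchWithRetry is just a name I made up for the example:

// Sketch only: assumes getData() throws a regular \Exception (e.g. \RuntimeException)
// when its cURL request times out, instead of blocking until PHP's
// max_execution_time fatal error is triggered.
function fetchWithRetry(WebAPIHelper $webAPIUtil, $maxAttempts = 3)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        try {
            return $webAPIUtil->getData();
        } catch (\Exception $e) {
            // Temporary failure (e.g. cURL timeout): give up only on the last attempt
            if ($attempt === $maxAttempts) {
                throw $e;
            }
            sleep(1); // brief pause before retrying
        }
    }
}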
An answer to one of those questions was:
PHP doesn't provide conventional means for catching and recovering from fatal errors. This is because processing should not typically be recovered after a fatal error. String matching an output buffer (as suggested by the original post and the technique described on PHP.net) is definitely ill-advised.
But again, this seems to be a fatal error thrown because of a connection timeout. Surely there should be some retry mechanism for when this fatal timeout error occurs.
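As far as I can tell, the closest PHP gets is detecting the fatal error at shutdown rather than catching it. A minimal sketch using standard PHP (register_shutdown_function and error_get_last); note this only reports the error after the script has died, it cannot resume the loop or retry in place:

// Sketch: detect (not recover from) a fatal error at shutdown.
// What to do with the information (logging, re-queueing the job) is up to the caller.
register_shutdown_function(function () {
    $error = error_get_last();
    if ($error !== null && $error['type'] === E_ERROR) {
        // e.g. "Maximum execution time of 120 seconds exceeded"
        error_log('Fatal error during web call: ' . $error['message']);
    }
});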
Also, would it make sense to run this code from the command line as a console command instead, as described here: http://symfony.com/doc/current/cookbook/console/console_command.html ?
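For reference, this is roughly the console command skeleton I have in mind, following the linked cookbook article. It is only a sketch: the namespace, the app:web-call command name, and the reuse of WebAPIHelper are placeholders from my own setup, not anything prescribed by Symfony:

<?php

namespace Acme\DemoBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class WebCallCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this
            ->setName('app:web-call')
            ->setDescription('Fetches data from the web service and stores it');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // CLI scripts typically run with max_execution_time = 0 already,
        // but make it explicit for a long-running import.
        set_time_limit(0);

        $webAPIUtil = new WebAPIHelper();
        $dataList   = $webAPIUtil->getData();

        $em = $this->getContainer()->get('doctrine')->getManager();
        foreach ($dataList as $dataItem) {
            // build and persist entities from $dataItem ...
        }
        $em->flush();

        $output->writeln(sprintf('Imported %d items', count($dataList)));
    }
}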