
I have a PHP script that runs once a day, and it takes a good 30 minutes to complete (I think). Everything it does is a safe and secure operation. I keep getting a 500 error after about 10-15 minutes, but I can't see anything in the logs, so I'm a bit confused.

So far the things I set up as "unlimited" are:

  • max_execution_time
  • max_input_time
  • default_socket_timeout

I've also set these to obscenely high numbers, just for this section (the folder in which the script runs):

  • memory_limit
  • post_max_size
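
For reference, a sketch of what those overrides might look like in a per-directory php.ini (the values are illustrative placeholders, not recommendations):

```ini
; per-directory overrides for the import script (illustrative values)
max_execution_time     = 0      ; 0 = no limit
max_input_time         = -1     ; -1 = no limit
default_socket_timeout = -1     ; -1 = no timeout on socket reads
memory_limit           = 2048M
post_max_size          = 512M
```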

The script talks to a SOAP-type API: it imports thousands of rows of data from a third-party URL, puts them into a local MySQL table, and then downloads the images attached to each row, so the amount of data is significant.

I'm trying to figure out what other PHP settings I'm missing in order to get this to run through to completion. Other PHP settings I have:

  • display_errors = On
  • log_errors = On
  • error_reporting = E_ALL & ~E_NOTICE & ~E_WARNING
  • error_log = "error_log"
jeffkee
  • I had a problem similar to this for a while. I tried to execute in the browser (because the ability to execute from the command line wasn't installed), but once it was set up I had no problems. From your question I am not sure where you are running it - browser or command line - so I thought I'd leave a comment. – martincarlin87 Nov 10 '11 at 10:02
  • Right now I don't even have it as a cron job yet. I'm testing it through the browser. – jeffkee Nov 10 '11 at 10:02

4 Answers


Try using the PHP Command-Line Interface (php-cli) for lengthy tasks like this. On the command line, execution time is unlimited unless you set a limit or terminate the script yourself. You can also schedule it with a cron job.
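
A minimal sketch of the cron approach (the binary path, script path, and schedule are placeholders):

```
# crontab entry: run the import once a day at 03:00 and capture all output
0 3 * * * /usr/bin/php /path/to/import.php >> /var/log/import.log 2>&1
```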

Raptor

Run it from the command line with PHP (e.g. php yourscript.php) and this error shouldn't occur. Also, it's not a good idea to use set_time_limit(0); at most, use set_time_limit(86400). You can set up a cron job to do this once per day. Just make sure that all file paths in the script are absolute, not relative, so it doesn't get confused.

Compiling the script might also help. HipHop is a great PHP compiler; your script will run faster, use less memory, and can use as many resources as it likes. HipHop is just very difficult to install.

Alasdair
  • "it is a SOAP type API that imports thousands of rows of data". I would expect that the network is the bottleneck in this case, so hiphop would offer very little benefit. – bumperbox Nov 10 '11 at 10:22

There are three timeouts:

  1. PHP Level: set_time_limit
  2. Apache Level: Timeout
  3. MySQL Level: MySQL options

In your case it seems that Apache reached its timeout. In such a situation it is better to use the PHP CLI. But if you really need to do this operation in real time, you can make use of Gearman, through which you can achieve true parallelism in PHP.
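For reference, the Apache-level limit is the Timeout directive in httpd.conf (the value here is illustrative):

```apache
# httpd.conf - seconds Apache waits on a request before aborting it
Timeout 3600
```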

If you need a simple solution that triggers your script from a normal HTTP request (Browser -> Apache), you can run your back-end (CLI) script as a shell command from PHP, but 'asynchronously'. More info can be found in Asynchronous shell exec in PHP.
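
A minimal sketch of that 'asynchronous' trigger, assuming a Unix-like host (the script and log paths are placeholders):

```php
<?php
// Launch the long-running CLI script in the background; nohup and the
// trailing & detach it so this web request can return immediately.
exec('nohup php /path/to/import.php > /tmp/import.log 2>&1 &');
echo "Import started; check the log for progress.";
```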

Laith Shadeed
  • The apache option - I can override it? Can I put it under .htaccess? Or should I do it in WHM? – jeffkee Nov 10 '11 at 10:42
  • When I put Timeout 3600 in the .htaccess, it caused another internal server error :( I had this override on the WHM section but I wonder if I can limit it to only this domain of mine? I run a dedicated server. – jeffkee Nov 10 '11 at 10:49
  • In fact you can do that, but I don't recommend it. You don't want long-running scripts eating your server memory, and many Apache processes waiting on such long scripts. The best way to go is to use the PHP CLI. However, if you want to trigger the script from a normal HTTP request (Browser/Apache), you can, by running an 'asynchronous-like' shell exec from PHP. More info can be found in http://stackoverflow.com/questions/222414/asynchronous-shell-exec-in-php – Laith Shadeed Nov 10 '11 at 10:58
  • I think the issue was the timeout - I had my server admins change the Timeout directive in httpd.conf on that account and it was fine. There are no other functions that run on there for too long (other than my own stuff accessed by a secure hashtag), so the timeout is not an issue for this one. – jeffkee Nov 14 '11 at 23:00

If the execution time is the problem, then maybe you should raise the maximum execution time using the set_time_limit function inside the script:

set_time_limit(0);

I would also invoke the script with php directly on the command line, instead of through Apache. In addition, print out some status messages and pipe them into a log.

I suspect that your actual problem is that the script chokes on bad data somewhere along the line.
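
A sketch of the kind of per-row status logging that helps pin down where it chokes (the row structure and log path are assumptions):

```php
<?php
// Log each row's ID before processing it, so the last line in the log
// points at the record the script choked on.
foreach ($rows as $i => $row) {
    error_log(date('c') . " row $i id={$row['id']}\n", 3, '/tmp/import.log');
    // ... insert the row and download its images here ...
}
```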

Gustav Bertram
  • I already have that command set up as well. If it's an error as you describe, wouldn't it show up when I run it through the browser? display_errors = On, log_errors = On, error_reporting = E_ALL & ~E_NOTICE & ~E_WARNING, error_log = "error_log" – jeffkee Nov 10 '11 at 10:07
  • Obviously it's not showing up, or you would have seen it. It may be a logic error, which is why you should print regular status messages. Then run through the command line, and see what you get. – Gustav Bertram Nov 10 '11 at 10:13
  • I'll check the logs again... each data entry does log the ID # etc. but last I checked they just stop as if it's over. – jeffkee Nov 10 '11 at 10:27
  • So take a look at the entry just before and just after the point where it stops, and see what those records do to the code. – Gustav Bertram Nov 10 '11 at 10:31