
The production server has CentOS 6.4 with Plesk 11.0.9. It is a dedicated server.

Server configuration:
* max_execution_time is set to 500 in Plesk and shows up correctly in phpinfo

Problems:
* I get a 500 Internal Server Error after 30 seconds of running any script

Is there any way to work around this problem?

LE (later edit):
* the script downloads a file from a remote location and streams it to the user's browser. The operation might take even hours to complete (for clients with very slow internet connections). I really need this. And one more thing: the user must not see the source URL. A simplified sketch of the script is shown below.
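
Roughly, the relevant part looks like this (simplified sketch; the URL and file name are placeholders, the real script is longer):

```php
<?php
// Stream a remote file to the browser in 8 KB chunks without
// exposing the source URL to the client.
$remoteUrl = 'http://example.com/path/to/file.zip'; // placeholder
$fileName  = 'file.zip';                            // name shown to the user

set_time_limit(0); // lift the PHP limit; does not help if the web server kills the request first

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $fileName . '"');

$src = fopen($remoteUrl, 'rb');
if ($src === false) {
    header('HTTP/1.1 502 Bad Gateway');
    exit('Could not open the source file');
}

while (!feof($src)) {
    echo fread($src, 8192); // forward one 8192-byte chunk
    flush();                // push it to the client immediately
}

fclose($src);
```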

machineaddict
  • Check your error logs to see exactly why it's failing – jszobody Nov 22 '13 at 13:34
  • If you're querying a database, it could as well be the database request timeout. And as a side note, I know this is very tempting (especially when uploading files, for example), but if you've got a web script running for more than 30 seconds, then it should probably be either optimized or replaced by a non-web script... Or at least the timeout extension should be applied only to this script, not to your whole site. – Laurent S. Nov 22 '13 at 13:39
  • @Bartdude: I have edited the question to let you know what the script does – machineaddict Nov 22 '13 at 13:42
  • Depending on how you're downloading the file, you could be hitting your memory limit. – keithhatfield Nov 22 '13 at 13:53
  • Well, as I said, you're probably not doing it as it should be done. You definitely should delegate the download task to a non-web script, and warn the user (through mail or whatever) when the file is ready for him to download. Especially if it can take hours... – Laurent S. Nov 22 '13 at 13:54
  • @Bartdude: If the user downloads the file through my script, it will take x seconds to download. But if I download the file first and then inform the user, it will take x seconds for the first request plus y seconds to actually download it. That is unacceptable. – machineaddict Nov 22 '13 at 14:00
  • Then supposing you can do it (meaning this remote location is accessible from the server side through a protocol allowing remote opening), you would need to open the file server-side in order to read it and send it back to the user, which will cause memory problems as suggested by dleiftah in the previous comment, on top of the timeout problems you've got. Extending the request timeout to several hours is in any case a very, very bad idea. – Laurent S. Nov 22 '13 at 14:09
  • @dleiftah and @Bartdude: the script basically `fopen`s the url, reads chunks of 8192 bytes and echoes them out. Take a look [here](http://pastebin.com/16m0DWJg) – machineaddict Nov 22 '13 at 14:38
  • @dleiftah: how can I check how much memory it uses at certain point? – machineaddict Nov 22 '13 at 15:23
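
(Regarding the memory question above: PHP can report its own allocation at any point with `memory_get_usage()` and `memory_get_peak_usage()`; a minimal, purely illustrative check:)

```php
<?php
// Print current and peak memory usage at this point in the script.
echo 'current: ' . memory_get_usage() . " bytes\n";
echo 'peak:    ' . memory_get_peak_usage() . " bytes\n";
```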

1 Answer


I think you run FastCGI. The CGI process has a maximum execution time; in FastCGI it is the idle-timeout. After that you get a 500 error because the backend is not responding within that time.

I don't know which module you use, but normally you can set the idle-timeout in your configuration.
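
For example, with mod_fastcgi or mod_fcgid it would look roughly like this (directive names and the right value depend on which module Plesk configured; treat this as a sketch):

```apache
# mod_fastcgi: raise the idle timeout (in seconds)
FastCgiConfig -idle-timeout 600

# mod_fcgid equivalents
FcgidIOTimeout   600
FcgidBusyTimeout 600
```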

mod_fastcgi idle timeout not work

René Höhle
  • The server was running with `Fast CGI`. I switched it to `Apache` and now it seems to work. A simple script like `sleep(70); phpinfo();` now returns the output. Thanks! – machineaddict Nov 22 '13 at 15:20