1

Part of my application copies files to a network share. Periodically, the network share times out when PHP does its copy() bit and the application dies with a fatal error (exceeded maximum execution time).

Is there a way to have it "give up" on the copy BEFORE it hits the maximum execution time so that it can throw an exception or give a nicer message to the user (and not clutter my error logs!)?

gazareth
  • 1,135
  • 10
  • 26
  • Here's an answer on how to catch a fatal error, if you're interested: http://stackoverflow.com/questions/277224/how-do-i-catch-a-php-fatal-error – castis Jan 07 '15 at 16:19
  • possible duplicate of [How to catch the fatal error: Maximum execution time of 30 seconds exceeded in PHP](http://stackoverflow.com/questions/6861033/how-to-catch-the-fatal-error-maximum-execution-time-of-30-seconds-exceeded-in-p) – Jonan Jan 07 '15 at 16:20
  • 1
    I presume you want to gracefully abort the copy and your script prior to running out of time completely, in which case you might set an [`alarm`](http://php.net/manual/en/function.pcntl-alarm.php). – bishop Jan 07 '15 at 16:20
  • @Jonan - neither of those answers my specific issue. I do not want to "catch" the error (I know you cannot do this technically). Is there any alternative to copy(), or some parameter for it that I am not aware of, which allows you to set a timeout for that function separate from the general PHP max execution time? – gazareth Jan 07 '15 at 16:24
  • alarm() would work, but can be flaky / impossible depending on the exact thread model your server uses. I would prefer the async shell command interface such as popen() with stream_select()... – BadZen Jan 07 '15 at 16:24
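
A rough sketch of the shell-command idea BadZen mentions, using proc_open() and polling rather than stream_select(); the `cp` command, the function name, the paths and the 10-second limit are all illustrative assumptions, not something from the thread:

```php
<?php
// Sketch only: run the copy as an external command so it can be killed
// if it takes too long, instead of letting copy() hit PHP's time limit.
function copyWithTimeout($source, $dest, $timeoutSeconds = 10)
{
    $cmd = 'cp ' . escapeshellarg($source) . ' ' . escapeshellarg($dest) . ' 2>&1';

    $process = proc_open($cmd, [1 => ['pipe', 'w']], $pipes);
    if (!is_resource($process)) {
        throw new RuntimeException('Could not start copy command');
    }

    $deadline = time() + $timeoutSeconds;
    while (true) {
        $status = proc_get_status($process);
        if (!$status['running']) {
            fclose($pipes[1]);
            proc_close($process);
            return $status['exitcode'] === 0;   // true if the copy succeeded
        }
        if (time() >= $deadline) {
            proc_terminate($process);           // give up before PHP's own limit kicks in
            fclose($pipes[1]);
            proc_close($process);
            throw new RuntimeException("Copy timed out after {$timeoutSeconds}s");
        }
        usleep(100000);                         // poll every 100 ms
    }
}
```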

2 Answers

1

Instead of working around the timeout, perhaps figure out why the shares are timing out (or whether it is the script itself that is timing out), rather than writing files that will end up corrupt, waste space and potentially cause problems later. That said, once you have figured it out, it might be useful to use a better tool than PHP's copy(), such as executing rsync, xcopy, or robocopy.
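
As a hedged illustration of that idea (not part of the original answer), the copy could be handed to rsync and capped with the coreutils `timeout` command so a stalled mount cannot run past the limit; the paths and the 10-second value below are placeholders:

```php
<?php
// Illustrative only: shell out to rsync, capped by coreutils' `timeout`,
// so a hung network share never reaches PHP's max execution time.
$source = '/var/app/export/report.csv';      // assumed source path
$dest   = '/mnt/network-share/reports/';     // assumed mount point

exec(
    'timeout 10 rsync ' . escapeshellarg($source) . ' ' . escapeshellarg($dest) . ' 2>&1',
    $output,
    $exitCode
);

if ($exitCode !== 0) {
    // Exit code 124 means `timeout` killed rsync; anything else is an rsync failure.
    throw new RuntimeException('Copy to network share failed: ' . implode("\n", $output));
}
```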

Rob W
  • 9,134
  • 1
  • 30
  • 50
  • Or heck, increase the script's execution time? [`set_time_limit`](http://php.net/manual/en/function.set-time-limit.php) – Mr. Llama Jan 07 '15 at 16:47
  • I like your thinking Mr. Llama... Or even disable the timeout altogether, since you'll never know what the file size is! – Rob W Jan 07 '15 at 16:48
  • This is a small internal app and the files are always small. Our network has issues and trying to resolve these would be a huge waste of time (because of how complex it is and how far removed I am from the systems and networks teams). Increasing the time limit has no effect (since it's timing out trying to "connect" rather than on the copy itself) – gazareth Jan 07 '15 at 17:45
  • I'd suggest something in my answer then, instead of relying on something as simple as PHP -- tools such as robocopy can keep retrying a file in a queue, which can be triggered from PHP. – Rob W Jan 09 '15 at 19:23
0

You can fork, and execute the copy in the child process. In the main process, keep checking the time and kill the child process if the limit is exceeded.
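
A minimal sketch of this fork-and-watch idea, assuming the pcntl and posix extensions are available (CLI SAPI); the file names and the 10-second limit are placeholders, not from the original answer:

```php
<?php
// Sketch only: copy() runs in a child process; the parent enforces a deadline.
$source = '/var/app/export/report.csv';              // assumed source
$dest   = '/mnt/network-share/reports/report.csv';   // assumed destination
$limit  = 10;                                         // seconds

$pid = pcntl_fork();
if ($pid === -1) {
    throw new RuntimeException('Could not fork');
}

if ($pid === 0) {
    // Child: do the (possibly hanging) copy, then exit with a status code.
    exit(copy($source, $dest) ? 0 : 1);
}

// Parent: wait for the child, but no longer than $limit seconds.
$deadline = time() + $limit;
while (true) {
    if (pcntl_waitpid($pid, $status, WNOHANG) === $pid) {
        if (pcntl_wexitstatus($status) !== 0) {
            throw new RuntimeException('Copy failed');
        }
        break;                              // child finished in time
    }
    if (time() >= $deadline) {
        posix_kill($pid, SIGKILL);          // give up on the copy
        pcntl_waitpid($pid, $status);       // reap the killed child
        throw new RuntimeException("Copy timed out after {$limit} seconds");
    }
    usleep(100000);                         // poll every 100 ms
}
```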

gontrollez
  • 6,372
  • 2
  • 28
  • 36