13

I have a backup script which backs up all files for a website to a zip file (using a script similar to the answer to this question). However, for large sites the script times out before it can complete.

Is there any way I can extend the length of time available for the script to run? The websites run on shared Windows servers, so I don't have access to the php.ini file.

whostolemyhat
  • Probably the best option here would be to use [shell_exec()](http://lv.php.net/manual/en/function.shell-exec.php) to run it in the background, ignoring the timeout settings. – arma May 10 '11 at 08:35
  • Please don't forget to accept an answer if it resolved your issue. – Deepu S Nath May 19 '11 at 12:09
  • 1
    Running PHP on windows? That sounds like a bad idea. Then again, so was running IE on a mac. – Xeoncross May 20 '11 at 16:03
  • Do a `phpinfo()` on your server. See if there is a value in the section *Scan this dir for additional .ini files*. If so, let me know. – Salman A May 21 '11 at 09:55
  • @Salman 'Scan this dir' is set to 'none' – whostolemyhat May 23 '11 at 08:45
  • @What: hmmm, I am out of answers then. One suggestion for you is to zip the directory using the *fast* algorithm instead of *best*. And do not compress already-compressed files such as *.jpg, .gif, .zip*; just store them. – Salman A May 23 '11 at 09:17
  • PHP on Windows sucks, although the newer PHP 5.3 is far better. There is a huge memory loss when running PHP on Windows. I have seen a server with the same issue; we recommended Linux and moved to it successfully. You can see in Task Manager that the number of php.exe processes has increased, which stops any of the scripts from running. So it's better to move away from Windows. – Hari K T May 23 '11 at 09:20

7 Answers

14

If you are in a shared server environment, and you don’t have access to the php.ini file, or you want to set php parameters on a per-site basis, you can use the .htaccess file (when running on an Apache webserver).

For instance, in order to change the max_execution_time value, all you need to do is edit .htaccess (located in the root of your website, usually accessible by FTP), and add this line:

php_value max_execution_time 300

where 300 is the maximum execution time, in seconds, that you wish to allow a PHP script.

There is also another way: use the ini_set function in the PHP file itself.

E.g. to set the execution time to 5 minutes, you can use

ini_set('max_execution_time', 300); // 300 seconds = 5 minutes

Please let me know if you need any more clarification.

Deepu S Nath
  • Unfortunately, these are IIS servers so htaccess files won't work :( – whostolemyhat May 10 '11 at 09:01
  • 7
    Okay, no problem. Thank you for the response. Hope you have already resolved the issue with the suggested set_time_limit(0) method; if not, you can still try the ini_set() method. – Deepu S Nath May 10 '11 at 09:08
  • 1
    Even if it were possible to use .htaccess, I would not advise using it to set configuration for a single script. The limit needs to be changed only for the backup script rather than for the whole site. – binaryLV May 11 '11 at 07:10
  • still no joy with either the ini_set or set_time_limit methods. – whostolemyhat May 20 '11 at 18:06
12

set_time_limit() comes to mind, but it may still be restricted by the php.ini settings:

set_time_limit(0);

http://php.net/manual/en/function.set-time-limit.php

bumperbox
10

Simply put: don't make an HTTP request to start the PHP script. The limits you're running into exist because you're using an HTTP request, which means the request can time out. A better solution would be to implement this using a "cronjob", or what Microsoft calls "Scheduled Tasks". Most hosting providers will allow you to run such a task at set times. By calling the script from the command line (see the sketch below), you no longer have to worry about the time-outs, but you're still at risk of running into memory issues.
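
Just to illustrate, here is a minimal sketch of what such a command-line backup script could look like. The paths, the ZipArchive-based zipping, and the schtasks command are assumptions on my part, so adapt them to your own setup:

<?php
// backup.php - meant to be run from the command line or a Scheduled Task, not via HTTP.
// When PHP runs from the CLI, max_execution_time defaults to 0 (unlimited),
// but setting it explicitly does no harm.
set_time_limit(0);

$source = 'C:\\inetpub\\wwwroot\\mysite';                // folder to back up (example path)
$target = 'C:\\backups\\site-' . date('Y-m-d') . '.zip'; // where to write the archive

$zip = new ZipArchive();
if ($zip->open($target, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    fwrite(STDERR, "Cannot create $target\n");
    exit(1);
}

// Walk the site directory recursively and add every file to the archive.
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($source) + 1));
}
$zip->close();

// Example of registering it as a daily Scheduled Task from a Windows command prompt:
// schtasks /create /tn "SiteBackup" /tr "C:\php\php.exe C:\scripts\backup.php" /sc daily /st 03:00
?>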

If you have a decent hosting provider though, why doesn't it provide daily backups to start with? :)

Berry Langerak
  • the hosting provider takes daily backups; this script is for clients to take their own backup copies without having to fiddle with FTP – whostolemyhat May 23 '11 at 15:14
  • 1
    +1 best answer here. Do not create a web page to do backups. use cron or scheduled tasks. – Byron Whitlock May 23 '11 at 19:30
  • @What; still, using a browser to request a back-up script is usually a very bad idea, it's too error prone. Call the hosting provider and ask them for possibilities on cronjobs or scheduled tasks. – Berry Langerak May 24 '11 at 07:18
6

You can use the following at the start of your script:

<?php
if (!ini_get('safe_mode')) {      // set_time_limit() has no effect when safe mode is on
    set_time_limit(0);            // 0 seconds means no limit, i.e. unlimited execution time
}
?>

At the end of the script, call the flush() function to tell PHP to send out whatever output it has generated.
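
As a rough illustration of where the two pieces go (the echo call is just a placeholder for whatever output your backup script produces):

<?php
if (!ini_get('safe_mode')) {
    set_time_limit(0);
}

// ... perform the backup, optionally echoing progress messages ...
echo "Backup finished.";

// Push any buffered output to the browser before the script ends.
flush();
?>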

Hope this solves your problem.

Tareq
3

Is the script giving the "Maximum execution time of xx seconds exceeded" error message, or is it displaying a blank page? If it's the latter, ignore_user_abort might be what you're looking for. It tells PHP not to stop executing the script if communication with the browser is lost, which may protect you from other timeout mechanisms involved in the connection.

Basically, I would do this at the beginning of your script:

set_time_limit(0);
ignore_user_abort(true);

This said, as advised by Berry Langerak, you shouldn't be using an HTTP call to run your backup. A cronjob is what you should be using. Along with a set_time_limit(0), it can run forever.

In shared hosting environments, where a change to the max_execution_time directive might be disallowed and where you probably don't have access to any kind of command line, I'm afraid there is no simple (and clean) solution to your problem; the simplest option is very often to use the backup facility provided by the host, if any.

BenMorel
2

Try the function:

set_time_limit(300);

On Windows, there is a slight possibility that your web host allows you to override settings by uploading a php.ini file to the root directory of your web server. If so, upload a php.ini file containing:

max_execution_time = 300

To check if the settings work, do a phpinfo() and check the Local Value for max_execution_time.

Salman A
1

Option 1: Ask the hosting company to place the backups somewhere accessible to PHP, so the PHP file can redirect to the backup.

Option 2: Split the backup script into multiple parts, perhaps using some AJAX to call the script a few times in a row; give the user a nice progress bar, combine the results of the script calls into a zip with PHP, and offer that as a download (a rough sketch of the idea follows).
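
To make the idea concrete, here is a minimal sketch of such a "chunked" backup endpoint. The file names, the batch size, the state file, and the JSON progress format are all invented for illustration and assume ZipArchive is available; treat it as a starting point, not a drop-in solution:

<?php
// backup_step.php - adds a small batch of files to the zip on each request,
// so no single HTTP request runs long enough to hit the execution time limit.
// The browser calls this repeatedly (e.g. via AJAX) until "done" is true.

$source    = __DIR__ . '/site';          // folder to back up (example)
$zipFile   = __DIR__ . '/backup.zip';    // archive built up across requests
$stateFile = __DIR__ . '/backup.state';  // remembers how far we got last time
$batchSize = 50;                         // files added per request

// Re-scan the file list on every request; the order must stay stable between calls.
$files = array();
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS)
);
foreach ($it as $f) {
    $files[] = $f->getPathname();
}
sort($files);

$offset = is_file($stateFile) ? (int) file_get_contents($stateFile) : 0;

$zip = new ZipArchive();
$flags = ($offset === 0) ? ZipArchive::CREATE | ZipArchive::OVERWRITE : 0;
$zip->open($zipFile, $flags);

// Add the next batch of files, stored with paths relative to $source.
$slice = array_slice($files, $offset, $batchSize);
foreach ($slice as $path) {
    $zip->addFile($path, substr($path, strlen($source) + 1));
}
$zip->close();

// Persist progress and report it back as JSON for the progress bar.
$offset += count($slice);
file_put_contents($stateFile, $offset);

header('Content-Type: application/json');
echo json_encode(array(
    'done'     => $offset >= count($files),
    'progress' => count($files) ? round($offset / count($files) * 100) : 100,
));
?>

Once "done" comes back true, the client can simply offer backup.zip for download; delete the state file before starting a new backup run.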

gnur