
I am using PHP's shell_exec to run a command from jQuery, but it keeps timing out.

My jQuery is as follows:

var send = {
    url: $(this).data('url')
}

$.ajax({
    type: "POST",
    url: "<?php echo base_url(); ?>index.php/upload",
    data: send,
    //dataType: "json",
    //timeout: 8000,
    beforeSend: function() {

    },
    success: function(response) {
        console.log(response);
    },
    error: function () {
        alert('error');
    }
});

This then calls a function which runs shell_exec:

function upload(){
    // Run wget on the submitted URL and return its output to the browser
    $output = shell_exec("wget {$_POST['url']} 2>&1");
    echo "<pre>" . $output . "</pre>";
}

OK, this works fine if the URL that is posted for wget to fetch points at a small file, but if wget is getting a file of, say, over 20 MB, I get a 500 (Internal Server Error).

Things I have checked so far: all my PHP settings (file upload limit, max file size, memory limit, etc.) are set to the maximum.

The thing I don't get is that this worked fine locally, but on my hosting I get this error. I have contacted my host, Media Temple, and they have said it is out of their scope of support.

Any suggestions on how to debug this further?


1 Answer

What is happening is that some kind of reverse proxy (for example an Apache balancer) is timing out, whether the PHP script times out or not. You can verify this by creating a script such as

<?php sleep(60); ?>

and experimenting with different delays.
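For instance, a throwaway test page along these lines (the `t` query parameter is just an assumed convenience, not part of any existing code) lets you try several delays without editing the file:

<?php
// sleep-test.php (hypothetical) -- call as sleep-test.php?t=30
$delay = isset($_GET['t']) ? (int) $_GET['t'] : 60;
sleep($delay);
echo "Survived a {$delay} second delay without the connection being dropped.";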

Anyway, upping the time limit would only appear to be a solution, since some browsers (e.g. IE8+) will time out no matter what.
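For reference, lifting PHP's own limit is a one-liner (a sketch only; it does nothing about a proxy or browser timeout, which is why it only appears to help):

<?php
// Lift PHP's execution time limit for this request only.
// A fronting proxy or the browser can still drop the connection first.
set_time_limit(0);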

Usually this is solved by detaching the wget process and having a PHP page check on the process status with a refresh of, say, 5-30 seconds. But many providers take an understandably dim view of detached processes, since these might hog the memory of a system serving dozens of websites (= paying customers!). So you'd have to check with your provider whether that solution is acceptable.
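As a rough sketch of that approach (the file names and the /tmp log/marker convention are assumptions, not a fixed recipe), the download could be started in a detached background subshell:

<?php
// start-download.php (hypothetical) -- kick off wget detached from the request
$url      = escapeshellarg($_POST['url']);
$logFile  = '/tmp/wget.log';
$doneFile = '/tmp/wget.done';
@unlink($doneFile);
// Background subshell: run wget, then drop a marker file when it finishes.
// Redirecting all output lets shell_exec() return immediately.
shell_exec("( wget {$url} > {$logFile} 2>&1 ; touch {$doneFile} ) > /dev/null 2>&1 &");
echo "Download started";

The page the user keeps refreshing (or polling via AJAX) then only has to look for the marker file:

<?php
// check-download.php (hypothetical) -- reload this every 5-30 seconds
if (file_exists('/tmp/wget.done')) {
    echo "<pre>" . htmlspecialchars(file_get_contents('/tmp/wget.log')) . "</pre>";
} else {
    echo "Still downloading...";
}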

If it is, you can check out "Asynchronous shell exec in PHP".

  • Hi LSerni, thanks for the reply. I am on a dedicated server, so I shouldn't be hogging memory from anyone but myself (correct me if I'm wrong though). I'll try your method above and post back. – user1503606 Sep 21 '12 at 15:41
  • OK, great, narrowing it down: I tried sleep(10), (20), (30), (40) and (50), and all returned fine, but as soon as I went to sleep(60) I got the 500 Internal Server Error... – user1503606 Sep 21 '12 at 15:46
  • You can further narrow checks by setting PHP timeout to, say, 5 seconds, and then running a `sleep(10)` (or better a shell_exec of sleep 10, for I'm not sure whether PHP sleep counts as execution time). That will cause a "true" PHP timeout, and a fatal error, but not a `500 Internal Server Error`. If it does, then you need to dig further; but if it doesn't, and you get a page such as this http://drupal.org/files/issues/ScreenShot003_0.png , then you can be sure it's not a PHP problem and you can get back to your provider asking for an explanation. – LSerni Sep 21 '12 at 20:53
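A minimal sketch of that last check (hypothetical file name; a busy loop stands in for the sleep, since time spent in sleep() or in an external process may not be counted against max_execution_time on every platform):

<?php
// timeout-test.php (hypothetical) -- force a genuine PHP timeout
set_time_limit(5); // PHP limit: 5 seconds
$start = microtime(true);
while (microtime(true) - $start < 10) {
    // burn CPU for ~10 seconds so the 5-second limit is definitely exceeded
}
echo "If you see this, PHP did not time out.";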