
I have a simple JavaScript function like so:

$(document).ready(function(){
    var request = $.ajax({
        url: "read_images.php",
        type: "GET",
        dataType: "html"
    });
    request.done(function(msg) {
        $("#mybox").html(msg);
        document.getElementById('message').innerHTML = '';
    });
    request.fail(function(jqXHR, textStatus) {
        alert( "Request failed: " + textStatus );
    });
});

The PHP script it calls loops over the contents of a folder, runs some checks, and returns a response. The script is as follows:

//Get all Images from server, store in variable
$server_images = scandir('../images/original');

//Remove first 3 elements, which are not correct
array_shift($server_images);
array_shift($server_images);
array_shift($server_images);

$j = 0;
for ($i = 0; $i < count($server_images) && $i < 3000; $i++) {
    $server_image = $server_images[$i];

    //Make sure that the server image does not have a php extension
    if (!preg_match('/\.php$/', $server_image)) {

        //Select the name from the table where the image name is equal to the server image name
        $query = "SELECT `name`
                FROM `images`
                WHERE `name` = '$server_image'";
        $mro_images = $db->query($query);
        $mro_images_row = $mro_images->fetch();
        $mro_image = $mro_images_row['name'];

        //If no results are found
        if (empty($mro_image)) {
            $images[$j] = $server_image;
            $j++;
        }
    }
}

It works if the loop is restricted to 2000 iterations but if I try to do e.g. 3000 iterations the result is:

HTTP/1.1 500 Internal Server Error 31234ms

I've tried increasing the PHP execution limit, but this had no effect; after contacting my host, I was told:

Unfortunately in our environment we don't have any way to increase the loadbalancer timeout beyond 30 seconds

Therefore: How can I restructure this code to avoid hitting the execution time limit?

AD7six
Tigerman55
    IF you run the PHP page in your browser I imagine you still get an error. Therefore, I'd suspect, it's the PHP not the javascript... – Lee Taylor May 21 '14 at 13:11
    `HTTP/1.1 500 Internal Server Error 31234ms` sounds suspiciously like your php script is timing out as the default timeout for a php script is 30s. Do you really want a php script running for that long? Sounds like whatever it is doing should be done differently. – AD7six May 21 '14 at 13:12
  • @AD7six I think you are correct, but how do I change the time out to a longer period of time? – Tigerman55 May 21 '14 at 13:13
  • Please describe what it is you're doing in that 30+ seconds. There could well be a better way to achieve this. – Lee Taylor May 21 '14 at 13:14
  • If it's not a database timeout, you can increase your script timeout for php with: ini_set('max_execution_time', 500); //500 seconds – nixkuroi May 21 '14 at 13:15
  • I hate people that just refuse to give you the answer. Just let him learn his own way, don't be a dick. – I wrestled a bear once. May 21 '14 at 13:19
  • @nixkuroi My PHP script still crashed after thirty seconds even after pasting that at the top of my PHP page. Do I need to put it in a different location? – Tigerman55 May 21 '14 at 13:21
  • @AD7six I am trying to loop through several thousand images that exist in FTP and check if they exist in the database - if they don't it generates their location and allows the administrator to delete them. Is there a better way to do it than the code printed above? – Tigerman55 May 21 '14 at 13:23
  • @Tigerman55 you need to post your PHP code. The JS you posted is irrelevant to the question. JS will never cause a 500 as that is a server error and JS is a client language. – I wrestled a bear once. May 21 '14 at 13:23
  • @LeeTaylor I am selecting an image name from a database table within phpMyAdmin with a query within the for loop. I can post the code if you'd like. – Tigerman55 May 21 '14 at 13:24
  • @Tigerman55 Without giving or seeing any details I can say categorically: yes. For example logically move the loop to javascript - you ask for the results for 100 images, and when you get the answer you ask for the results of the next 100 images. If you choose to fix the root problem (great!) you need to show your php code in the question - be careful to not ask a question that is too broad (I.e. don't _just_ put your comment as a question ala "how do I make this better?"). – AD7six May 21 '14 at 13:25
  • @Adelphia I posted the code within the loop, is that what you meant? – Tigerman55 May 21 '14 at 13:29
  • @Tigerman55 - That doesn't really explain why it's taking 30+ seconds, does it? – Lee Taylor May 21 '14 at 13:30
  • Whoops, I must have missed that part. – I wrestled a bear once. May 21 '14 at 13:32
  • @Adelphia I added it after you asked me to, so you didn't actually miss it. – Tigerman55 May 21 '14 at 13:34
  • @AD7six I think you have the correct logic I just do not have the knowledge of JavaScript to be able to do that. I may try researching it at least. Could you recommend any helpful articles on it? – Tigerman55 May 21 '14 at 13:35
  • I talked to the hosting company and they said "Unfortunately in our environment we don't have any way to increase the loadbalancer timeout beyond 30 seconds :(", so I guess that explains why the increase in execution time does not work. – Tigerman55 May 21 '14 at 13:42
  • @Tigerman55 Based on what you've said I've outlined the principle I feel you should follow below in an answer. One point I haven't addressed is that you're issuing one query per image - so 2k images = 2k sql queries. That's bad. You can instead issue only one query for _all_ images by refactoring your php code appropriately. It would be wise to rephrase the question given that your problem is not possible to solve by increasing the time limit. – AD7six May 21 '14 at 14:06
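The single-query refactor suggested in the last comment boils down to a set difference: fetch every stored name once, then compare against the directory listing in memory. A minimal JavaScript sketch of that logic (the `findOrphans` helper and the sample arrays are hypothetical illustrations, not the poster's actual code):

```javascript
// Sketch: instead of one SQL query per image, fetch every name from the
// `images` table once, then diff it against the directory listing in memory.
// The arrays below are hypothetical stand-ins for those two result sets.
function findOrphans(serverImages, dbNames) {
    var known = new Set(dbNames);            // O(1) membership tests
    return serverImages.filter(function (name) {
        return !known.has(name);             // keep files missing from the DB
    });
}

var orphans = findOrphans(
    ["a.jpg", "b.jpg", "c.jpg"],   // files found on disk
    ["a.jpg", "c.jpg"]             // names returned by the single SQL query
);
// orphans → ["b.jpg"]
```

With this shape, 2000 images cost one query plus a linear in-memory scan instead of 2000 round trips to the database.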

3 Answers

4

The code below illustrates the basic logic to follow. It isn't tested and should not be taken as a drop-in example.

Use a javascript loop

Instead of making a slow process slower, write your JavaScript to ask for smaller chunks of data in a loop.

I.e. the js could use a while loop:

$(document).ready(function(){
    var done = false,
        offset = 0,
        limit = 20;

    while (!done) {
        var url = "read_images.php?offset=" + offset + "&limit=" + limit;

        $.ajax({
            async: false,
            url: url,
            dataType: "json"
        }).done(function(response) {

            if (response.processed !== limit) {
                // asked to process 20, only processed <=19 - there aren't any more
                done = true;
            }

            offset += response.processed;
            $("#mybox").html("Processed total of " + offset + " records");

        }).fail(function(jqXHR, textStatus) {

            $("#mybox").html("Error after processing " + offset + " records. Error: " + textStatus);

            done = true;
        });
    }

});

Note that in the above example the ajax call is forced to be synchronous. Normally you don't want to do this, but in this example it makes the code easier to write, and possibly easier to understand.
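Since synchronous XHR blocks the browser and is deprecated, the same chunked loop is more commonly written with chained callbacks: each response triggers the request for the next chunk. Here is a hedged sketch of that pattern, with the network call abstracted behind a hypothetical `fetchChunk(offset, limit, cb)` function (standing in for the `$.ajax` call to `read_images.php`) so the control flow stands out:

```javascript
// Asynchronous version of the same idea: process one chunk, and only when
// its response arrives, ask for the next one. `fetchChunk` is a hypothetical
// stand-in for the $.ajax call to read_images.php.
function processAll(fetchChunk, limit, onDone) {
    var offset = 0;

    function next() {
        fetchChunk(offset, limit, function (response) {
            offset += response.processed;
            if (response.processed < limit) {
                onDone(offset);          // short chunk: no more records
            } else {
                next();                  // full chunk: request the next one
            }
        });
    }

    next();
}

// Usage with a fake transport that "has" 45 records:
processAll(function (offset, limit, cb) {
    var remaining = Math.max(0, 45 - offset);
    cb({ processed: Math.min(limit, remaining) });
}, 20, function (total) {
    console.log("Processed total of " + total + " records"); // 45 records
});
```

In the real page, `fetchChunk` would wrap `$.ajax` and call `cb` from its `done` handler; the recursion only advances once each response arrives, so the UI never blocks.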

Do a fixed amount of work per php request

The PHP code also needs modifying to expect and use the GET arguments being passed:

$stuff = scandir('../images/original');

$offset = (int)$_GET['offset']; // cast to int: never trust raw GET input
$limit = (int)$_GET['limit'];

$server_images = array_slice($stuff, $offset, $limit);

foreach($server_images as $server_image) {
    ...
}
...

$response = array(
    'processed' => count($server_images),
    'message' => 'All is right with the world'
);

header('Content-Type: application/json');
echo json_encode($response);
die;

In this way the amount of work a given php request needs to do stays fixed, even as the overall amount of data to process grows (assuming the number of files in the directory doesn't grow to impractical numbers).

AD7six
  • run this in your existing php code `print_r(array_slice($server_images, 0, 20)); die;` and change the numbers/look at the php.net description for that function. – AD7six May 21 '14 at 14:14
  • Could you expound on what the $response array and json_encode is doing? – Tigerman55 May 21 '14 at 14:23
  • I recommend you create a php file with just that code in it and hard coded data and call it. You'll find it's pretty obvious. Alternatively if you genuinely can't figure out what a specific function is doing _look_ for resources about that function and if you still can't find information, ask a new _specific_ question here on SO. – AD7six May 21 '14 at 14:25
  • I think I got it to work properly with this example. However, it stops once it reaches 1000. Do you know why? At that point it returns NaN. Not sure what's going on.. – Tigerman55 May 21 '14 at 15:15
  • That sounds like [a new question](http://meta.stackexchange.com/questions/43478/exit-strategies-for-chameleon-questions). – AD7six May 21 '14 at 15:16
2

If everything works with 2000 iterations but fails at 3000, try raising the time limit to allow PHP to execute longer. Under normal circumstances this is not a good idea, though; make sure you know what you are doing and have a good reason for increasing the execution time.

set_time_limit(60);

http://www.php.net/manual/en/function.set-time-limit.php

This could also be due to the script exhausting available memory. Create a file containing the phpinfo function and check the value of memory_limit.

<?php phpinfo(); ?>

Then you can increase the limit via ini_set() or your .htaccess file. But again, make sure you really want the script to consume more memory. Be careful.

ini_set('memory_limit', '128M'); #change 128 to suit your needs
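The .htaccess route mentioned above would look something like the following (assuming Apache with mod_php; the value shown is only an example, and many shared hosts ignore or forbid this directive):

```apacheconf
# Example only: raise PHP's memory limit for this directory.
# Requires Apache with mod_php enabled.
php_value memory_limit 128M
```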
dmullings
  • I tried this, but it still crashed at 30 seconds. I just pasted the code at the top of my page and saved it. – Tigerman55 May 21 '14 at 13:28
  • What is the response that you get from the page if you go directly to it in the web browser - not an ajax request? It sounds like if it isn't timing out, it is being depleted of some resource (probably memory). – dmullings May 21 '14 at 13:33
  • If I go to a web browser and try it, it will try loading for about 30 seconds then give me an alert saying: Request failed: error. – Tigerman55 May 21 '14 at 13:38
  • I meant if you go to read_images.php directly in the web browser. There should be no javascript alerts on an exclusively php file. – dmullings May 21 '14 at 13:40
0

Your count($server_images) is probably resulting in an infinite loop.

If count() returns 0, your for loop will never end. So you need to check that first.

//Get all Images from server, store in variable
$server_images = scandir('../images/original');

//Remove first 3 elements, which are not correct
array_shift($server_images);
array_shift($server_images);
array_shift($server_images);

$j = 0;

if(count($server_images) > 0){    
    for($i=0;$i<count($server_images) && $i<3000;$i++) {
      //Do something
    }
}
Pataar