
I have to process a huge list, and I'm trying to loop over each item of the list separately to avoid hitting the PHP maximum execution time error:

function syncCgiu(form_data, response) {
    let promises = [];
    // Fire one AJAX request per item in the list
    $.each(response.content, function(index, value){
        const data = form_data + '&action=sync_cgiu&lookup=' + value;
        promises.push(
            $.post(ajaxurl, data, function(){
                console.log(value + ' imported');
            })
        );
    });

    Promise.all(promises).then(function() {
        console.log('finish import');
    });
}

At some point, the console starts showing errors:

Failed to load resource: net::ERR_INSUFFICIENT_RESOURCES

Failed to load resource: the server responded with a status of 504 ()

And PHP stops running the function.

marcelo2605
  • How many requests are you sending? Are you essentially DOS-ing your own server? Perhaps it would make more sense to process these in bulk rather than as individual requests? – David Jun 18 '21 at 10:49
  • @David The loop has more than 20k items. But my idea was to process one item at a time. Isn't that better? – marcelo2605 Jun 18 '21 at 10:51
  • Each new AJAX request adds the overhead of an HTTP request and handshake to your network and server. That's potentially a lot of extra compute. And AJAX is async, so you're actually flooding the server with all the requests at once! What exactly are you trying to accomplish here - why do you need to process 20,000 items from a browser? From the minimal information we have, it sounds like this might be a bulk process better suited to a server-side job, possibly a background job triggered by cron or something like that. – ADyson Jun 18 '21 at 10:54
  • At very least you could batch the requests from AJAX - e.g. 500 or 1000 at a time or something. Right now you've gone from one extreme - processing all the items at once - to the other extreme - processing one at a time. As you've discovered, neither is likely to produce a great outcome, in an AJAX implementation. – ADyson Jun 18 '21 at 10:56
  • You're not processing them one-at-a-time, you're trying to process all of them *at the same time* - which is substantially worse than sending them all in a single request. You have the right idea, but poor execution. You need an **ajax queue**. – freedomn-m Jun 18 '21 at 10:56
  • @ADyson it's an import application: the client loads a CSV file and I'm using Ajax to run a PHP function that saves each item to the db. One item at a time. – marcelo2605 Jun 18 '21 at 10:57
  • Does this answer your question? [Sequencing ajax requests](https://stackoverflow.com/questions/3034874/sequencing-ajax-requests) - specifically this answer: https://stackoverflow.com/a/3035268/2181514 – freedomn-m Jun 18 '21 at 10:58
  • Got it @freedomn-m. I thought Promise waited for each Ajax call to finish before starting another one. – marcelo2605 Jun 18 '21 at 10:59
  • Your Promise code says to wait until they've all finished, but you're firing them all at the same time. You need to fire one, wait for that to finish, then fire the next. [This answer](https://stackoverflow.com/a/3035268/2181514) provides a handy ready-built method/plugin to do just that. And you can add a progress indicator as it goes. – freedomn-m Jun 18 '21 at 11:00
  • See answer below...also do you actually _need_ to insert them via PHP? Or could the CSV be bulk loaded directly into the SQL db? Most DB engines have a bulk load feature, e.g. LOAD DATA in MySQL. (If you need to transform/adjust the data substantially before saving it then that might not work for you, but if it can be inserted as-is then it's much more efficient.) – ADyson Jun 18 '21 at 11:02
  • *save the item on the db, one item at a time* - sounds like an XY problem - depending on your backend, there should be some sort of `BULK UPDATE` db action which will save the entire file extremely (or at least relatively) quickly. – freedomn-m Jun 18 '21 at 11:03

1 Answer


From a comment on the question above...

How many requests are you sending?

More than 20k.

You're DOS-ing your own server. Having a single page send tens of thousands of requests in rapid succession is a bad thing. This should be bulk-processed with a single request. In the question you mention:

I'm trying to loop over each item of the list separately to avoid hitting the PHP maximum execution time error

This is because waiting for tens of thousands of records to be processed is also not ideal. Instead, process them offline. Consider a scenario as follows:

  • Upload all of the data to be processed in a single request. The data is saved and a response returned to the user indicating that processing has begun.
  • Have an off-line process (a CRON job or some background process on the server) which monitors the "save location" (a database? a directory?) for new records to process. When it sees these new records, it begins processing them.
  • Upon completion, some flag is saved somewhere indicating that processing is done or the user is otherwise notified.
  • The user can view/download results when they become available.
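As a rough sketch of the first step, the upload endpoint could accept the whole list in a single request, queue every item with an "unprocessed" flag, and respond immediately. The table and column names below (import_queue, lookup, processed), the connection details, and the endpoint itself are placeholders for illustration, not anything from your existing code:

<?php
// upload.php - receives the full list in one POST and queues it.
// Table/column names (import_queue, lookup, processed) are illustrative only.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$items = json_decode(file_get_contents('php://input'), true) ?: [];

$stmt = $pdo->prepare(
    'INSERT INTO import_queue (lookup, processed) VALUES (:lookup, 0)'
);

$pdo->beginTransaction();
foreach ($items as $item) {
    $stmt->execute([':lookup' => $item]);
}
$pdo->commit();

// Respond right away; the actual processing happens offline.
echo json_encode(['queued' => count($items)]);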

As an example, consider writing all of the records immediately to a database. In that table include a column indicating if the records have been processed. Your CRON job or background process simply checks the table periodically for unprocessed records and processes them. As each one completes, the flag is updated to indicate completion. At any time the user can check the progress of their uploaded records.
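A minimal sketch of that background worker, using the same placeholder table/column names and a hypothetical importItem() standing in for whatever your per-record PHP import currently does, could be a CLI script run from cron:

<?php
// process_queue.php - run periodically by cron, e.g.:
// */5 * * * * php /path/to/process_queue.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Work through a limited batch so each run stays short.
$rows = $pdo->query(
    'SELECT id, lookup FROM import_queue WHERE processed = 0 LIMIT 500'
)->fetchAll(PDO::FETCH_ASSOC);

$done = $pdo->prepare('UPDATE import_queue SET processed = 1 WHERE id = :id');

foreach ($rows as $row) {
    importItem($row['lookup']);            // hypothetical per-item import logic
    $done->execute([':id' => $row['id']]); // flag the record as processed
}

function importItem(string $lookup): void {
    // whatever the per-record import currently does would live here
}

The user-facing page can then poll a lightweight endpoint that simply counts processed vs. unprocessed rows to show progress.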

Basically, don't rely on HTTP communication alone for large data processing needs. Too many requests or too large requests will both get you stuck. Move the large processes offline and keep the web application responsive to the user.

David