From a comment on the question above...
> How many requests are you sending?

> More than 20k.
You're DoS-ing your own server. Having a single page send tens of thousands of requests in rapid succession is a bad idea. This should be bulk-processed in a single request. In the question you mention:
> I'm trying to loop over each item of this list separately to avoid a PHP maximum execution time error
That timeout happens because waiting for tens of thousands of records to be processed in a single request isn't ideal either. Instead, process them offline. Consider a flow like the following:
- Upload all of the data to be processed in a single request (a minimal intake sketch follows this list). The data is saved and a response is returned to the user indicating that processing has begun.
- Have an offline process (a CRON job or some other background process on the server) monitor the "save location" (a database? a directory?) for new records to process. When it sees new records, it begins processing them.
- Upon completion, a flag is saved indicating that processing is done, or the user is otherwise notified.
- The user can view/download results when they become available.
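
Here's a minimal sketch of that intake request, assuming a JSON payload and a `jobs` table with a `processed` flag column. All of the names here are illustrative, not from the question:

```php
<?php
// intake.php: receives the entire list in one POST instead of ~20k requests.
// Assumes a JSON body like {"items": [...]} and a "jobs" table with columns
// (id, payload, processed). All names here are illustrative.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$body  = json_decode(file_get_contents('php://input'), true);
$items = $body['items'] ?? [];

// Save everything in one transaction; no processing happens here.
$stmt = $pdo->prepare('INSERT INTO jobs (payload, processed) VALUES (?, 0)');
$pdo->beginTransaction();
foreach ($items as $item) {
    $stmt->execute([json_encode($item)]);
}
$pdo->commit();

// Respond immediately; the actual work happens offline.
http_response_code(202);
echo json_encode(['status' => 'queued', 'count' => count($items)]);
```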
As an example, consider writing all of the records immediately to a database. In that table include a column indicating if the records have been processed. Your CRON job or background process simply checks the table periodically for unprocessed records and processes them. As each one completes, the flag is updated to indicate completion. At any time the user can check the progress of their uploaded records.
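
A sketch of that background process, run from cron. Here `process_record()` is a hypothetical stand-in for whatever your real per-record work is, and the table/column names match the illustrative schema above:

```php
<?php
// worker.php: run periodically, e.g. from crontab:
//   * * * * * php /path/to/worker.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Hypothetical stand-in for the real per-record work.
function process_record(array $row): void
{
    // ... actual processing goes here ...
}

// Fetch a bounded batch of unprocessed records so each run stays short.
$rows = $pdo->query('SELECT id, payload FROM jobs WHERE processed = 0 LIMIT 500')
            ->fetchAll(PDO::FETCH_ASSOC);

$flag = $pdo->prepare('UPDATE jobs SET processed = 1 WHERE id = ?');
foreach ($rows as $row) {
    process_record($row);
    $flag->execute([$row['id']]); // mark this record as done
}
```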
Basically, don't rely on HTTP communication alone for large data processing needs. Too many requests or too large requests will both get you stuck. Move the large processes offline and keep the web application responsive to the user.
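
To keep the application responsive, the page can poll a small status endpoint rather than holding one request open. Again, this is a sketch against the same assumed table:

```php
<?php
// status.php: the page polls this for progress instead of holding one
// long-running request open. Same illustrative "jobs" table as above.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$total = (int) $pdo->query('SELECT COUNT(*) FROM jobs')->fetchColumn();
$done  = (int) $pdo->query('SELECT COUNT(*) FROM jobs WHERE processed = 1')->fetchColumn();

echo json_encode([
    'total'    => $total,
    'done'     => $done,
    'finished' => $total > 0 && $done === $total,
]);
```

In practice you'd scope these queries to a specific upload or user ID so that separate jobs don't share one progress counter.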