
My application processes thousands of database entries and lets the user export a selected part of the search result list.

Every Entry holds specific files, and the user wants to download them as a compressed archive (e.g. zip). This works fine, but most probably I am running into max execution timeouts. For debugging I tried simply raising max_execution_time, both globally for PHP and inside my PHP script. Neither got rid of the error.

Basically I have two questions:

  1. How do I analyse the problem correctly? (Does it really have to do with the max execution time?)
  2. How do I avoid max execution timeouts? (Most probably this is it, because with smaller numbers of queries it works just fine.)

Example code:

public function export(Requests\ExportRequest $request)
{
    $input = $request->all();

    foreach ($input['entries'] as $id) {
        // one SELECT query per selected entry; the actual zipping of each
        // entry's files is left out here
        $entry = Entry::findOrFail($id);
    }
}

EDIT:

Fyi, the problem was not the max_execution_time, even though the accepted answer did reduce the number of SQL queries massively. The problem was caused by the limit on max_input_vars. The targeted entries were checked in a form and sent to the export() function. Every checkbox had name=entries[], so each selected entry arrived as a separate input variable. You can read more about this problem here: Is there a limit on checked checkboxes in PHP form POST? (Consider also reading this specific answer.)
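For anyone hitting the same max_input_vars limit (1000 by default): besides raising the value in php.ini, the form can submit all selected ids in one field and split them server-side, so only a single input variable is sent. This is just a rough sketch, not the original code; the field name entry_ids is an assumption.

public function export(Requests\ExportRequest $request)
{
    // A single field such as <input type="hidden" name="entry_ids" value="1,5,9">
    // counts as one input variable no matter how many ids it carries,
    // so it cannot run into the max_input_vars limit.
    $ids = array_filter(explode(',', $request->input('entry_ids', '')));

    $entries = Entry::whereIn('entry_id', $ids)->get();

    // ... build the zip from $entries as before
}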

  • You can enable the MySQL setting to log slow queries and analyze them later, and you can use a queue to generate the export and then send it to the user when it's ready. – thefallen Feb 20 '19 at 12:59
  • Usually I'm using the chunk method from Eloquent to avoid hitting the maximum execution time. – pinoyCoder Feb 20 '19 at 13:03
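For completeness, the chunking approach mentioned in the comment above would look roughly like this inside export(); the batch size of 200 is arbitrary and the column name is assumed:

// Fetch and process the selected entries in batches instead of one query per id.
Entry::whereIn('entry_id', $request->input('entries', []))
    ->chunk(200, function ($entries) {
        foreach ($entries as $entry) {
            // add this entry's files to the export here
        }
    });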

1 Answer


You've run into the N+1 problem: you execute a separate query for every entry you want to retrieve. You can fetch all the entries at once with a single query. I don't know exactly what happens after the code you posted, since the posted code doesn't really do anything, but running 1000 separate queries is orders of magnitude slower than running 1 big query.

public function export(Requests\ExportRequest $request)
{
    $input = $request->all();

    $allEntries = Entry::whereIn('entry_id', $input['entries'])->get();

    // $allEntries is a Collection containing all the queried entries
}

My rule of thumb is that whenever max_execution_time comes into play, I've made a bad architectural decision. Maybe my solution solves the problem, maybe @TheFallen's does. But we need more code to analyse that.
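A rough sketch of the queue-based route @TheFallen suggested, assuming a recent Laravel version; the job name and what happens inside handle() are placeholders:

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ExportEntries implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $entryIds;

    public function __construct(array $entryIds)
    {
        $this->entryIds = $entryIds;
    }

    public function handle()
    {
        $entries = Entry::whereIn('entry_id', $this->entryIds)->get();

        // Build and store the zip here, then notify the user
        // (e.g. by mail) that the download is ready.
    }
}

// In the controller, instead of building the zip during the request:
// ExportEntries::dispatch($input['entries']);

That way the HTTP request returns immediately and the heavy lifting happens in a queue worker.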

  • Good point, this helped reduce the query processing, but it did not fix my problem. It did help with debugging, so it answers my question. The next suspect could be that I am using POST to send the incoming entry ids. Is there a maximum payload for POST request parameters? – Michael W. Czechowski Feb 20 '19 at 13:40
  • The usual max size of a GET request is 8 kilobytes, but browsers can get in the way (Safari only allows 2 kB, for example). I don't know how many parameters you usually process, but 2 kB should be more than enough. The problem is probably that zipping the entries takes a long time (zipping 6 files of about 10 MB simply takes long; you can only speed that up by throwing more CPU at it). It might be an idea to offload the whole process into a queue and send the user an email when their files are ready? Still, we need more code to analyse it properly. – Loek Feb 20 '19 at 13:45
  • Sorry, it was a POST request. I've edited my comment. – Michael W. Czechowski Feb 20 '19 at 13:47
  • I will discuss it in another question, because it should not be related to the SQL processing. Thanks for helping! – Michael W. Czechowski Feb 20 '19 at 13:48
  • Even if I remove all the zipping stuff, the code does not even execute. So my guess would be to look at reducing the POST request payload. – Michael W. Czechowski Feb 20 '19 at 13:50