My application processes thousands of database entries and allows the user to export a certain part of the search result list. Every Entry holds specific files, and the user wants to download them in a compressed file (e.g. a zip). This generally works, but I am most probably running into max execution timeouts. For debugging purposes I tried adjusting max_execution_time, both globally for PHP and inside my script. Neither got rid of the error.
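For reference, this is roughly how I tried to raise the limit inside the script (just a minimal sketch of what I experimented with; the 300-second value is arbitrary, not a recommended setting):

&lt;?php
// Attempt to lift the execution time limit for this request only.
// 0 would mean "no limit"; 300 seconds is just a test value.
ini_set('max_execution_time', '300');
set_time_limit(300);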
Basically I have two questions:
- How do I analyse the problem correctly? (Does it actually have to do with the max execution time?)
- How do I avoid max execution timeouts? (Most probably this is the cause, because with smaller amounts of queries it works just fine.)
Example code:
public function export(Requests\ExportRequest $request)
{
    $input = $request->all();

    // Load every selected entry; this issues one query per ID.
    foreach ($input['entries'] as $id) {
        $entry = Entry::findOrFail($id);
        // ... (adding the entry's files to the zip archive omitted for brevity)
    }
}
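As a side note on the query count: the one-query-per-ID loop above can be collapsed into a single whereIn() lookup. This is only a sketch of that idea (it assumes the primary key column is id and that the zip handling stays the same), and, as the edit below explains, it turned out not to be the actual cause of the failure:

public function export(Requests\ExportRequest $request)
{
    $ids = $request->input('entries', []);

    // One query for all selected entries instead of one query per ID.
    $entries = Entry::whereIn('id', $ids)->get();

    foreach ($entries as $entry) {
        // ... (adding the entry's files to the zip archive omitted for brevity)
    }
}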
EDIT:
Fyi, the problem was not the max_execution_time, even though I did reduce the SQL queries massively. The problem was caused by the limit on max_input_vars. The targeted entries were checked in a form and sent to the export() function. Every checkbox had name=entries[], which meant each entry arrived as a separate input value. You can read more about this problem here: Is there a limit on checked checkboxes in PHP form POST? (Consider also reading this specific answer)
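For anyone hitting the same thing, this is a rough way to check whether the submitted checkboxes were truncated by max_input_vars (the entries name matches my form; the check itself is only illustrative, since other form fields also count toward the limit):

&lt;?php
// PHP silently drops all input variables past max_input_vars (default 1000)
// and only emits an E_WARNING, so compare what actually arrived with the limit.
$limit    = (int) ini_get('max_input_vars');
$received = isset($_POST['entries']) ? count($_POST['entries']) : 0;

if ($received >= $limit - 10) {
    // Suspicious: we received (almost) as many values as the limit allows,
    // so later checkboxes were probably discarded by PHP before reaching the script.
    error_log("entries[] may have been truncated: {$received} values received, max_input_vars = {$limit}");
}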