
How can I use Laravel's MyModel::insert(), DB::insert() methods for large SQL queries without running into memory issues?

I'm attempting to execute around 500 inserts of 1,000 items each.

It would be convenient to have the query builder convert an array of items to SQL each time. However, calling MyModel::insert() or DB::insert() repeatedly runs out of memory.

The only low-memory workaround I found was to convert the array to SQL myself and use DB::statement() or DB::getPdo()->exec().

In pseudo code:

  1. This works as expected:
for (many times) {
  DB::getPdo()->exec('a large insert into ... query');
}

Memory usage stays at around 11 MB after each insert.
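For reference, here is a minimal sketch of how such a raw statement could be assembled before handing it to DB::statement() or PDO::exec(). The helper, table name, columns, and quoting callback are all hypothetical; in real code you would quote values with PDO::quote() (or better, use placeholders) rather than the toy callback shown here:

```php
<?php
// Hypothetical helper: flatten an array of rows into one multi-row
// INSERT statement, bypassing the query builder entirely.
function buildInsert(string $table, array $columns, array $rows, callable $quote): string
{
    // Turn each row into a quoted "(v1, v2, ...)" tuple.
    $tuples = array_map(
        fn (array $row) => '(' . implode(', ', array_map($quote, $row)) . ')',
        $rows
    );

    return sprintf(
        'INSERT INTO %s (%s) VALUES %s',
        $table,
        implode(', ', $columns),
        implode(', ', $tuples)
    );
}

// Illustration only -- a naive quoting function; use PDO::quote() instead.
$sql = buildInsert(
    'items',
    ['name', 'qty'],
    [['apple', 3], ['pear', 5]],
    fn ($v) => is_numeric($v) ? (string) $v : "'" . addslashes((string) $v) . "'"
);
// $sql === "INSERT INTO items (name, qty) VALUES ('apple', 3), ('pear', 5)"
```

The resulting string can then be passed to DB::getPdo()->exec($sql) inside the loop, as in the first example above.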

  2. This throws an Allowed memory size of ... bytes exhausted exception:
DB::disableQueryLog();
for (many times) {
  MyModel::insert($large_array);
}

For the first few inserts, memory usage is also around 11 MB, but then it grows quickly with each iteration of the loop, eventually reaching 1 GB.

Following the advice in https://stackoverflow.com/a/18776710/17981656, I disabled query logging (DB::disableQueryLog()), but it didn't help.

Context:
Laravel 8.83.9
PHP 8.1
macOS 12.3.1
I'm running the code with php artisan serve


1 Answer


Other developers have found it helpful to disable the query log and unset the event dispatcher on the DB connection, e.g.:

DB::disableQueryLog();

$dispatcher = DB::connection()->getEventDispatcher();
DB::connection()->unsetEventDispatcher();

for (many times) {
    MyModel::insert($large_array);
}

DB::enableQueryLog();
DB::connection()->setEventDispatcher($dispatcher);

Resource: Query builder memory leak on large insert #27539
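One caveat: if an insert inside the loop throws, the query log and event dispatcher stay disabled for the rest of the request. A defensive variant of the same pattern (the $rows variable and the 1,000-row chunk size are assumptions based on the question) wraps the restore calls in try/finally:

```php
DB::disableQueryLog();
$dispatcher = DB::connection()->getEventDispatcher();
DB::connection()->unsetEventDispatcher();

try {
    // Insert in chunks of 1,000 rows, matching the question's workload.
    foreach (array_chunk($rows, 1000) as $chunk) {
        MyModel::insert($chunk);
    }
} finally {
    // Restore logging and events even if an insert fails.
    DB::enableQueryLog();
    DB::connection()->setEventDispatcher($dispatcher);
}
```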
