
So I've been using Laravel's database migrations in a recent project and everything works perfectly, except for a "cities" table that has about 3.8 million rows. The following works as expected:

DB::table('cities')->insert([
    'name' => 'Dublin'
]);

But when I add the additional 3.8 million rows to the above insert array, the artisan migrate command just fails/times out.

Am I missing something here or is there a better way to do it?

The file size of the cities migration is 365 MB, which actually crashes PhpStorm (out-of-memory errors). I'm wondering if there's a way to split a large DB migration into smaller files?

PHP 7.2, Laravel 5.7, Docker/Laradock.

Donal.Lynch.Msc

3 Answers


I would consider doing this in a Job and running it on a Redis queue.

So just write a simple command that dispatches the job. I would also suggest writing the data in chunks, e.g. in chunks of 1,000 rows :)
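A rough sketch of what that could look like (the ImportCities job name, the source file path database/data/cities.csv, and the 1,000-row chunk size are my own assumptions, not from the question):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ImportCities implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        // Stream the source file so the full 365 MB never has to fit in memory.
        $handle = fopen(database_path('data/cities.csv'), 'r');
        $buffer = [];

        while (($row = fgetcsv($handle)) !== false) {
            $buffer[] = ['name' => $row[0]];

            // Flush every 1,000 rows as one multi-row INSERT.
            if (count($buffer) === 1000) {
                DB::table('cities')->insert($buffer);
                $buffer = [];
            }
        }

        // Insert whatever is left over.
        if (!empty($buffer)) {
            DB::table('cities')->insert($buffer);
        }

        fclose($handle);
    }
}

The command then only needs to call ImportCities::dispatch(); and you process the queue with php artisan queue:work.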

addi2113
  • Also, you should not insert data in a migration file. If it's just test data, you should use a database seeder for it :) – addi2113 Oct 28 '18 at 13:17

First of all, you can put the records into two seeders.
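A minimal sketch of what the DatabaseSeeder could look like (the CitiesPart1Seeder / CitiesPart2Seeder class names are placeholders):

<?php

use Illuminate\Database\Seeder;

class DatabaseSeeder extends Seeder
{
    public function run()
    {
        // Split the 3.8 million rows across two (or more) seeder classes
        // so no single file gets large enough to crash your editor.
        $this->call(CitiesPart1Seeder::class);
        $this->call(CitiesPart2Seeder::class);
    }
}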

You also have to raise the memory limit in your php.ini settings.
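For example in php.ini (the value here is only an illustration):

memory_limit = 1024M

or only for the one artisan run:

php -d memory_limit=1024M artisan migrate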

See also: How to assign more memory to docker container

Farid shahidi

Using a job is preferable in this case, since it can insert the data in chunked batches, and as addi2113 has explained, you should use a seeder if this is for a testing environment.
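A small sketch of such a seeder (the CitiesTableSeeder name and the database/data/cities_sample.json file are assumptions; any data source works):

<?php

use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;

class CitiesTableSeeder extends Seeder
{
    public function run()
    {
        // For a testing environment a representative subset is usually enough.
        $cities = json_decode(file_get_contents(database_path('data/cities_sample.json')), true);

        // Insert in batches of 1,000 rows instead of one giant insert() call.
        collect($cities)->chunk(1000)->each(function ($chunk) {
            DB::table('cities')->insert($chunk->values()->toArray());
        });
    }
}

You can then run it with php artisan db:seed --class=CitiesTableSeeder.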

user10128333