I've been using Laravel's database migrations in a recent project, and everything works as expected except for a "cities" table that holds about 3.8 million rows. The following works fine:
DB::table('cities')->insert([
    'name' => 'Dublin',
]);
But when I add the remaining 3.8 million rows to that insert array, the artisan migrate command just fails or times out.
Am I missing something here or is there a better way to do it?
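For context, I assume the fix involves batching the insert rather than passing one giant array. A rough sketch of what I have in mind (the $allCities variable and the batch size of 1,000 are placeholders, not my actual data):

```php
<?php

// Sketch only: break the row array into batches so no single
// INSERT statement (or PHP array) has to hold all 3.8M rows.
$allCities = [
    ['name' => 'Dublin'],
    ['name' => 'Cork'],
    // ... millions more rows in the real migration
];

// array_chunk() splits the array into groups of at most 1,000 rows;
// each group becomes one multi-row INSERT.
foreach (array_chunk($allCities, 1000) as $chunk) {
    DB::table('cities')->insert($chunk);
}
```

Is something along these lines the right direction, or is there a more idiomatic Laravel way?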
The cities migration file itself is 365 MB, which actually crashes PhpStorm with out-of-memory errors. Is there a way to split a large database migration into smaller files?
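One idea I'm considering for the file-size problem: keep the data out of the PHP file entirely and stream it from a CSV inside the migration. A sketch, where the path database/data/cities.csv and the batch size of 1,000 are assumptions of mine:

```php
<?php

// Sketch only: stream rows from a CSV instead of a 365 MB PHP array,
// so neither PHP nor the IDE ever loads the whole data set at once.
$handle = fopen(database_path('data/cities.csv'), 'r');

$batch = [];
while (($row = fgetcsv($handle)) !== false) {
    // Assumes one city name per CSV line.
    $batch[] = ['name' => $row[0]];

    // Flush every 1,000 rows as one multi-row INSERT.
    if (count($batch) === 1000) {
        DB::table('cities')->insert($batch);
        $batch = [];
    }
}

// Insert any leftover rows from the final partial batch.
if ($batch !== []) {
    DB::table('cities')->insert($batch);
}

fclose($handle);
```

Would that belong in the migration, or should bulk data like this live in a seeder instead?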
PHP 7.2, Laravel 5.7, Docker/Laradock.