
I have a file with over 360,000 rows; every line is one JSON object.

I read the JSON into PHP variables and insert it into two tables:

"movies" and "movies_details"

I'm using Laravel 5.4. My problem is the performance: I can only insert roughly 35-40 rows per minute with this code:

    // Insert the parent row and get its auto-increment id
    $idMovie = DB::table('movies')
        ->insertGetId([
            'moviedb_id' => $id,
            'adult'      => $adult,
            'created_at' => $dateTime,
        ]);

    // Insert the dependent row, referencing the new id
    DB::table('movies_title')
        ->insert([
            'movie_id'       => $idMovie,
            'original_title' => $original_title,
            'created_at'     => $dateTime,
        ]);

But I think someone here knows a faster and better way to insert this.

Thanks for your time.

Damlo
  • MySQL supports multi-row inserts, like so: `INSERT INTO movie_title (movie_id, original_title, created_at) VALUES (1, 'title', '2017-11-14'), (2, 'title', '2017-11-14')`, and so on. This might help: https://stackoverflow.com/questions/29723865/how-to-insert-multiple-rows-from-a-single-query-using-eloquent-fluent – Raymond Nijland Nov 14 '17 at 16:59
  • Have you looked into the `chunk` method? – Kisaragi Nov 14 '17 at 17:27
  • @RaymondNijland Yes, I have seen that link. My problem with it is that they only have one table to insert their data into, but I need the inserted ID from the "movies" table to link it with the rows in the "movie_details" table. – Damlo Nov 14 '17 at 17:42

1 Answer


You could use a multi-row insert (bulk insert), assuming you can collect your variables into arrays.

In plain SQL:

    insert into your_table (moviedb_id, adult, created_at)
    values ($id[0], $adult[0], $dateTime[0]),
           ($id[1], $adult[1], $dateTime[1]),
           ......
           ($id[n], $adult[n], $dateTime[n])

This way each insert statement writes several rows at once, so instead of 30-40 rows per minute you can insert roughly 30*n - 40*n, where n is the number of rows per statement.

So check your ORM or framework's query builder for the equivalent function.
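
In Laravel, the query builder's `insert()` accepts an array of rows, so the same batching works without raw SQL. Here is a minimal sketch, assuming a hypothetical file name `movies.json`, that each line decodes to the fields used in your code, and that `moviedb_id` is unique so the generated ids can be fetched back for the `movies_title` rows (which addresses the comment above about needing the inserted IDs):

    use Carbon\Carbon;
    use Illuminate\Support\Facades\DB;

    $dateTime = Carbon::now();
    $movies = [];
    $titles = [];

    // One pass over the file: build the parent rows, and remember each
    // title keyed by moviedb_id (the movies.id primary key is not known
    // until after the bulk insert).
    foreach (file('movies.json', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        $json = json_decode($line, true);

        $movies[] = [
            'moviedb_id' => $json['id'],
            'adult'      => $json['adult'],
            'created_at' => $dateTime,
        ];
        $titles[$json['id']] = $json['original_title'];
    }

    // One multi-row INSERT per 500 rows instead of one INSERT per row.
    foreach (array_chunk($movies, 500) as $chunk) {
        DB::table('movies')->insert($chunk);
    }

    // Fetch the generated primary keys back, keyed by the unique moviedb_id.
    $ids = DB::table('movies')->pluck('id', 'moviedb_id');

    // Build and bulk-insert the child rows with the resolved foreign keys.
    $titleRows = [];
    foreach ($titles as $moviedbId => $originalTitle) {
        $titleRows[] = [
            'movie_id'       => $ids[$moviedbId],
            'original_title' => $originalTitle,
            'created_at'     => $dateTime,
        ];
    }

    foreach (array_chunk($titleRows, 500) as $chunk) {
        DB::table('movies_title')->insert($chunk);
    }

The chunk size of 500 is arbitrary; it just has to keep each statement under MySQL's `max_allowed_packet`. Wrapping the whole thing in `DB::transaction()` can speed it up further, since everything is committed once instead of per statement.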

ScaisEdge