
I have a JSON file and I want to insert all of its data into my database.

First I get the contents of the JSON file and store them in an array. This takes 20 seconds.

$strJsonFileContents = file_get_contents('C:\books.json');
$array = json_decode($strJsonFileContents);

After that I prepare the insert query:

$query = $db->prepare("INSERT INTO product SET
webID = ?,
name = ?,
subtitle = ?,
title = ?,
description = ?,
ISBN = ?
.
.
.
");

Then I create a foreach loop and execute the insert query:

foreach ($array->Books as $book) {
    $insert = $query->execute(array(
        $book->p2,
        $book->p3,
        $book->p4
        .
        .
        .
    ));
}

This foreach loop runs 300,000 times, and it takes a huge amount of time to insert everything. I didn't see it finish; it made about 40,000 inserts in roughly 10 minutes.

What should I do to reduce this time to seconds? The 20 seconds spent reading the file is not a big problem for me, but the insert time is.

OSentrk
  • Are you using InnoDB? If so, change it to MyISAM, use a transaction and end with a commit. –  Feb 19 '20 at 21:28
  • @OSentrk: don't listen to anyone who suggests INSERT INTO solutions. Google LOAD DATA INFILE; this way is 100 times faster. – user1597430 Feb 19 '20 at 21:31
  • Convert the JSON file to CSV and use LOCAL INFILE; that is much less programming effort. – nbk Feb 19 '20 at 21:47
  • Use transactions to speed it up (see the sketch after these comments). – Dharman Feb 19 '20 at 21:48
  • As @user1597430 said, consider using `LOAD DATA INFILE`; it's the best way to do mass inserts into a database. But you will have to convert your JSON file into a CSV file. At the very least you will save a lot of minutes. – Lounis Feb 19 '20 at 22:14
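
A minimal sketch of the transaction approach suggested in the comments above, assuming $db is a PDO connection and $query and $array are set up as in the question (the batch size of 1,000 is an arbitrary choice):

$batchSize = 1000; // commit every 1,000 rows; tune as needed
$count = 0;

$db->beginTransaction();
foreach ($array->Books as $book) {
    $query->execute(array(
        $book->p2,
        $book->p3,
        $book->p4
        // ... remaining fields, as in the question
    ));

    // committing in batches avoids one commit (and disk flush) per row
    if (++$count % $batchSize === 0) {
        $db->commit();
        $db->beginTransaction();
    }
}
$db->commit(); // commit the final partial batch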

1 Answer


One way to reduce this is to create one query using the following syntax:

INSERT INTO `product` (`webID`, `name`, `subtitle`, `title`, `description`, `ISBN`)
VALUES (?,?,?,?,?,?),(?,?,?,?,?,?) .... // and so on

Use PHP to assemble the VALUES portion of the query, since you can include multiple sets of placeholders. Then run the query ONE time.
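
For example, here is a sketch of how the VALUES portion and its bound values might be assembled with PDO, assuming $db and $array are as in the question; $book->p2, p3, p4, ... stand in for the properties elided above, and the chunk size of 500 is an arbitrary choice (keep each statement under MySQL's max_allowed_packet):

$columns = '(`webID`, `name`, `subtitle`, `title`, `description`, `ISBN`)';

// 500 rows per INSERT statement; tune so each statement stays a reasonable size
foreach (array_chunk($array->Books, 500) as $chunk) {
    $placeholders = array();
    $values = array();

    foreach ($chunk as $book) {
        // one value per column in $columns, in the same order
        // (p2, p3, p4, ... stand in for the properties elided in the question)
        $row = array($book->p2, $book->p3, $book->p4 /* , ... */);

        $placeholders[] = '(' . implode(',', array_fill(0, count($row), '?')) . ')';
        foreach ($row as $value) {
            $values[] = $value;
        }
    }

    $sql = 'INSERT INTO `product` ' . $columns . ' VALUES ' . implode(',', $placeholders);
    $db->prepare($sql)->execute($values);
}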

Regardless of how you do this, it will take some time to prepare the query and then execute it. Inserting that many rows at once is not trivial.

Jay Blanchard