
Let's say I have a table named `groups` with this data:

+----+---------+------+-------+
| id | user_id | name | color |
+----+---------+------+-------+
|  1 |       1 | foo  | green |
|  2 |       1 | bar  | red   |
|  3 |       1 | baz  |       |
|  4 |       2 | baz  | grey  |
|  5 |       3 | foo  | blue  |
|  6 |       3 | baz  |       |
|  7 |       4 | baz  | brown |
|  8 |       4 | foo  |       |
|  9 |       4 | qux  | black |
+----+---------+------+-------+

I'm going to read a CSV file and convert it to an array of data like this:

[
    [
        'user_id'   => 2,
        'name'      => 'foo'
    ],
    [
        'user_id'   => 2,
        'name'      => 'bar'
    ],
    [
        'user_id'   => 2,
        'name'      => 'baz'
    ],
    [
        'user_id'   => 2,
        'name'      => 'qux'
    ],
    [
        'user_id'   => 2,
        'name'      => 'tux'
    ],
]

and insert only the new rows into the database, skipping those that already exist (in this example, group baz for user 2).

In Eloquent there are some useful methods like firstOrCreate() or findOrNew() that seem to be what I need, but these methods handle only a single record, so if I use them I have to run them once per file line:

while ($line = $this->decoder->decode($this->file)) {
    $row = Group::firstOrCreate($line);
}

Is there a better solution that runs fewer queries?

Behzadsh

2 Answers


Use INSERT ... ON DUPLICATE KEY UPDATE

INSERT INTO groups (user_id, name) VALUES (1, 'test') ON DUPLICATE KEY UPDATE name = VALUES(name)

For Laravel, check this question.
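To answer the "fewer queries" part: the statement accepts multiple value groups, so the whole CSV batch can go out in a single query. A minimal sketch, assuming the `groups` table has a unique index on `(user_id, name)` and that the final statement would be run through Laravel's `DB::insert()` (both assumptions, not shown in the original answer):

```php
<?php
// Parsed CSV rows (sample data from the question).
$rows = [
    ['user_id' => 2, 'name' => 'foo'],
    ['user_id' => 2, 'name' => 'bar'],
];

// One "(?, ?)" placeholder group per row.
$placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));

// Single multi-row INSERT; duplicates hit the unique index and
// become no-op updates instead of errors.
$sql = "INSERT INTO groups (user_id, name) VALUES $placeholders"
     . " ON DUPLICATE KEY UPDATE name = VALUES(name)";

// Flatten the rows into one binding list matching the placeholders.
$bindings = [];
foreach ($rows as $row) {
    $bindings[] = $row['user_id'];
    $bindings[] = $row['name'];
}

// In Laravel this would run as one query (assumption):
// DB::insert($sql, $bindings);
```

This trades N `firstOrCreate()` round trips for one statement, at the cost of requiring the unique index.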

ariaby
As I mentioned in my question, `firstOrNew()` and `firstOrCreate()` are useful when I have a single record, but I'm looking for a way to do it for multiple records. – Behzadsh Aug 18 '14 at 14:39

You can just load the file directly into the table and ignore duplicate entries. All you need is a unique index on the column that must not contain duplicates. Then you do:

LOAD DATA INFILE 'file_name.csv'
IGNORE
INTO TABLE tbl_name
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(user_id, @var)
SET name = do_some_voodoo(@var);

And that's it. Read more about this command here.

Test:

/*sample data*/
shell> cat test.csv
Name|Value
Anna|hello
Ben |Test
East|whatever
Anna|This line should be ignored
Bob |

/*creating destination table in database*/
mysql> create table phpload(a varchar(50) primary key, b varchar(50));
Query OK, 0 rows affected (0.03 sec)

/*Test script*/
<?php
// Create connection
$con = mysqli_connect("localhost", "root", "r00t", "playground");

// Check connection
if (mysqli_connect_errno()) {
    die("Failed to connect to MySQL: " . mysqli_connect_error());
}

$sql = "LOAD DATA LOCAL INFILE '/home/developer/test.csv' IGNORE INTO TABLE phpload FIELDS TERMINATED BY '|' (a, @b) SET b = UPPER(@b);";

$result = mysqli_query($con, $sql) or die(mysqli_error($con));

mysqli_close($con);
?>

/*executing*/
shell> php test.php

mysql> select * from phpload;
+------+----------+
| a    | b        |
+------+----------+
| Name | VALUE    |
| Anna | HELLO    |
| Ben  | TEST     |
| East | WHATEVER |
| Bob  |          |
+------+----------+
5 rows in set (0.00 sec)

Works absolutely fine: the duplicate "Anna" line was ignored thanks to the primary key on column `a`.

fancyPants