
I have a large CSV file with 12,000 rows and 90 columns.

I want to use the MySQL LOAD DATA INFILE statement to upload it to my MySQL database.

But I keep getting an error that my CSV has duplicates on the primary key. I am sure that it does not have duplicates on the primary key.

What could be the problem?

Here is my code:

$sql = "LOAD DATA INFILE '/a_bysch_store (2).csv' INTO TABLE a_bysch"
. " FIELDS TERMINATED BY ','"
. " LINES TERMINATED BY '\r\n'"
. " IGNORE 1 LINES"; 

// Execute the query directly (not as a prepared statement) and report any mysqli error
if (!($stmt = $mysqli->query($sql))) {
   echo "\nQuery execute failed: ERRNO: (" . $mysqli->errno . ") " . $mysqli->error;
}
CodeZombie
Victor Njoroge
  • Load it into something like Excel, then check whether it does have duplicates. You could also remove the primary key for testing. –  Apr 29 '14 at 20:15
  • Possible duplicate of [mysql duplicates with LOAD DATA INFILE](http://stackoverflow.com/questions/1965001/mysql-duplicates-with-load-data-infile) – e4c5 Sep 13 '16 at 07:24

1 Answer


Instead of LOAD DATA INFILE, use LOAD DATA LOCAL INFILE. With LOCAL, duplicate-key errors are demoted to warnings, so the file should import with the duplicate rows skipped. This is the same behavior you get from the IGNORE keyword, so you can use either one. I see you are using IGNORE 1 LINES, but that only skips the header row; it does not ignore duplicate keys.

Also, if you are using an auto-increment column as the primary key, don't include that column in the CSV file; let MySQL generate it.
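As a sketch, the query from the question could be adjusted like this (table name and file path are copied from the question; note that LOCAL also requires `local_infile` to be enabled on the server and, depending on your PHP setup, `mysqli_options($mysqli, MYSQLI_OPT_LOCAL_INFILE, true)` on the client):

$sql = "LOAD DATA LOCAL INFILE '/a_bysch_store (2).csv'"
     . " IGNORE INTO TABLE a_bysch"   // IGNORE skips rows that duplicate an existing key
     . " FIELDS TERMINATED BY ','"
     . " LINES TERMINATED BY '\r\n'"
     . " IGNORE 1 LINES";             // this part only skips the CSV header row

if (!$mysqli->query($sql)) {
    echo "\nQuery execute failed: ERRNO: (" . $mysqli->errno . ") " . $mysqli->error;
}

After the import, `$mysqli->query("SHOW WARNINGS")` will list the duplicate-key rows that were skipped, so you can verify whether the duplicates really exist.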

Len_D
  • 1,422
  • 1
  • 12
  • 21