I tried the following script:

LOAD DATA LOCAL INFILE 'myfile.csv'
    REPLACE INTO TABLE `mydb`.`mytable`
    CHARACTER SET latin1 FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES (`field1`, `field2`, `field3`, `field4`, `field5`, `field6`);

When I use a file of 500K records it works, but when I try a CSV file of 4 million records it returns:

Query OK, 0 rows affected (2.79 sec) 
Records: 0  Deleted: 0  Skipped: 0  Warnings: 0

And of course nothing was added in 2.79 seconds!

My RAM is 4GB and my input file (the large one) is 370MB.

Can anyone suggest a solution?

1 Answer

It's possible that the line endings in the large file are not '\r\n'.

Change LINES TERMINATED BY '\r\n' to LINES TERMINATED BY '\n'.
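
For reference, here is the question's statement with only the line terminator changed (table and column names taken from the question as-is):

LOAD DATA LOCAL INFILE 'myfile.csv'
    REPLACE INTO TABLE `mydb`.`mytable`
    CHARACTER SET latin1 FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES (`field1`, `field2`, `field3`, `field4`, `field5`, `field6`);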

Phil
  • what to do in a situation where you could get either `\r\n` or `\n`, and can't predict it in advance? (see http://stackoverflow.com/questions/10935219/mysql-load-data-infile-works-but-unpredictable-line-terminator) – SDC Jun 07 '12 at 16:14
  • Conversely, change `\r\n` to `\r`. Adding the `LINES TERMINATED BY '\r'` clause to my query worked, taking the 0 count to the expected 9,999. – SushiGuy May 01 '17 at 23:15
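
On the mixed-ending question in the comments above: a common workaround (a sketch, not part of this answer; the `@last` user variable is a placeholder name) is to terminate lines on '\n' and strip any stray carriage return from the last column with a user variable and a SET clause:

LOAD DATA LOCAL INFILE 'myfile.csv'
    REPLACE INTO TABLE `mydb`.`mytable`
    CHARACTER SET latin1 FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (`field1`, `field2`, `field3`, `field4`, `field5`, @last)
    -- remove a trailing CR left over from \r\n line endings
    SET `field6` = TRIM(TRAILING '\r' FROM @last);

This loads correctly whether a given line ends in '\n' or '\r\n', since the only difference is the trailing '\r' that lands in the final field.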