
I was asked to import a set of CSV files into MySQL.

They have 20k CSV files, each containing from 100 to 100k lines of data.

So here is what I thought I could do:

On my machine I could join all the files into one using a simple command:

 copy *.csv  combined.csv 

and then process that combined file with PHP's fgetcsv and insert the rows into the DB.

Or

I could write a PHP program that reads a directory, processes each file, and moves it to another folder when done (something like the rough sketch below).
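For the second option, this is roughly what I have in mind; the credentials, paths, the table name mytable and its three columns are just placeholders:

<?php
// Rough sketch: read each CSV from an incoming directory, insert its rows,
// then move the file to a "done" directory. Credentials, paths, table and
// column names are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=DATABASE;charset=utf8', 'root', 'PASSWORD');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$incoming = '/path/to/incoming';
$done     = '/path/to/done';

$stmt = $pdo->prepare('INSERT INTO mytable (col1, col2, col3) VALUES (?, ?, ?)');

foreach (glob($incoming . '/*.csv') as $file) {
    $handle = fopen($file, 'r');
    if ($handle === false) {
        continue; // skip unreadable files
    }

    $pdo->beginTransaction(); // one transaction per file: faster, and restartable per file
    while (($row = fgetcsv($handle)) !== false) {
        if ($row === [null]) {
            continue; // skip blank lines
        }
        $stmt->execute($row);
    }
    $pdo->commit();
    fclose($handle);

    rename($file, $done . '/' . basename($file)); // move the processed file out of the way
    echo basename($file) . " imported\n";
}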

A dedicated server is available.

A good PC with 8 GB of memory is available.

What do you suggest?

Tarek
    regardless of whether you combine the files or not, you want to import them via [`LOAD DATA INFILE`](http://stackoverflow.com/questions/3025648/import-csv-to-mysql) – Gordon Mar 06 '12 at 23:15
  • You should have left that as an answer :) – Tarek Mar 07 '12 at 00:28
    no, because if that answers your question already, your question is a duplicate and should be closed ;) – Gordon Mar 07 '12 at 08:24

2 Answers


Use LOAD DATA INFILE. It's better to insert the data file by file.

LOAD DATA INFILE '/path/to/data.csv' 
INTO TABLE `table` 
FIELDS TERMINATED BY ',' 
LINES TERMINATED BY '\n' 

Run it in batch mode for the 20k files. Put the password in the command so it doesn't prompt for it each time.

#!/bin/bash

# Loop over every CSV file and load it with LOAD DATA INFILE.
# Note: without LOCAL, the file path must be readable by the MySQL server
# itself and the user needs the FILE privilege.
for f in /path/to/*.csv; do
    mysql -uroot -pPASSWORD DATABASE \
    -e "LOAD DATA INFILE '$f'
        INTO TABLE \`table\`
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\n'"
    echo "$f is imported ..."
done
Shiplu Mokaddim

It would be a much better idea to do it file by file. That allows you to restart if something goes wrong. Also, are all the files identical in format?

You can use a PHP script to generate the SQL commands and execute them.
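A minimal sketch of that idea (placeholder credentials, paths, and a hypothetical table name mytable; it builds one LOAD DATA INFILE statement per file and runs it through PDO):

<?php
// Sketch: build one LOAD DATA INFILE statement per CSV file and execute it.
// Credentials, database, table and paths are placeholders. The files must be
// readable by the MySQL server and the user needs the FILE privilege.
$pdo = new PDO('mysql:host=localhost;dbname=DATABASE', 'root', 'PASSWORD');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

foreach (glob('/path/to/*.csv') as $file) {
    $sql = sprintf(
        "LOAD DATA INFILE %s
         INTO TABLE `mytable`
         FIELDS TERMINATED BY ','
         LINES TERMINATED BY '\\n'",
        $pdo->quote($file)   // quote() safely wraps the path in SQL quotes
    );
    $pdo->exec($sql);
    echo "$file imported\n";
}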

Toby Allen