
I have a CSV file with more than one lakh (100,000) records. When I use this file to insert the records into my database, it takes more than half an hour. Is there any other way to increase the efficiency and speed of this process, or are there any frameworks available for it?

Ben10
  • How often do you have to add these records, and how are you currently doing it? – Tim Biegeleisen Oct 30 '15 at 07:24
  • How are you trying to insert the CSV? If you are using phpMyAdmin you can import a CSV into a table, and it wouldn't take more than 30 seconds. – ambe5960 Oct 30 '15 at 07:26
  • Add some more info. Are you inserting the CSV data, like firstname and lastname, into the database as firstname='firstname', lastname='lastname'? – Wim Pruiksma Oct 30 '15 at 07:27

2 Answers


Use "mysqlimport" command. It works fast and is suited for large CSV files.

mysqlimport --ignore-lines=1 \
            --fields-terminated-by=, \
            --local -u root \
            -p Database \
            TableName.csv
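
Note that mysqlimport derives the target table name from the CSV file name (here TableName), and it is essentially a command-line front end to MySQL's LOAD DATA INFILE statement. If you would rather run the SQL directly from a client, a roughly equivalent statement is sketched below; the file path is a placeholder, and LOCAL requires local_infile to be enabled on both the client and the server.

LOAD DATA LOCAL INFILE '/path/to/TableName.csv'
INTO TABLE TableName
FIELDS TERMINATED BY ','
IGNORE 1 LINES;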
Ninju

Convert the CSV file into a file containing SQL INSERT statements using a text editor, then execute it from the console with:

mysql -u user -ppassword db_name < file_path.sql
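
The conversion runs fastest when rows are batched into multi-row INSERT statements and wrapped in a transaction, rather than one INSERT per record. A minimal sketch of what file_path.sql might contain, assuming a hypothetical table people(firstname, lastname):

-- file_path.sql (sketch; table and column names are hypothetical)
START TRANSACTION;
INSERT INTO people (firstname, lastname) VALUES
  ('John', 'Doe'),
  ('Jane', 'Smith'),
  ('Alex', 'Brown');
-- ...continue in batches of a few thousand rows per INSERT...
COMMIT;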
Alexander Elgin