
I have a really large CSV file with 300 columns and 12,000 rows.

I want to import this into the database in one go using phpMyAdmin, without splitting the CSV.

At the moment it can only take up to 500 rows of the CSV.

I have changed my php.ini file:

max_execution_time=10000
memory_limit=1000M
post_max_size=8000M

And also the MySQL config file my.ini:

innodb_lock_wait_timeout = 900800
max_allowed_packet = 9000M

Still, after doing this I get the error 'MySQL server has gone away'.

Is there anything else I can do, or is there a better way to import my large CSV file than phpMyAdmin?

Victor Njoroge
  • Don't do this using phpmyadmin; do it using LOAD DATA INFILE from the MySQL command prompt (a minimal sketch follows after these comments). – Mark Baker Mar 09 '14 at 18:37
  • What Mark Baker said and here is an example of that: http://stackoverflow.com/questions/11430223/import-large-csv-file-using-phpmyadmin/11430498#11430498 – Fabien Snauwaert Sep 11 '14 at 06:20
  • Someone who does not have access to mysql from the command-line could try merely chunking the CSV file into smaller subfiles (see the splitting sketch below). (Less convenient but should get the job done eventually. Increasing size limits in php.ini helps but is not always sufficient.) – Fabien Snauwaert Sep 11 '14 at 06:21
  • phpMyAdmin can automatically create the table structure, which is something `load data infile` does not do. – zhy Apr 11 '16 at 09:01
  • I just realized that determining the table structure by setting a maximum field length is a good idea, so we need to use `load data` only once! – zhy Apr 11 '16 at 10:24
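
Following Mark Baker's suggestion, here is a minimal sketch of running `LOAD DATA LOCAL INFILE` from PHP via mysqli, for anyone who prefers a script over the MySQL prompt. The credentials, database name, table name, and file path are placeholders, and the field/line terminators assume a plain comma-separated file with a header row; adjust them to match the actual CSV.

```php
<?php
// Minimal sketch: bulk-load a CSV with LOAD DATA LOCAL INFILE via mysqli.
// Hypothetical credentials, database, table, and file path -- adjust to your setup.
$db = new mysqli('localhost', 'user', 'password', 'mydb');
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error);
}

// LOCAL reads the file from the client side; it must be enabled on both ends
// (mysqli.allow_local_infile in php.ini, local_infile on the MySQL server).
$sql = "LOAD DATA LOCAL INFILE '/path/to/data.csv'
        INTO TABLE my_table
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES";  // skip the header row

if ($db->query($sql)) {
    echo 'Imported ' . $db->affected_rows . " rows\n";
} else {
    echo 'Import failed: ' . $db->error . "\n";
}
$db->close();
```

Note that `LOAD DATA` expects the CSV column order to match the table definition; with 300 columns, appending an explicit column list to the statement is safer if they differ.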
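
And for Fabien Snauwaert's fallback, a sketch of chunking the CSV into subfiles small enough for phpMyAdmin's import limits. The file paths are placeholders, and the 500-row chunk size is taken from the limit mentioned in the question.

```php
<?php
// Minimal sketch: split a big CSV into ~500-row subfiles that phpMyAdmin
// can import one at a time. File paths and chunk size are placeholders.
$rowsPerChunk = 500;

$in = fopen('/path/to/data.csv', 'r');
$header = fgetcsv($in);                  // keep the header row for every chunk

$chunk = 0;
$row   = 0;
$out   = null;
while (($fields = fgetcsv($in)) !== false) {
    if ($row % $rowsPerChunk === 0) {    // start a new subfile
        if ($out !== null) fclose($out);
        $out = fopen(sprintf('/path/to/chunk_%03d.csv', ++$chunk), 'w');
        fputcsv($out, $header);          // repeat the header in each subfile
    }
    fputcsv($out, $fields);
    $row++;
}
if ($out !== null) fclose($out);
fclose($in);

echo "Wrote $chunk file(s) of at most $rowsPerChunk data rows each\n";
```

Using `fgetcsv`/`fputcsv` rather than reading raw lines keeps quoted fields with embedded newlines intact, which matters for a 300-column file.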
