
I am trying to store all the contents of a CSV file into a table row.

Table structure:

Field          Type         Null       Comment
id             bigint(15)   NOT NULL   Primary key for the table
file_id        bigint(15)   NULL       Reference key from the filemapper table
file_content   longtext     NULL       Content of the file
dataTime       timestamp    NOT NULL
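Roughly, the table was created like this (the table name, AUTO_INCREMENT and the timestamp default below are only illustrative, they are not part of the structure above):

CREATE TABLE file_store (
  id           BIGINT(15) NOT NULL AUTO_INCREMENT,  -- primary key
  file_id      BIGINT(15) NULL,                     -- reference key from the filemapper table
  file_content LONGTEXT   NULL,                     -- content of the file
  dataTime     TIMESTAMP  NOT NULL DEFAULT CURRENT_TIMESTAMP,
  PRIMARY KEY (id)
);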

The problem arises when I try to insert the contents of a CSV file into this table: the statement fails with the error "MySQL server has gone away".
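The insert itself is nothing special; it is roughly this (the table name and values are placeholders):

-- the whole CSV is sent as one, possibly very large, string value for file_content
INSERT INTO file_store (file_id, file_content, dataTime)
VALUES (42, '<entire CSV file contents as one string>', NOW());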

I am not trying to split the CSV file's fields into separate columns in the MySQL table.

Can anyone suggest another way to do this without causing the error?

Rohit Batra
    Similar question: http://stackoverflow.com/questions/12425287/mysql-server-has-gone-away-when-importing-large-sql-file – Jake Bathman Feb 13 '15 at 04:50
@JakeB. thanks a lot. I thought that wouldn't apply to my issue, since the error code was 2013 rather than the 2006 mentioned in that question. – Rohit Batra Feb 13 '15 at 04:57
  • possible duplicate of [Mysql server has gone away while storing large (2MB) audio file in LONGBLOB](http://stackoverflow.com/questions/28492429/mysql-server-has-gone-away-while-storing-large-2mb-audio-file-in-longblob) – Leandro Papasidero Feb 13 '15 at 05:19

2 Answers

"MySQL server has gone away" has the following causes and solutions:

1. The server timed out and closed the connection. To fix this, check that the wait_timeout MySQL variable in your my.cnf configuration file is large enough.
2. The server dropped an incorrect or too-large packet. If mysqld receives a packet that is too large or malformed, it assumes something has gone wrong with the connection and closes it. To fix this, increase the maximum packet size limit max_allowed_packet in my.cnf, e.g. set max_allowed_packet = 128M (a sample my.cnf snippet is sketched below).

You can also refer to this question: http://stackoverflow.com/questions/12425287/mysql-server-has-gone-away-when-importing-large-sql-file
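A minimal sketch of the relevant [mysqld] section of my.cnf (the values are examples; adjust them to your connection patterns and file sizes):

[mysqld]
# keep idle connections open long enough (value is in seconds)
wait_timeout       = 28800
# allow client/server packets, and therefore single column values, up to 128 MB
max_allowed_packet = 128M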
logsv

To fix this you have to increase the maximum packet size limit,

max_allowed_packet

in the .cnf file, for example:

max_allowed_packet = 128M

Then restart MySQL:

sudo /etc/init.d/mysql restart
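To confirm the new limit is in effect, or to raise it at runtime when you cannot restart immediately (a runtime change only applies to new connections and is lost at the next restart), you can run:

SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 134217728;  -- 128 MB, specified in bytes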

Alternatively, you can import the CSV with LOAD DATA INFILE.

You need to set the field and line terminators:

FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY ''

For a CSV file the statement should look like this:

LOAD DATA INFILE 'path/to/example.csv' INTO TABLE example FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
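If, as in the question, the goal is to keep the whole file in the single file_content column rather than splitting it into fields, LOAD_FILE() is one possible alternative; it requires the FILE privilege, a file located on the database server host, and max_allowed_packet larger than the file (the table name here is a placeholder):

INSERT INTO file_store (file_id, file_content, dataTime)
VALUES (42, LOAD_FILE('/path/to/example.csv'), NOW());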

Hope this works for you.

sreenivas