
Keep in mind, I am not an experienced database person.

I have a CSV flat file that I want to import into a SQL database. It's not THAT big: 35 MB or so, with roughly 2,500 rows and 3,900 columns. Both the rows and the columns are unique, and the first row is the header.

I am having a hard time importing this into the local MySQL database that I am using with WAMP. It either times out, or I have to deal with a spinning death wheel for almost an hour before I get impatient about the fact that I am only uploading 35 MB and cancel the upload.

I also find it hard to accept that I would have to add each column one by one, typing an INSERT for EACH column.

Is there a way to upload this to MySQL efficiently? Thanks in advance.
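For reference, the efficient route in MySQL is usually a single bulk LOAD DATA statement rather than row-by-row INSERTs. Below is a minimal sketch using mysql-connector-python; the table name mydata, the database mydb, the credentials, and the file name data.csv are all placeholders, and local_infile has to be enabled on the server side as well:

```python
import mysql.connector  # pip install mysql-connector-python

# allow_local_infile must be True on the client, and the server needs
# local_infile=1; the credentials and names here are placeholders.
conn = mysql.connector.connect(
    host="localhost",
    user="root",
    password="",
    database="mydb",
    allow_local_infile=True,
)
cur = conn.cursor()

# One bulk-load statement instead of thousands of per-row INSERTs.
cur.execute("""
    LOAD DATA LOCAL INFILE 'data.csv'
    INTO TABLE mydata
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
""")
conn.commit()
conn.close()
```

This sidesteps the timeout problem because the server reads the file directly instead of parsing thousands of individual statements, but it does not solve the 3,900-column problem discussed in the comments below.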

charlie090
  • 3900 columns - seriously? – P.Salmon Apr 04 '20 at 15:07
  • yeah... is that a bad thing? – charlie090 Apr 04 '20 at 15:08
  • You should read https://dev.mysql.com/doc/refman/8.0/en/column-count-limit.html. – P.Salmon Apr 04 '20 at 15:12
  • From what I gather, I basically have to divide this into smaller sets/tables? – charlie090 Apr 04 '20 at 15:19
  • If you are going to use a relational database like MySQL, you should normalise your data. You could chop up the CSV file into digestible chunks, load them into staging tables and then push the data into their final tables, or you may choose to chop up your data to match the final db tables before you import. Either way it looks like you have a lot of work to do. – P.Salmon Apr 04 '20 at 15:23
  • Is there a way to automate this process of chopping and loading? – charlie090 Apr 04 '20 at 15:25
  • Nope - only you know what your final design is. You could use something like Python, VB or whatever tech you are comfortable with (a rough sketch follows after these comments). – P.Salmon Apr 04 '20 at 15:25
  • The fact that I have reached the 65535-byte row size limit signifies that I won't be able to fit this dataset in one table? – charlie090 Apr 04 '20 at 15:27
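
To automate the chopping P.Salmon describes, one possible approach is a small script that splits the wide CSV into slices of a few hundred columns each, repeating the first column as a shared row key so the pieces can be joined back together later. This is only a sketch under assumptions, not anything from the thread: the 500-column chunk size, the file names, and treating column 0 as the key are all placeholders.

```python
import csv

CHUNK_COLS = 500   # columns per chunk (assumed; pick a size that keeps each table under the row-size limit)
SRC = "data.csv"   # hypothetical input file name

with open(SRC, newline="") as f:
    reader = csv.reader(f)
    header = next(reader)   # first row is the header
    data = list(reader)

# Slice the wide column set into chunks, repeating column 0 (the row key)
# in every output file so the pieces can be joined back together.
for i, start in enumerate(range(1, len(header), CHUNK_COLS)):
    cols = slice(start, start + CHUNK_COLS)
    with open(f"chunk_{i}.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow([header[0]] + header[cols])
        for row in data:
            writer.writerow([row[0]] + row[cols])
```

Each resulting chunk_N.csv can then be bulk-loaded into its own staging table with a LOAD DATA statement like the one sketched above, and the staging tables joined on the key column.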

0 Answers