I have a CSV file with 2550 columns that I want to import into PostgreSQL.
To import a CSV into PostgreSQL, I should first create the table and then use \copy
to load the CSV into it. But what if the table has a huge number of columns, as in my case, so that I cannot create the table manually?
Is there any solution?
Update
The data structure is as follows; dZ(01) through dZ(2550) hold values between -50 and +50:
id | date       | time     | localtime | pid  | dZ(0) ... dZ(2550)        |
---|------------|----------|-----------|------|---------------------------|
17 | 11-11-2014 | 16:33:21 | 1.45E+15  | 2375 | 0 0 0 0 0 -1 0 -1 0 -5 -10|
CSV structure (I used the ';' delimiter):
17;13-11-2014;08:09:37;1.45E+15;4098;0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 -4 3 0 -2 3 -2 1 0 0 1 1 3 -2 3 4 2 -2 -2 ....
This is one line of data.
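One way to avoid typing 2550 column definitions by hand is to generate the CREATE TABLE statement with a short script. The sketch below is only an illustration: the table name `measurements`, the column names `dz_1 .. dz_2550`, and the chosen types are my assumptions, not anything fixed by the data.

```python
# Sketch: build the CREATE TABLE DDL programmatically instead of by hand.
# Table and column names ("measurements", "dz_<i>", "local_time") are assumptions.
fixed_cols = [
    "id integer",
    '"date" date',
    '"time" time',
    "local_time double precision",  # "localtime" itself is a reserved word in PostgreSQL
    "pid integer",
]
# dZ(1) .. dZ(2550); values between -50 and +50 fit comfortably in smallint
dz_cols = [f"dz_{i} smallint" for i in range(1, 2551)]

ddl = ("CREATE TABLE measurements (\n    "
       + ",\n    ".join(fixed_cols + dz_cols)
       + "\n);")

print(ddl.splitlines()[0])                       # first line of the generated DDL
print(f"{len(fixed_cols) + len(dz_cols)} columns defined")
```

Note, though, that PostgreSQL limits a table to 1600 columns, so a DDL this wide would be rejected at CREATE TABLE time. Since the dZ values are space-separated inside a single semicolon-delimited field anyway, a single array column (e.g. `dz smallint[]`) may be the more practical layout.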