I defined a table with an auto-generated unique Id (IDENTITY):
CREATE TABLE Table1(
Id bigint IDENTITY PRIMARY KEY
,Version VARCHAR(10) NOT NULL
,Date DATE NOT NULL
,Code VARCHAR(10) NOT NULL);
INSERT INTO Table1(Version,Date,Code) VALUES ('1.0','2018-04-16','8615');
INSERT INTO Table1(Version,Date,Code) VALUES ('1.0','2018-04-16','2285');
INSERT INTO Table1(Version,Date,Code) VALUES ('1.0','2018-04-16','11625');
Now I have a .csv file with more data to insert. I assume I should use BULK INSERT, like this:
BULK INSERT Table1
FROM 'C:\test.csv'
WITH (
FIELDTERMINATOR = ','
,ROWTERMINATOR = '\n'
)
The input file contains:
1.0,2018-04-16,240061
1.0,2018-04-17,3435
1.0,2018-04-18,2143
1.0,2018-04-19,44
1.0,2018-04-20,2453
1.0,2018-04-01,2012
1.0,2018-04-22,123
1.0,2018-04-23,9887
1.0,2018-04-30,57
1.0,2018-05-1,576
1.0,2018-05-8,35
1.0,2018-05-9,867
1.0,2018-05-10,555
....
Running the BULK INSERT statement produces these errors:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (Id).
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 1 (Id).
I assume the problem is that the CSV has no Id column, so BULK INSERT maps the first CSV field (Version) to Id and the conversion to bigint fails. What is the best way to insert a lot of data (more than 10,000 rows) from the CSV into the table?
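For reference, a minimal sketch of one workaround I could try: bulk-load into a staging table that matches the CSV layout, then copy the rows into Table1 and let the IDENTITY column generate the Id values (the name Table1_Staging is just a placeholder).
-- staging table with the same layout as the CSV (no Id column)
CREATE TABLE Table1_Staging(
Version VARCHAR(10) NOT NULL
,Date DATE NOT NULL
,Code VARCHAR(10) NOT NULL);

BULK INSERT Table1_Staging
FROM 'C:\test.csv'
WITH (
FIELDTERMINATOR = ','
,ROWTERMINATOR = '\n'
);

-- copy into the real table; Id is generated by the IDENTITY column
INSERT INTO Table1(Version,Date,Code)
SELECT Version,Date,Code FROM Table1_Staging;
Would that be reasonable for this volume, or is there a more direct approach?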