
I have data in a CSV file similar to this:

Name,Age,Location,Score
"Bob, B",34,Boston,0
"Mike, M",76,Miami,678
"Rachel, R",17,Richmond,"1,234"

While trying to BULK INSERT this data into a SQL Server table, I encountered two problems.

  1. If I use FIELDTERMINATOR=',' then it splits the first (and sometimes the last) column
  2. The last column is an integer column, but it contains quotes and a comma thousands separator whenever the number is greater than 1000

Is there a way to import this data (using XML Format File or whatever) without manually parsing the csv file first?

I appreciate any help. Thanks.

Malganis
  • try DTSWizard. It comes along with all versions of sql server - free and paid. – Erran Morad Mar 21 '14 at 17:34
  • Do you have to do this programmatically, or is SQL Server Import and Export Wizard an option? – maelstrom Mar 21 '14 at 17:34
  • Sorry, I have to do this programmatically in T-SQL. – Malganis Mar 21 '14 at 17:51
  • @Malganis Just so you know, the import/export wizard can do this without any issues as you can specify a text qualifier `"` for the data. I don't think there is any way to do that using `bulk insert` in t-sql. You're best bet might be to preprocess the file and change the delimiter to something else (using a regex tool for instance), or import every row as is (without splitting), and split it after importing. – jpw Mar 21 '14 at 17:54
  • While `BULK INSERT` cannot do this (outside of inserting the whole line and parsing within SQL), there are many ways, such as @Borat's DTSWizard, that will. With a data format like that, all will be somewhat complex to get working. – Philip Kelley Mar 21 '14 at 18:06
  • @jpw, do you know what the import/export wizard is a UI for? SSIS, bcp, something else? – Philip Kelley Mar 21 '14 at 18:08
  • @PhilipKelley I believe it is DTS (https://en.wikipedia.org/wiki/Data_Transformation_Services), but I'm not sure. – jpw Mar 21 '14 at 18:11
  • Since it has to be done programmatically in tsql, is `OPENROWSET(BULK...FORMATFILE...)` acceptable? – Dave Mason Mar 21 '14 at 18:25
  • Did you find a solution to your issue? – Dave Mason Mar 25 '14 at 14:03
  • I am sorry to keep everyone in suspense. Basically, I couldn't get FORMATFILE to work in this specific case, so I ended up parsing the data manually. – Malganis Apr 07 '14 at 16:40
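For later readers: on SQL Server 2017 and newer, `BULK INSERT` accepts `FORMAT = 'CSV'` and `FIELDQUOTE`, which handles the quoted fields directly; the thousands separator in Score still needs a staging column before it can be converted to an integer. A sketch, where the table and file names are placeholders:

    -- Assumes SQL Server 2017+; dbo.PeopleStaging, dbo.People, and the
    -- file path are placeholder names for illustration.
    CREATE TABLE dbo.PeopleStaging (
        Name     varchar(100),
        Age      int,
        Location varchar(100),
        Score    varchar(20)   -- holds "1,234" until the comma is stripped
    );

    BULK INSERT dbo.PeopleStaging
    FROM 'C:\data\filename.csv'
    WITH (
        FORMAT = 'CSV',       -- RFC 4180-style parsing (SQL Server 2017+)
        FIELDQUOTE = '"',     -- honors the quotes around "Bob, B" and "1,234"
        FIRSTROW = 2          -- skip the header row
    );

    -- Convert Score after loading, removing the thousands separator.
    INSERT INTO dbo.People (Name, Age, Location, Score)
    SELECT Name, Age, Location, CAST(REPLACE(Score, ',', '') AS int)
    FROM dbo.PeopleStaging;

On versions before 2017, an XML format file cannot express a text qualifier either, so the staging-and-split approach (load each whole line, then parse in T-SQL) was the usual workaround.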

2 Answers


You can parse the file with http://filehelpers.sourceforge.net/

And with that result, use the approach here: SQL Bulkcopy YYYYMMDD problem or straight into SqlBulkCopy

Pleun

Use MySQL `LOAD DATA`:

LOAD DATA LOCAL INFILE 'path-to-/filename.csv' INTO TABLE `sql_tablename` 
CHARACTER SET 'utf8' 
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
IGNORE 1 LINES;

The OPTIONALLY ENCLOSED BY '\"' part (an escaped double quote) keeps quoted values such as "Bob, B" together as a single field instead of splitting on the embedded comma.

IGNORE 1 LINES skips the header row with the field names.

The CHARACTER SET 'utf8' line is optional, but good to use if names contain diacritics, like in José.
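Note that if the target Score column is an integer, the quoted "1,234" will still fail the conversion. MySQL lets you strip the separator during the load by reading the field into a user variable and using a SET clause; a sketch, with the column names assumed from the sample data:

    LOAD DATA LOCAL INFILE 'path-to-/filename.csv' INTO TABLE `sql_tablename`
    CHARACTER SET 'utf8'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    IGNORE 1 LINES
    (Name, Age, Location, @score)
    SET Score = REPLACE(@score, ',', '');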

motorbaby