
I have a flat file with almost 300 columns that needs to be imported into SQL Server. It seems that if I use SSIS it can't read the csv file if I mark it as Unicode. It seems that it loses the ability to recognize CR and LF.

The specified header or data row delimiter "{CR}{LF}" is not found after scanning 524288 bytes of the file "...\contact.csv". Do you want to continue scanning this file?

What am I doing wrong?

EDIT Based on comments, it seems I need to clarify - yes I did check that {CR}{LF} is present at the end of each line, and that it's set as a row delimiter in the connector.

The problem is with the "unicode" checkbox. If I uncheck it, the file is read fine. If I check it, it doesn't find {CR}{LF} any more.

It also doesn't matter what encoding I set on the file, as that only affects the default "code page" selection.

Mordechai
  • If you open the file in Notepad++ and turn on "Show All Characters", can you see the {CR}{LF}? It might just be {LF}; check out this post https://stackoverflow.com/questions/6268205/why-doesnt-ssis-recognize-line-feed-lf-row-delimiter-while-importing-utf-8-fl?rq=1 – Ockert May 11 '22 at 18:35
  • In addition to what Ockert has said, make sure that in your flat file connection manager the delimiter is set appropriately. – cdbullard May 11 '22 at 21:14
  • thanks, I did check with notepad++ and {CR}{LF} is there, and the delimiter is set correctly. – Mordechai May 12 '22 at 05:39
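
The checks suggested in the comments can also be automated. As a sketch (not an official tool; the file path is a placeholder), a short Python script can report the file's byte-order mark and count its line endings, which matters here because in a UTF-16 file the {CR}{LF} delimiter is the byte sequence `0D 00 0A 00`, not the ANSI bytes `0D 0A`:

```python
def inspect_csv(path):
    """Report the BOM (if any) and line-ending counts of a text file.

    Useful when SSIS claims it cannot find {CR}{LF}: in a UTF-16 file
    the row delimiter is the bytes 0D 00 0A 00, not the ANSI bytes 0D 0A.
    """
    with open(path, "rb") as raw:
        data = raw.read()

    # Recognize the common byte-order marks.
    if data.startswith(b"\xef\xbb\xbf"):
        bom, enc = "UTF-8", "utf-8-sig"
    elif data.startswith(b"\xff\xfe"):
        bom, enc = "UTF-16 LE", "utf-16"
    elif data.startswith(b"\xfe\xff"):
        bom, enc = "UTF-16 BE", "utf-16"
    else:
        bom, enc = None, "latin-1"  # byte-preserving fallback decode

    text = data.decode(enc)
    crlf = text.count("\r\n")
    lone_lf = text.count("\n") - crlf
    return bom, crlf, lone_lf
```

If the script reports a UTF-16 BOM and only {CR}{LF} endings, the file itself is fine and the problem is in how the connection manager interprets it.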

1 Answer


OK, after a while I found an answer.
The Unicode checkbox is still not working, but you can go to the Advanced section of the Flat File Connection Manager and set your string columns to Unicode (DT_WSTR). It's kind of tedious, and I don't know what I would do if I had 200 columns, but for my small data set it worked.
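
For a package with hundreds of columns, one option is to script the change instead of clicking through the Advanced page. A .dtsx file is XML, and in the SSIS type enumeration DT_STR is 129 and DT_WSTR is 130, so a small script can flip every string column at once. This is only a sketch under the assumption that your package version uses the `DTS:FlatFileColumn` element and `DTS:DataType` attribute shown below; the layout varies across SSIS versions, so back up the package and diff the result before opening it in SSDT:

```python
import xml.etree.ElementTree as ET

# The DTS namespace used in .dtsx package XML.
DTS = "www.microsoft.com/SqlServer/Dts"
ET.register_namespace("DTS", DTS)

def widen_string_columns(dtsx_in, dtsx_out):
    """Rewrite every DT_STR flat-file column (DataType 129) to DT_WSTR (130).

    Sketch only: element and attribute names can differ between SSIS
    versions, so verify against your own .dtsx before trusting the edit.
    """
    tree = ET.parse(dtsx_in)
    changed = 0
    for col in tree.iter(f"{{{DTS}}}FlatFileColumn"):
        if col.get(f"{{{DTS}}}DataType") == "129":
            col.set(f"{{{DTS}}}DataType", "130")  # 129 = DT_STR, 130 = DT_WSTR
            changed += 1
    tree.write(dtsx_out, xml_declaration=True, encoding="utf-8")
    return changed
```

Column widths may also need attention (a DT_WSTR column's length is counted in characters, not bytes), so spot-check a few columns in the editor after the rewrite.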
