
I am using an ODBC source connected to a Hadoop system to read the column PONum (value 4400023488) of data type text stream [DT_TEXT]. The data is converted to a Unicode string [DT_WSTR] using a Data Conversion transformation and then inserted into SQL Server via an OLE DB destination (the destination column's type is also Unicode string [DT_WSTR]).

I am able to insert the value into the SQL Server table, but in an incorrect format: 㐴〰㌵㠵㔹 instead of the expected 4400023488.

Any suggestions?

Hadi
  • Why use an ODBC source rather than the [SSIS Hadoop Components](https://www.sqlshack.com/importing-and-export-data-using-ssis-hadoop-components/)? Can you show the ODBC source configuration? – Hadi Mar 12 '22 at 17:47
  • Using a DSN, I connected to Hadoop through the ODBC source and wrote a SQL query to read the required columns from the table. The PONum column's data type in Hadoop is string; when previewed in SSIS, its data type is text stream [DT_TEXT]. In the destination table, PONum is nvarchar(255); when previewed in SSIS, its data type is Unicode string [DT_WSTR]. So I got the error "cannot convert between unicode and non-unicode string data types." To solve the Unicode issue I followed https://stackoverflow.com/questions/15547539/ssis-convert-between-unicode-and-non-unicode-error along with a Data Conversion transformation – user6787720 Mar 13 '22 at 13:28

1 Answer


I have two suggestions:

  1. Instead of using a Data Conversion transformation, use a Derived Column that converts the DT_TEXT value to DT_STR before converting it to Unicode:
(DT_WSTR, 4000)(DT_STR, 4000, 1252)[ColumnName]

Make sure that you replace 1252 with the appropriate code page for your source data.

Alternatively, you can use a Script Component: SSIS : Conversion text stream DT_TEXT to DT_WSTR
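For context on why the inserted value looks like CJK characters: when the single-byte (ANSI) bytes of the text stream reach an NVARCHAR column without a proper conversion, each pair of ASCII digit bytes gets read as one UTF-16 code unit. A minimal Python sketch of the mechanism (illustrative only, not part of the SSIS package):

```python
# Single-byte ANSI digits reinterpreted as UTF-16LE collapse into
# CJK-looking characters, like the garbled value in the question.
raw = b"4400023488"                    # the PONum value as ANSI/ASCII bytes
garbled = raw.decode("utf-16-le")      # two digit bytes become one character
print(garbled)                         # CJK-looking text, 5 characters
print(garbled.encode("utf-16-le").decode("ascii"))  # round-trips to 4400023488
```

Converting DT_TEXT to DT_STR with the right code page first, and only then to DT_WSTR, avoids this misinterpretation.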

  2. Use the Hadoop SSIS connection manager and HDFS source instead of using ODBC:
Hadi
  • using derived column resolved the issue, thanks a lot for your support – user6787720 Mar 14 '22 at 04:57
  • I also have one more conversion error: the column pndate from Hadoop has data type database timestamp with precision [DT_DBTIMESTAMP2] with value "2021-10-25 23:55:38.845818000", while the SQL data type is database timestamp [DT_DBTIMESTAMP], hence the error "Conversion failed because the data value overflowed the specified type." How do I do the conversion? – user6787720 Mar 14 '22 at 09:53
  • @user6787720 [DT_DBTIMESTAMP only supports 3 milliseconds fractions](https://stackoverflow.com/questions/70528019/why-is-the-ssis-variable-not-passing-the-milliseconds-part-into-the-query/) – Yahfoufi Mar 14 '22 at 09:57
  • @Yahfoufi - is that going to resolve my error "Conversion failed because the data value overflowed the specified type."? I am using an ODBC source and OLE DB is my destination – user6787720 Mar 14 '22 at 11:39
  • @user6787720 try using a derived column with the following expression: `(DT_DBTIMESTAMP)[date column]`. I think this will keep only three fractional digits. – Yahfoufi Mar 14 '22 at 11:55
  • @Yahfoufi - the column causing the error was not needed, so I ignored it, thanks for your assistance – user6787720 Mar 16 '22 at 08:58
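On the timestamp overflow discussed in the comments: DT_DBTIMESTAMP carries at most millisecond precision, so the nine fractional digits of a DT_DBTIMESTAMP2 value must be truncated before the cast succeeds. A small Python sketch of that truncation (illustrative only; in SSIS this would be the derived-column cast suggested in the comments):

```python
from datetime import datetime

ts = "2021-10-25 23:55:38.845818000"   # DT_DBTIMESTAMP2 value, 9 fractional digits
# Python's %f accepts at most 6 fractional digits, so trim to microseconds first
dt = datetime.strptime(ts[:26], "%Y-%m-%d %H:%M:%S.%f")
# DT_DBTIMESTAMP keeps only milliseconds: drop everything below that
dt_ms = dt.replace(microsecond=dt.microsecond // 1000 * 1000)
print(dt_ms)  # 2021-10-25 23:55:38.845000
```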