
I have a table that I want to upload from one platform (SQL Server) into Teradata. Normally I use the teradatasql module in Python to do this, but this is the first time I have run into issues with NULL values. When I load the table into pandas, the NULLs show as NaN, and I get errors when inserting the dataframe into Teradata even though the data types are correct. For example, I have a column with NaN values being inserted into a column with a data type of DECIMAL(15,4). If I leave the NaN values in, I get the error "Numeric overflow occurred during computation". However, if I use fillna(0) to replace those NaN values, I get no errors.

It is similar with strings: I get errors such as "Batch row 4 bound parameter 35 type VARCHAR (448) differs from batch row 1 type FLOAT (480)". When I look at the dataframe, rows 0 to 3 show NaN, and once I fill in the blanks there, the error disappears. The data type of the target column is VARCHAR(255).
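For reference, this is roughly the pattern I use for the insert; the host, database, table, and column names below are just placeholders standing in for my real ones:

```python
import pandas as pd
import teradatasql

# Example frame standing in for the data pulled from SQL Server;
# in practice it comes from pd.read_sql against the source table.
df = pd.DataFrame({
    "amount": [12.5, None, 7.25],    # target column is DECIMAL(15,4); None becomes NaN
    "comment": [None, None, "ok"],   # target column is VARCHAR(255)
})

with teradatasql.connect(host="tdhost", user="user", password="pwd") as con:
    with con.cursor() as cur:
        cur.executemany(
            "INSERT INTO mydb.mytable (amount, comment) VALUES (?, ?)",
            df.values.tolist(),  # rows containing NaN are what trigger the errors above
        )
```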

Is there a way to insert the values of the dataframe even when some values show NaN?

  • You may need to convert to `None` instead of NaN (which is FLOAT), as described here https://stackoverflow.com/questions/14162723/replacing-pandas-or-numpy-nan-with-a-none-to-use-with-mysqldb/54403705#54403705 – Fred Aug 19 '21 at 18:18
  • @Fred, that was it. Thank you so much. – chulo Aug 19 '21 at 23:32
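For anyone who lands here with the same problem, the conversion Fred pointed to looks roughly like this (assuming `df` is the dataframe about to be inserted):

```python
import numpy as np
import pandas as pd

# Cast to object first so pandas does not coerce None back to NaN
# in float columns, then replace every NaN/NaT with None.
df = df.astype(object).where(pd.notnull(df), None)

# On recent pandas versions this also works:
# df = df.replace({np.nan: None})

# df.values.tolist() now yields None for missing values, which the
# teradatasql driver binds as SQL NULL instead of a FLOAT NaN.
```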

0 Answers