
I have a dataframe that I want to upload to a SQL Server database.

I have been looking at the pandas to_sql method but I can't seem to get it to work.

My dataframe has, say, 500 rows and 3 columns:

column name      data type
dateLg           datetime
temperature      float64
city             object

And so my SQL table is called tblCityTemperature

  column name          data type
  DateLeg              datetime
  Temp                 float
  CC                   nvarchar(20)

Is there a way of mapping the columns in a dataframe to my sql table?

I tried the following, where tblColNames is a list of the three column names in my SQL table.

df.to_sql("tblPrices", cnxn, index_label=tblColNames)

I get the error message below, which doesn't really make sense to me:

ValueError: Length of 'index_label' should match number of levels, which is 1

mHelpMe

1 Answer


You can rename your columns to match the SQL table before calling to_sql:

df.columns = ['DateLeg', 'Temp', 'CC']

Then you just need:

df.to_sql("tblPrices", cnxn, if_exists='append')
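
Equivalently, a rename with an explicit mapping keeps the column correspondence visible. A minimal sketch, assuming cnxn is the connection from the question and the dataframe columns are named as described there:

    # map the dataframe's column names onto the SQL table's column names
    column_map = {"dateLg": "DateLeg", "temperature": "Temp", "city": "CC"}
    df = df.rename(columns=column_map)

    # append to the existing table; index=False avoids writing the dataframe index as an extra column
    df.to_sql("tblCityTemperature", cnxn, if_exists="append", index=False)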
BENY
  • Thanks, does it matter if the columns are in a different order to the sql they are being appended to? I'm getting this error... DatabaseError: Execution failed on sql 'SELECT name FROM sqlite_master WHERE type='table' AND name=?;': ('42S02', "[42S02] [Microsoft][ODBC SQL Server Driver][SQL Server]Invalid object name 'sqlite_master'. (208) (SQLExecDirectW); [42S02] [Microsoft][ODBC SQL Server Driver][SQL Server]Statement(s) could not be prepared. (8180)") – mHelpMe Feb 24 '19 at 19:42
  • I am able to get data out of the SQL table just fine – mHelpMe Feb 24 '19 at 19:42
  • @mHelpMe if the names are the same it is OK, and you may need to add index=False as well since you are appending to an existing table – BENY Feb 24 '19 at 19:45
  • @mHelpMe also it is better to check the approach mentioned here https://stackoverflow.com/questions/43136121/questions-about-pandas-to-sql and change the way you are connecting (see the sketch below) – BENY Feb 24 '19 at 19:49
  • thanks just trying that now... still having database issues. Will keep plugging away at it – mHelpMe Feb 24 '19 at 20:41
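
For reference, the approach in the linked question passes a SQLAlchemy engine to to_sql rather than a raw pyodbc connection; pandas treats a bare DBAPI connection as SQLite, which is where the sqlite_master error in the comments comes from. A minimal sketch, with placeholder server, database and driver names:

    import urllib.parse
    from sqlalchemy import create_engine

    # placeholder ODBC connection string -- replace server, database and driver with your own
    params = urllib.parse.quote_plus(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myServer;"
        "DATABASE=myDatabase;"
        "Trusted_Connection=yes;"
    )
    engine = create_engine("mssql+pyodbc:///?odbc_connect=" + params)

    # df is the dataframe from the question, with columns already renamed to match the table
    df.to_sql("tblCityTemperature", engine, if_exists="append", index=False)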