
Loading a pandas DataFrame into Postgres using odo does not work.

I first use SQLAlchemy to create a table object. The name of the table on my local Postgres instance is `testaccounts`.

from sqlalchemy import MetaData, Table
from odo import odo

meta = MetaData(bind=engine)  # engine is an existing SQLAlchemy engine for my local Postgres instance
table = Table('testaccounts', meta, schema='custom')
odo(df, table)

When I tried the code above, I got this error:

ValueError: Column names of incoming data don't match column names of
existing SQL table
Names in SQL table: []
Names from incoming data: ['id', 'name', 'country']

I also tested this by manually creating the table with these three columns (roughly as sketched below), but it still shows the same error. I think I'm doing something wrong but am not sure what. Can someone point me in the right direction?
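
For reference, the manual attempt looked roughly like this; the column definitions are illustrative placeholders, not necessarily the exact types I used:

from sqlalchemy import MetaData, Table, Column, Integer, String

meta = MetaData(bind=engine)
table = Table('testaccounts', meta,
              Column('id', Integer),
              Column('name', String),
              Column('country', String),
              schema='custom')
table.create(checkfirst=True)  # create the table in Postgres if it does not already exist
odo(df, table)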

Lucas Neo
    Does `df.to_sql('testaccounts', engine, schema='custom')` work? – joris Jun 14 '16 at 07:48
  • Yes, it works, but I can't use `to_sql` for the whole table because it is very slow. I tried using it to insert just the first row and then using `odo` to copy the remaining rows (rough sketch below). – Lucas Neo Jun 15 '16 at 03:13
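
Rough sketch of the hybrid approach described in the comment above, assuming the `engine` and `custom` schema from the question; the reflection step and the `head(1)`/`iloc[1:]` split are illustrative guesses, not verified code:

df.head(1).to_sql('testaccounts', engine, schema='custom', index=False, if_exists='replace')  # let pandas create the table from the first row
meta = MetaData(bind=engine, schema='custom')
table = Table('testaccounts', meta, autoload=True)  # reflect the columns that to_sql just created
odo(df.iloc[1:], table)  # bulk-load the remaining rows with odo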

0 Answers