I am using PostgreSQL, where on a daily basis I have to UPDATE the values of selected columns. My source data is in a pandas DataFrame (`df`). Since there are more than 1000 rows in `df`, I am using the `df.to_sql()` method via SQLAlchemy:
engine = sqlalchemy.create_engine(...)
df.to_sql('mytable', con=engine, if_exists='replace')
But there is a problem: every time I run the snippet above, it actually DROPs the existing table in the DB and then CREATEs a new one with the column names taken from `df`. That means the column names are getting changed! Besides, the total number of columns in `df` is smaller than that of `mytable` in the DB.
I have gone through this SO thread, but that is not the sort of solution I am looking for.
Is there any other way to tackle this situation? Or is there any way to UPSERT row-wise values into selected columns of a database table (e.g., with `for index, row in df.iterrows():`)?
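To make the intent concrete, here is a minimal sketch of the row-wise upsert I have in mind. It uses the stdlib `sqlite3` module purely as a stand-in for the PostgreSQL connection (PostgreSQL 9.5+ supports the same `INSERT ... ON CONFLICT` syntax); the table and column names are made up:

```python
import sqlite3

# Stand-in for the PostgreSQL connection; the ON CONFLICT clause below
# works the same way in PostgreSQL (9.5+).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE mytable (id INTEGER PRIMARY KEY, colA TEXT, colB TEXT, colC TEXT)"
)
conn.execute("INSERT INTO mytable VALUES (1, 'old_a', 'old_b', 'keep_me')")

# Rows coming from the DataFrame; in real code this would be df.iterrows(),
# and only the columns actually present in df (here colA, colB but NOT colC).
rows = [
    {"id": 1, "colA": "new_a", "colB": "new_b"},
    {"id": 2, "colA": "a2", "colB": "b2"},
]

# Insert each row; on a primary-key conflict, update ONLY the listed columns,
# leaving the other columns of the existing row untouched.
upsert = """
    INSERT INTO mytable (id, colA, colB)
    VALUES (:id, :colA, :colB)
    ON CONFLICT (id) DO UPDATE SET
        colA = excluded.colA,
        colB = excluded.colB
"""
for row in rows:
    conn.execute(upsert, row)
conn.commit()

print(conn.execute("SELECT * FROM mytable ORDER BY id").fetchall())
# Row id=1 keeps colC ('keep_me') but gets the new colA/colB; id=2 is inserted.
```

The key point is that only the columns present in `df` appear in the `SET` list, so the extra columns in the DB table are preserved rather than dropped.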