
I have a table in a Postgres database with some columns and values. I imported this table into local memory, performed some computation on the columns, and now have a data frame with new values. I want to write this updated data frame back to the same table in the database.

library(RPostgreSQL)

drv <- dbDriver("PostgreSQL")

con <- dbConnect(drv, host = "*****", port = "****",
                 dbname = "sai", user = "sai", password = "password")

# read the table into a local data frame
saix_account_demo <- dbReadTable(con = con, name = "saix_account")

...

# write the modified data frame back to the same table
dbWriteTable(con, name = "saix_account", value = saix_account_demo,
             row.names = FALSE, append = TRUE)

I called dbWriteTable() with append = TRUE and overwrite = FALSE, but I get an error saying the primary key constraint is violated. I understand the problem: I was trying to insert new rows instead of updating existing ones.

  • It's easier to help you if you include a simple [reproducible example](https://stackoverflow.com/questions/5963269/how-to-make-a-great-r-reproducible-example) with sample input and desired output that can be used to test and verify possible solutions. What code did you try exactly? – MrFlick Apr 01 '19 at 18:43
  • Hi MrFlick, I have updated the post and included the read and write code. Basically, I need to update a database table instead of writing to it, using values from an in-memory data frame that has the same structure as the table in the database. I could do this by writing every value with an UPDATE query via sqldf, but I need a bulk update: replace all the values in the database table that do not match the data frame with my data frame values. – Rishi Reddy Apr 01 '19 at 19:10
  • Use a temp staging Postgres table that R dumps the data frame into, then run your `UPDATE` statement against the final table. – Parfait Apr 01 '19 at 19:15
  • Thank you Parfait, got the solution. I used a temp staging table in the database, used dbSendQuery() in R to move the columns I needed to update from the temp table into the original table, and dropped the temp table immediately after the update (see the sketch below). I'm happy for the time being, but as I scale up the data, could you suggest a more efficient process for this use case? Thank you – Rishi Reddy Apr 02 '19 at 16:21
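
For reference, here is a minimal sketch of the staging-table approach described in the comments above, using the same RPostgreSQL connection as the question. The staging table name (saix_account_stage), the key column (id), and the value columns (col1, col2) are placeholders for illustration; substitute the real primary key and the columns you recomputed:

library(RPostgreSQL)

con <- dbConnect(dbDriver("PostgreSQL"), host = "*****", port = "****",
                 dbname = "sai", user = "sai", password = "password")

# 1. Dump the updated data frame into a staging table
#    ("saix_account_stage" is a placeholder name).
dbWriteTable(con, name = "saix_account_stage", value = saix_account_demo,
             row.names = FALSE, overwrite = TRUE)

# 2. Bulk-update the target table from the staging table, joining on
#    the primary key ("id", "col1", "col2" are placeholder columns).
dbSendQuery(con, "
    UPDATE saix_account AS t
    SET    col1 = s.col1,
           col2 = s.col2
    FROM   saix_account_stage AS s
    WHERE  t.id = s.id;
")

# 3. Drop the staging table once the update is applied.
dbSendQuery(con, "DROP TABLE saix_account_stage;")

dbDisconnect(con)

Because the join and the column assignments happen inside a single UPDATE statement on the server, this costs one bulk insert plus one query, rather than one round trip per row.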

0 Answers