I have an API built with R plumber that connects to a PostgreSQL database using RPostgreSQL and pool (the same issue would apply if I were using a Shiny app):
library(pool)
library(RPostgreSQL)
library(plumber)

# create the connection pool
pool <- dbPool(
  drv = PostgreSQL(),
  host = Sys.getenv("DB_HOST"),
  port = 5432,
  dbname = "db",
  user = Sys.getenv("DB_USER"),
  password = Sys.getenv("DB_PASSWORD")
)

# build the router from the endpoint definitions
pr <- plumb("plumber.R")

# when the API stops, close the pool
pr$registerHooks(
  list("exit" = function() { poolClose(pool) })
)

# start the API
pr$run()
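The endpoints in plumber.R then query through the pool, roughly like this (simplified; the /users endpoint is just an illustrative example):

# plumber.R (simplified)
library(dplyr)

#* Return all users
#* @get /users
function() {
  # `pool` is created in the launcher script before plumb() is called
  tbl(pool, "users") %>% collect()
}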
I want to import new data every day. The easiest way is to create a new database and promote it to production:
CREATE DATABASE db_new;
-- create the tables
-- bulk-insert the data
SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'db';
DROP DATABASE db;
ALTER DATABASE db_new RENAME TO db;
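For completeness, the nightly swap could be driven from R over a separate administrative connection, along these lines (a sketch only; the maintenance database name and the import step are placeholders):

library(DBI)
library(RPostgreSQL)

# connect to a maintenance database, not "db", since "db" is about to be dropped
admin <- dbConnect(
  PostgreSQL(),
  host = Sys.getenv("DB_HOST"),
  port = 5432,
  dbname = "postgres",  # placeholder maintenance database
  user = Sys.getenv("DB_USER"),
  password = Sys.getenv("DB_PASSWORD")
)

dbExecute(admin, "CREATE DATABASE db_new")
# ... create the tables and bulk-insert the data into db_new ...
dbGetQuery(admin, "SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'db'")
dbExecute(admin, "DROP DATABASE db")
dbExecute(admin, "ALTER DATABASE db_new RENAME TO db")

dbDisconnect(admin)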
This is fast and minimizes downtime. The problem is that pool then loses its connection to the database and doesn't automatically attempt to reconnect:
> tbl(pool, "users")
Error in postgresqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not Retrieve the result : FATAL: terminating connection due to administrator command
server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
)
Even if I weren't replacing the database every day, DB servers occasionally restart, and that would also break my app. Reconnection doesn't seem to be a feature of pool, RPostgreSQL, or DBI. Does anyone know a way to deal with this problem?
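The only workaround I can think of is to wrap every query in tryCatch and rebuild the pool when a query fails, along these lines (a rough sketch; safe_query is just a hypothetical helper, not something pool provides):

# continuing from the setup above
safe_query <- function(statement) {
  tryCatch(
    dbGetQuery(pool, statement),
    error = function(e) {
      # assume the pool's connections are dead: rebuild the pool and retry once
      try(poolClose(pool), silent = TRUE)
      pool <<- dbPool(
        drv = PostgreSQL(),
        host = Sys.getenv("DB_HOST"),
        port = 5432,
        dbname = "db",
        user = Sys.getenv("DB_USER"),
        password = Sys.getenv("DB_PASSWORD")
      )
      dbGetQuery(pool, statement)
    }
  )
}

# e.g. safe_query("SELECT * FROM users")

But that means touching every endpoint, and it still feels like papering over the problem, so I'm hoping there's a cleaner way.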