I have a question similar to this Stack Overflow post.
How can I create a persistent table from a SQL query in a database (I use a DB2 database)? My goal is to take a table from one schema and permanently store a more or less modified version of it in another schema.
What works so far is to pull the data into R and subsequently create a table in a different schema:
dplyr::tbl(con, in_schema("SCHEMA_A", "TABLE")) %>%
collect() %>%
DBI::dbWriteTable(con, Id(schema = "SCHEMA_B", table = "NEW_TABLE"), ., overwrite = TRUE)
However, I'd like to use the compute() function in the dplyr pipeline so that I do not have to pull the data into R at all, that is, I'd like to keep the data on the database. As a side note: I also do not know how I would replace DBI's dbWriteTable() with dplyr's copy_to(); being able to do that would help me as well.
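My naive attempt would be something like the following, although I have not verified that copy_to() accepts a schema-qualified name this way:

dplyr::tbl(con, dbplyr::in_schema("SCHEMA_A", "TABLE")) %>%
  dplyr::collect() %>%
  # assumption: copy_to() accepts an in_schema() name for the new table
  dplyr::copy_to(con, ., name = dbplyr::in_schema("SCHEMA_B", "NEW_TABLE"),
                 temporary = FALSE, overwrite = TRUE)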
Unfortunately, I have not been able to make it work, even after reading ?compute and its online documentation. The following code does not work and results in an error:
dplyr::tbl(con, dbplyr::in_schema("SCHEMA_A", "TABLE")) %>%
  dplyr::compute(dbplyr::in_schema("SCHEMA_B", "NEW_TABLE"), analyze = FALSE, temporary = FALSE)
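For completeness, one workaround I have considered is to leave the dplyr pipeline entirely: render the lazy query's SQL and wrap it in DB2's CREATE TABLE ... AS syntax myself. This is an untested sketch, reusing the schema and table names from my example above:

# Untested: render the lazy query to SQL, then wrap it in a
# DB2 CREATE TABLE ... AS (...) WITH DATA statement.
query_sql <- dplyr::tbl(con, dbplyr::in_schema("SCHEMA_A", "TABLE")) %>%
  dbplyr::sql_render()
DBI::dbExecute(con, paste0(
  "CREATE TABLE SCHEMA_B.NEW_TABLE AS (", query_sql, ") WITH DATA"
))

But I would prefer a solution that stays within dplyr.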
Is there a solution for using compute() this way, or some other approach that works within a dplyr pipeline?