I have several large R data.frames that I would like to put into a local duckdb database. The problem I am having is that duckdb seems to load everything into memory even though I am specifying a file as the location.
Also, it isn't clear to me what the correct way to establish a connection is (so I'm not sure whether that has something to do with it). I have tried:
library(duckdb)   # also attaches DBI
duckdrv <- duckdb(dbdir = "dt.db", read_only = FALSE)
dkCon <- dbConnect(drv = duckdrv)
and also:
duckdrv <- duckdb()
dkCon <- dbConnect(drv = duckdrv, dbdir = "dt.db", read_only = FALSE)
Both work fine, meaning I can create tables, use dbWriteTable, run queries, etc. However, memory usage is very high (roughly the same size as the data.frames). I think I read somewhere that duckdb defaults to using a certain percentage of the available memory, which won't work for me because the system I am using is a shared resource. I also want to run some queries in parallel, which will drive memory usage even higher.
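For reference, this is roughly how I am writing and querying the data (a sketch; my_df and my_table are just placeholder names for one of my data.frames and its target table):

# write one of the large data.frames into the file-backed database
dbWriteTable(dkCon, "my_table", my_df, overwrite = TRUE)
# example query against the new table
res <- dbGetQuery(dkCon, "SELECT count(*) FROM my_table")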
I have tried this:
dbExecute(dkCon, "PRAGMA memory_limit='1GB';")
but that doesn't seem to make a difference, even if I close the connection, shut down the instance, and reconnect.
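In case it matters, this is how I have been reading the setting back to see whether it sticks (assuming current_setting() is available in the DuckDB version bundled with the R package):

# returns the currently configured memory limit
dbGetQuery(dkCon, "SELECT current_setting('memory_limit') AS memory_limit")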
Does anyone know how I can fix this problem? RSQLite also has temporarily high memory usage when I am writing data to a table, but then it goes back to normal, and if I open a read-only connection it isn't an issue at all. I would like to get duckdb working because I think the queries are supposed to be much faster. Any help would be appreciated!