I have a .txt file that is roughly 28 GB and about 100 million rows. Originally we used Python with a mix of Pandas and sqlite3 to load this data into a .db file that we could query against. My team is more familiar with R, though, so we want to work with the .db file from R instead, and reading the table in hits a memory limit. Is there a workaround for the error below? Is it possible to partially load some of the data into R?
library(RSQLite)

filename <- "DB NAME.db"
sqlite.driver <- dbDriver("SQLite")
db <- dbConnect(sqlite.driver, dbname = filename)

## List the tables, then try to read one in full
dbListTables(db)
mytable <- dbReadTable(db, "TABLE NAME")  # fails here: reads the entire table into one data frame
Error: vector memory exhausted (limit reached?)
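
For context, this is the kind of partial loading I have in mind (a rough sketch using the DBI functions that RSQLite exposes; the table name is the same placeholder as above, and the LIMIT/chunk sizes are arbitrary values I would need to tune):

library(RSQLite)

db <- dbConnect(RSQLite::SQLite(), dbname = "DB NAME.db")

## Option 1: pull only a slice of the table with a SQL LIMIT
first_rows <- dbGetQuery(db, 'SELECT * FROM "TABLE NAME" LIMIT 100000')

## Option 2: stream the whole table in fixed-size chunks
res <- dbSendQuery(db, 'SELECT * FROM "TABLE NAME"')
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 1e6)  # chunk size is a guess; tune to available memory
  ## ... process `chunk` here, keeping only what fits in memory ...
}
dbClearResult(res)
dbDisconnect(db)

Would something along these lines be the right approach, or is there a better-supported way?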