I'm trying to open a rather large file in R to do analyses on its data. Currently, what I have is:
x = data.table::fread("file_name_here")
When I run this line, I get:
"Error: cannot allocate vector of size 2.6 Mb".
When I run the line without assigning it to a variable:
data.table::fread("file_name_here")
I instead get:
"Error in writeBin(bfr, con = out, size = 1L) :
'R_Calloc' could not allocate memory (10000000 of 1 bytes)".
I have tried doing:
Sys.setenv("VROOM_CONNECTION_SIZE" = 500000000)
But it didn't fix anything. I also cannot use memory.limit(), because it says it's no longer supported. Are there any other ways I can open this large file in R? Note that I need to analyze all of its contents, so I cannot trim down the file. Also, I think I need to stick with fread, because reading the file in other ways mangled the formatting of its contents (and testing on a smaller but similar file, I confirmed that fread keeps the formatting consistent and correct).
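For reference, one workaround I am considering (but unsure about) is reading the file in chunks with fread's skip/nrows arguments and keeping only per-chunk aggregates in memory. This is just a sketch; the chunk size is a placeholder and "file_name_here" stands in for my actual file:

```r
library(data.table)

chunk_size <- 1e6L        # rows per chunk -- placeholder, tune to available memory
file_path  <- "file_name_here"

# Read only the header line so later chunks can reuse the column names
col_names <- names(fread(file_path, nrows = 0L))

i <- 0L
repeat {
  # skip the header plus all rows already read; an error here usually
  # means skip ran past end-of-file, so treat it as an empty chunk
  dt <- tryCatch(
    fread(file_path,
          skip      = 1L + i * chunk_size,
          nrows     = chunk_size,
          header    = FALSE,
          col.names = col_names),
    error = function(e) data.table()
  )
  if (nrow(dt) == 0L) break

  # ... analyze this chunk here, keeping only small summaries ...

  i <- i + 1L
}
```

I don't know whether re-scanning the file for each chunk is acceptable performance-wise, or whether this would preserve the formatting the way a single fread call does.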