
I am trying to read a 4 GB .txt file with the `fread` function:

library(data.table)
mydata <- fread("myfile.txt")

But after reading I get the following error:

Error: cannot allocate vector of size 193.8 Mb
In addition: Warning messages:
1: In lapply(globals, function(name) { :
  Reached total allocation of 4095Mb: see help(memory.size)
2: In lapply(globals, function(name) { :
  Reached total allocation of 4095Mb: see help(memory.size)

Could anyone explain what this means and what I should do to avoid this error, please?

Thank you!

user45415631

1 Answer


The error messages give you the direct cause: you are trying to read a 30 GB file into 4 GB of RAM. The expensive solution is to upgrade your machine to 32 GB of RAM.

Unfortunately, R keeps the entire environment in RAM at all times.
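
One way to see this for yourself is to list the objects in the current session by size (a small illustration using only base R functions):

sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)  # bytes per object
gc()  # report memory usage and trigger garbage collection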

The less expensive solution is to process the dataset in chunks.
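
A minimal sketch of that approach using fread's `skip` and `nrows` arguments (assuming the file has a header row, that each chunk can be reduced to a small result, and, for simplicity, that the total row count is not an exact multiple of the chunk size):

library(data.table)

chunk_size <- 1e6  # rows per chunk; tune so one chunk fits comfortably in RAM
col_names <- names(fread("myfile.txt", nrows = 0))  # read only the header line
skip <- 1  # lines already consumed (the header)

repeat {
  chunk <- fread("myfile.txt", skip = skip, nrows = chunk_size,
                 header = FALSE, col.names = col_names)
  # ... process/aggregate the chunk here, keeping only the small result ...
  n <- nrow(chunk)
  rm(chunk); gc()  # free the chunk before reading the next one
  if (n < chunk_size) break  # the last, partial chunk has been read
  skip <- skip + chunk_size
}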

You will find some help here and also in this article.

kdopen
  • Sorry, it's not a 30 GB file but around 4 GB, which is a little bigger than my memory. Do you know any other fast methods of reading big files? Reading compressed files, maybe? – user45415631 Feb 17 '15 at 01:33
  • No, they've all got to be uncompressed eventually. – kdopen Feb 17 '15 at 01:38
  • Yes, you need to read and process it in smaller pieces. But you also need to use `rm()` to free up memory: if you read five 2 GB chunks without deleting any of them, you are still trying to fit 10 GB into RAM. – kdopen Feb 17 '15 at 14:08