
I have a text file containing about 2.86 billion records that I need to import into an R data frame. My server has 64 GB of RAM and 48 cores. When I read the file with the `read.table` command, about 25 GB of RAM was in use, but before it finished an error occurred:

 "runs out of space"

I think this is due to some data frame limit. Can an R data frame hold 2.86 billion rows, or is there a limit problem? If yes, how can I import and analyze such data? The input file is about 18 GB and contains a single column of integer values. I have found a solution that gives some options, but the data there is much smaller than in my case.
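For reference, a minimal sketch of the call I am currently using (the file name and column name are placeholders):

```r
## 2.86e9 rows is more than .Machine$integer.max (2^31 - 1 = 2,147,483,647),
## which is why I suspect a data-frame size limit.

## Minimal sketch of the current call; the file name and column name are
## placeholders. colClasses is set explicitly so read.table does not have
## to guess the column type for every row.
dat <- read.table("values.txt",
                  header     = FALSE,
                  colClasses = "integer",
                  col.names  = "value")
```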

– Hafiz Muhammad Shafiq
  • try the fread function in data.table package. – tushaR Mar 20 '17 at 06:46
  • I like the `readr` package and read files chunk wise. Maybe this helps if you don't need the complete data in memory. – drmariod Mar 20 '17 at 07:02
  • 1
    You may find [Row limit for data.table in R using fread](http://stackoverflow.com/q/17596249/3817004) useful. – Uwe Mar 20 '17 at 07:50
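Following up on the `fread` and chunk-wise `readr` suggestions in the comments, a rough sketch of a chunked pass with `readr` (file name, column name, and chunk size are illustrative), which keeps only per-chunk summaries in memory rather than the full 2.86 billion values:

```r
library(readr)

## Stream the file in chunks of 1 million rows; the callback reduces each
## chunk to a one-row summary, so memory use stays bounded regardless of
## the total number of rows.
chunk_stats <- read_csv_chunked(
  "values.txt",
  callback   = DataFrameCallback$new(function(chunk, pos) {
    data.frame(n   = nrow(chunk),
               sum = sum(as.numeric(chunk$value)),
               min = min(chunk$value),
               max = max(chunk$value))
  }),
  chunk_size = 1e6,
  col_names  = "value",
  col_types  = cols(value = col_integer())
)

## Combine the per-chunk results into overall statistics.
n_total   <- sum(chunk_stats$n)
mean_all  <- sum(chunk_stats$sum) / n_total
range_all <- c(min(chunk_stats$min), max(chunk_stats$max))
```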

0 Answers