I have a text file containing about 2.86 billion records that I need to import into an R data frame. My server has 64 GB of RAM and 48 cores. When I read the file via the read.table command, about 25 GB of RAM was in use, but before it finished I got an error:
"runs out of space"
I think this is due to some data frame limit. Can a data frame hold 2.86 billion rows, or is there a limit? If it can, how should I import and analyze data of this size? The input file is about 18 GB and contains a single column (an integer value). I found a solution that offered some options, but the data in that case was much smaller than mine.