I have a dataset with 20 million records and 50 columns that I want to load into R. My machine has 8 GB of RAM, but the dataset is 35 GB on disk, and I have to run my R code on the complete data. So far I have tried data.table (fread) and bigmemory (read.big.matrix) to read it, but without success.
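Roughly what I tried is sketched below; mydata.csv stands in for my real file, and the exact arguments may have differed slightly:

    # Attempt 1: read the whole file into memory with data.table
    library(data.table)
    dt <- fread("mydata.csv")        # runs out of memory on the 35 GB file

    # Attempt 2: file-backed matrix with bigmemory
    library(bigmemory)
    bm <- read.big.matrix("mydata.csv", sep = ",", header = TRUE,
                          type = "double",
                          backingfile = "mydata.bin",
                          descriptorfile = "mydata.desc")   # also did not complete for me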
Is it possible to load 35 GB of data on an 8 GB machine? If so, could you please guide me on how to overcome this issue?
Thanks in advance.