I have a very large data frame that I want to load and process in R on Windows. It has more than 30k columns (features) and 50k rows. I run out of memory when loading it with
fm <- read.table(<filename>, sep=",")
Is there a way to load such a large data frame in R?
I tried converting it to a sparse matrix, but even that throws an error. Also, sparse matrices are not supported by the tree-learning algorithm I want to use.
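For reference, the sparse conversion I attempted looks roughly like this, using the Matrix package (the tiny data frame here is just a stand-in for my real 50k x 30k data):

```r
library(Matrix)

# Small stand-in for the real data frame loaded with read.table
fm <- data.frame(a = c(0, 1, 0), b = c(2, 0, 0), c = c(0, 0, 3))

# Convert the dense data frame to a sparse matrix representation;
# as.matrix() first materializes the full dense matrix, which is
# presumably where my memory problem comes from on the real data
sm <- Matrix(as.matrix(fm), sparse = TRUE)
```

On the real data this conversion fails, I assume because `as.matrix()` still needs the full dense copy in memory before the sparse structure is built.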
What is the suggested way to work with large data frames in R?