
I have a very large data frame that I want to load and process with R on Windows. It contains more than 30k features (columns) and 50k rows. I run into memory issues when loading such a large dataset with

fm <- read.table(<filename>, sep=",")

Is there a way to load such a large data frame in R?

I tried converting it to a sparse matrix, but even that throws an error. Also, sparse matrices are not supported by the tree-learning algorithm.

What is the suggested way to work with large data frames in R?

cosmos
  • Similar question? [Quickly reading very large tables as dataframes in R](http://stackoverflow.com/questions/1727772/quickly-reading-very-large-tables-as-dataframes-in-r) – zx8754 Dec 04 '13 at 13:58 (see the fread sketch after these comments)
  • (1) Use a 64-bit OS and buy (lots) more RAM; (2) load the data into a database (SQLite might be the simplest) and work with smaller pieces of it in R – joran Dec 04 '13 at 15:21 (see the RSQLite sketch below)
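
A minimal sketch of the fread route from the question zx8754 links, assuming the data.table package is installed and the data sit in one comma-separated file (features.csv is a placeholder name):

library(data.table)

# fread memory-maps the file and samples it to guess column types,
# so it is far faster and more memory-frugal than read.table.
fm <- fread("features.csv", sep = ",")

# If only a subset of the 30k features is needed, fread's select
# argument avoids ever materialising the rest, e.g.:
# fm <- fread("features.csv", sep = ",", select = c("f1", "f2"))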
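
And a minimal sketch of joran's option (2), assuming the DBI and RSQLite packages and RSQLite's file-import form of dbWriteTable; features.csv, features.db, and the table name are placeholders:

library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), "features.db")

# RSQLite's dbWriteTable can import a delimited file straight into a
# table, so the full dataset never has to fit in R's memory at once.
dbWriteTable(con, "features", "features.csv", header = TRUE, sep = ",")

# Work with smaller pieces: fetch only the rows needed right now.
chunk <- dbGetQuery(con, "SELECT * FROM features LIMIT 5000")

dbDisconnect(con)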

0 Answers