Possible Duplicate: Trimming a huge (3.5 GB) csv file to read into R
Does R have a good way of transparently dealing with data that does not fit into memory? There are a few packages for handling big data, but I don't want to commit to deploying one without understanding what its interface actually looks like.
For example, I might have a collection of records that together do not fit into memory. However, if I load a subset, it is represented by a very simple data frame and I can do all sorts of useful selections and aggregations on that data. Is there some sort of package that would allow me to treat the whole collection as a single data frame and perform the same operations on it transparently?
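
To make this concrete, here is roughly the workflow I have in mind, sketched in base R on an in-memory subset; the file name and the year/region/amount columns are just placeholders:

    # Load a manageable subset of the records into an ordinary data frame.
    subset_df <- read.csv("records_part1.csv")

    # Typical selections and aggregations that work fine on the subset:
    recent <- subset(subset_df, year >= 2010)
    totals <- aggregate(amount ~ region, data = subset_df, FUN = sum)

    # What I am looking for is a package that would give me some object,
    # say `all_records`, backed by the full on-disk collection, so that
    # the same subset()/aggregate()-style operations run on it
    # transparently without loading everything into memory at once.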