
I have a file of approximately 40 GB and I need to perform some analytics on it, but my RAM is limited to 4 GB. Any suggestions on how to proceed?

  • I've voted to close this question because it is too broad. You will get a better answer if you show what you have tried and why it didn't work. – Mark Lakata Nov 22 '16 at 17:54
  • What kind of analytics are you trying to do? Maybe you could process it in chunks (see the sketch after these comments)? – William Nov 22 '16 at 18:25
  • see the "Large Memory and Out-of-Memory Data" section of the [High-Performance Computing Task View](https://cran.r-project.org/web/views/HighPerformanceComputing.html) – Ben Bolker Nov 22 '16 at 19:57
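
Picking up the chunking suggestion from the comments: a minimal base-R sketch, assuming a comma-separated file named `big.csv` with a numeric column called `value` (both names are placeholders for your data). It reads 100,000 lines at a time and keeps only running totals in memory:

```r
con <- file("big.csv", open = "r")
# read the header line once so the column names can be reused for every chunk
header <- strsplit(readLines(con, n = 1), ",", fixed = TRUE)[[1]]

total <- 0
n <- 0
repeat {
  lines <- readLines(con, n = 100000)            # next 100,000 lines
  if (length(lines) == 0) break                  # end of file
  chunk <- read.csv(text = lines, header = FALSE, col.names = header)
  total <- total + sum(chunk$value, na.rm = TRUE)
  n <- n + sum(!is.na(chunk$value))
}
close(con)
cat("mean of 'value':", total / n, "\n")
```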

2 Answers

Check out the LaF package. It allows you to do column selection and filtering on files that would not otherwise fit into memory.

Documentation and examples are a little thin on the ground, which is a shame given that the syntax is somewhat idiosyncratic. This may help:

https://stackoverflow.com/a/24716798/1427069
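
As a concrete starting point, a minimal sketch using LaF, assuming the same hypothetical `big.csv` with an integer `id` column and a numeric `value` column (adjust `column_types`, `column_names` and the separator to match your file). LaF only reads the requested blocks, never the whole file:

```r
library(LaF)

laf <- laf_open_csv("big.csv",
                    column_types = c("integer", "double"),
                    column_names = c("id", "value"),
                    sep = ",",
                    skip = 1)                    # skip the header row

total <- 0
n <- 0
repeat {
  block <- next_block(laf, nrows = 100000)       # next 100,000 rows as a data.frame
  if (nrow(block) == 0) break                    # zero rows signals end of file
  total <- total + sum(block$value, na.rm = TRUE)
  n <- n + sum(!is.na(block$value))
}
cat("mean of 'value':", total / n, "\n")
```

Column selection works through indexing on the `laf` object, e.g. `laf[, "value"]`, which reads just that column from disk.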

– Jacob

Increase your virtual memory: increase the size of the paging file so the operating system can use disk space as additional memory.
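
A hedged note: on the Windows builds of R that were current at the time, `memory.limit()` could report or raise R's memory ceiling once the paging file had been enlarged (the function is Windows-only and has since been made defunct in R 4.2+):

```r
# Windows-only, defunct in R >= 4.2; sizes are in MB.
memory.limit()              # report the current limit
memory.limit(size = 16000)  # raise the limit to ~16 GB, backed by the paging file
```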

– Taufiq Rahman