For example, if I have a 2 GB CSV file, how much free RAM is needed to process it in R (specifically with data.table)? Conversely, if I have 3 GB of available memory, what is the largest dataset I could process on this computer? I couldn't find any calculation method for this; based on your experience, how do you calculate it?
Asked by Nick, viewed 52 times
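One common way to get a ballpark answer (not from this thread, just a general technique; the file name and total row count below are placeholders) is to read a small sample of rows with data.table::fread(), measure its in-memory size with object.size(), and extrapolate to the full file:

    library(data.table)

    # Read a fixed number of rows as a sample (file name is hypothetical).
    sample_rows <- 10000L
    dt_sample <- fread("big_file.csv", nrows = sample_rows)

    # Estimated in-memory bytes per row, based on the sample.
    bytes_per_row <- as.numeric(object.size(dt_sample)) / sample_rows

    # Total rows in the file; assumed known or counted separately,
    # e.g. with `wc -l big_file.csv` minus the header line.
    total_rows <- 50e6

    # Rough estimate of the loaded table's size in GB.
    bytes_per_row * total_rows / 1024^3

The estimate is rough: character columns, type detection on a small sample, and intermediate copies made by later operations can all shift the real peak usage, which is why rules of thumb like the 3x factor in the comments below add headroom on top of the loaded size.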
- That's an implementation issue. For example, Revolution R's implementation isn't restricted by the available RAM. – Panagiotis Kanavos Sep 22 '15 at 12:22
- In my experience, it is too hard to calculate this kind of thing. – Stephen C Sep 22 '15 at 12:22
- Columns and rows also have different overhead, which means that the same amount of data will take different amounts of RAM for long, narrow tables vs. wider, shorter ones. – Panagiotis Kanavos Sep 22 '15 at 12:24 (a short demonstration of this follows the comments)
- I'd say a good rule of thumb is that you need about 3 times the dataset size in RAM. This will enable you to load it and perform operations and analyses on it. – Paul Hiemstra Sep 22 '15 at 12:25 (a worked application of this rule follows the comments)
- @PanagiotisKanavos Thanks! Is it roughly true that a wider (shorter) table takes more memory than a narrower (longer) one? – Nick Sep 22 '15 at 13:01
- @PaulHiemstra I've read your answers to other linked questions - quite informative. Thank you. – Nick Sep 22 '15 at 13:04
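To illustrate the per-column overhead mentioned above, here is a minimal sketch (the table shapes are chosen arbitrarily, and exact byte counts vary by R version and platform). Both tables hold the same one million numeric values, but the wide one pays a fixed header cost for each of its 10,000 columns:

    library(data.table)

    n_cells <- 1e6

    # Long and narrow: 1,000,000 rows x 1 column.
    long_dt <- data.table(x = numeric(n_cells))

    # Wide and short: 100 rows x 10,000 columns.
    wide_dt <- as.data.table(matrix(numeric(n_cells), nrow = 100, ncol = 10000))

    # Same payload (~8 MB of doubles), but the wide table adds a per-column
    # vector header plus a 10,000-element column-name vector on top.
    object.size(long_dt)
    object.size(wide_dt)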
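And a worked application of the 3x rule of thumb from the comments, using the sizes from the question (the factor is only a heuristic, and on-disk CSV size is not the same as in-memory size, so combine it with a sampling estimate like the one above):

    rule_of_thumb <- 3   # load the data plus room for working copies

    # Forward: free RAM suggested for a 2 GB dataset.
    2 * rule_of_thumb    # roughly 6 GB

    # Reverse: with 3 GB of free RAM, a comfortable dataset size.
    3 / rule_of_thumb    # roughly 1 GB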