
I am trying to deal with memory limitations in R. I was running code that generates monthly data output. It was all going fine: R saved every monthly CSV file to disk as expected, but the console appeared frozen (even though the code ran to completion). When I restarted the machine it would not boot, and I had to wipe everything and reinstall Windows. I installed the current version of R (4.2.1), and now my code no longer runs because of a memory limitation. I get the error message below.
```
Error: cannot allocate vector of size 44.2 Gb
```
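This error means R failed to allocate a *single* object of 44.2 GB, regardless of how much memory was free overall. As a rough sanity check (the arithmetic here is illustrative, not taken from the original code), a double takes 8 bytes, so an allocation that size corresponds to a vector of several billion elements:

```r
# A 44.2 GB allocation of doubles (8 bytes each) implies roughly
# 44.2e9 / 8 elements in one vector:
n <- 44.2e9 / 8
print(n)  # about 5.5 billion doubles

# object.size() confirms the 8-bytes-per-element rule on a small vector:
x <- numeric(1e6)
print(object.size(x))  # a little over 8 MB
```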

I tried increasing the memory with memory.limit(), as I did before, but it seems this is no longer supported in R 4.2: calling it just gives the warning "'memory.limit()' is no longer supported".

How to deal with this?
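Since the question does not include the original code, here is only a generic sketch of the usual workaround: generate and write one month at a time, dropping each month's data before building the next, rather than holding everything in one huge object. `make_month_data()` below is a hypothetical stand-in for whatever produces the monthly output.

```r
# Hypothetical sketch: process and write one month at a time,
# releasing memory between iterations.
make_month_data <- function(month) {
  # placeholder for the real (large) monthly computation
  data.frame(month = month, value = rnorm(10))
}

out_dir <- tempdir()
for (m in month.abb) {
  dat <- make_month_data(m)
  write.csv(dat, file.path(out_dir, paste0(m, ".csv")), row.names = FALSE)
  rm(dat)  # drop the reference to this month's data...
  gc()     # ...and let R return the memory where possible
}
```

Whether this helps depends on whether the 44.2 GB object can actually be split into independent monthly pieces.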

Sunderam Dubey
PaulaSpinola
  • You could probably improve your code to avoid needing that much memory, but as you do not show any code, I cannot help. – Robert Hijmans Jun 28 '22 at 18:37
  • It would be useful to see the structure of the data. It could be that a pivoted version of the data would be more efficient. Do you have some columns with highly replicated values? You can also consider reading and writing data frames as .parquet files using the `arrow` package instead of .csv. .parquet files take advantage of compression, can produce files a fraction of the size of a .csv, and read and write much faster. – Arthur Jun 28 '22 at 19:10
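The parquet suggestion in the last comment can be sketched as follows; this assumes the `arrow` package is installed (`install.packages("arrow")`) and uses made-up example data:

```r
library(arrow)

# Example data frame with a highly replicated column, where
# parquet's columnar compression pays off most.
df <- data.frame(id = 1:1000, grp = rep(c("a", "b"), 500))

f <- tempfile(fileext = ".parquet")
write_parquet(df, f)    # compressed columnar file, typically much smaller than CSV
df2 <- read_parquet(f)  # reads back as a data frame
stopifnot(identical(dim(df2), dim(df)))
```

Note that parquet helps with file size and I/O speed; a single 44.2 GB in-memory allocation would still need to be avoided, e.g. by chunking as above.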

0 Answers