
A few times while modifying large objects (~5 GB) on a Windows machine with 30 GB of RAM, I have been receiving the error Reached total allocation of 31249Mb: see help(memory.size). However, the process seems to complete, i.e. I get a file with what looks like the right values. Checking every part of a large output for exactly the right values by cutting it up and comparing each piece against the corresponding section of what I expect is time consuming, but when I have done it the returned objects appear consistent with my expectations.
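For reference, the spot check looks roughly like the sketch below (the file name, chunk size and the expected object are illustrative placeholders, not my actual pipeline):

    library(readr)
    library(dplyr)

    chunk_size <- 1e6
    # 'expected' is the in-memory object I believe was written out to output.csv;
    # supplying col_types to read_csv_chunked() may be needed so classes match.
    check_chunk <- function(chunk, pos) {
      reference <- slice(expected, pos:(pos + nrow(chunk) - 1))
      stopifnot(isTRUE(all.equal(as.data.frame(chunk), as.data.frame(reference),
                                 check.attributes = FALSE)))
    }
    read_csv_chunked("output.csv", SideEffectChunkCallback$new(check_chunk),
                     chunk_size = chunk_size)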

What risks/side effects can I expect from this error? What should I be checking? Is the process automatically recovering, given that I am getting back the returns I expect, or are the errors likely to be more subtle? My entire analysis process is written using the tidyverse; does that mean I can rely on good error handling from Hadley et al., and is that why my process warns but also completes?

N.B. I have not included any attempt at an MWE, as every machine will have different limits on available memory, though I am happy to be shown an MWE for this kind of process if there are suggestions.

DaveRGP

1 Answer


Use memory.limit(size = x), where x is the amount of memory, in MB, to make available to R.
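For example (the 40000 below is just an illustrative value; memory.size() and memory.limit() are Windows-only):

    memory.size()               # MB currently used by this R session
    memory.size(max = TRUE)     # peak MB obtained from the OS so far
    memory.limit()              # current allocation ceiling in MB
    memory.limit(size = 40000)  # raise the ceiling to ~40 GB (value in MB)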

See link for more details: Increasing (or decreasing) the memory available to R processes

Adam
  • Thanks, but I'm aware of this. My question is specifically about error checking if this error occurs but an apparently correctly processed object is returned. Further, my memory limit seems to be set correctly: approx. 30 GB == 31249 MB – DaveRGP Jun 29 '17 at 16:13
  • Sorry about that, I didn't see the second part of the question. Some functions expand the data greatly while reading and using it, which is what I believe happened here. The largest object I've ever used was 2.2 GB and I didn't get this problem. Would it make sense to break the process into chunks rather than error checking in chunks? – Adam Jun 29 '17 at 16:34