
How can I limit the amount of working memory R (R x64 3.4.1, via RStudio) has access to while executing a process on a Windows machine?

Normally, if I accidentally join two dataframes on the wrong keys or write too large a loop, I can easily interrupt the process with CTRL-SHIFT-F10, the stop button, Task Manager, etc.

Unfortunately, I'm currently working remotely and this is presenting a problem.

My employer uses Microsoft Remote Desktop Connection to establish an RDP session, and whenever my remote machine's memory utilization gets relatively high (i.e., 'very high'), the RDP session drops unexpectedly and I'm unable to remote back in to kill the R process — presumably because there is not enough working memory left to establish or maintain the RDP session itself.

As you can imagine, this is a huge bummer. Creating a trouble ticket with IT to physically restart my machine while I sit on the couch at home can take over an hour and, frankly, is getting a little embarrassing.

I've investigated via memory.limit() and it looks like R is set to utilize all ~16,000 MB of my machine's available memory. From what I've read here on Stack Overflow, it isn't possible to reduce the memory limit, only increase it.
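For reference, this is roughly what I see when I try (a minimal sketch; memory.limit() is Windows-only, and the exact cap reported will differ by machine):

```r
# Query the current memory cap, in MB (Windows-only function)
memory.limit()

# Attempt to lower the cap -- on R 3.4.1 a request below the current
# limit is rejected with a warning ("cannot decrease memory limit")
# and the limit is left unchanged
memory.limit(size = 8000)
```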

Is this true? If so, are there any other options besides being extra careful with memory management and having IT on speed-dial? Does Windows itself maybe provide a solution?

  • Have a look at https://stackoverflow.com/questions/23950132/how-to-set-memory-limit-in-rstudio-desktop-version and setting the environment variable R_MAX_MEM_SIZE; plus, if you are working with RStudio, also read the answers there – GWD Apr 10 '20 at 11:49
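The comment's R_MAX_MEM_SIZE suggestion would be applied before R starts, since the variable is only read at startup. A hedged sketch of one way to do that from a Windows command prompt (the 8000Mb value and the RStudio install path are examples, not recommendations):

```shell
:: Cap R's memory before launching it (Windows cmd).
:: R reads R_MAX_MEM_SIZE once at startup, so set it first;
:: setting it inside a running session has no effect.
set R_MAX_MEM_SIZE=8000Mb

:: Then start R/RStudio from this same shell so it inherits the variable.
:: The path below is a typical default install location -- adjust as needed.
"C:\Program Files\RStudio\bin\rstudio.exe"
```

Setting the variable via System Properties → Environment Variables would make it apply to every future R session rather than just this shell.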

0 Answers