I am running Docker on Windows 10 and created a container that does training using r-masked ngrams to classify domains as clean or malware. It runs perfectly on a dataset of 32k rows each for clean and malware (64k rows in total), but on another dataset with 370k rows I get "Unable to allocate vector of size 1.5Gb". Is there any way to increase the Docker memory limit? I'm using a computer with 8 GB of RAM.
- Does this answer your question? [Increasing (or decreasing) the memory available to R processes](https://stackoverflow.com/questions/1395229/increasing-or-decreasing-the-memory-available-to-r-processes) – mhovd May 13 '21 at 11:11
- It could be a number of things, including: Docker (not R) limits on memory/resources, or inefficient R code. The first is likely better suited for superuser.com or similar. The second would require an audit of your code. You might get away with it here on SO if the code is not egregious, but once the code block starts paging, it becomes a visual deterrent and flies in the face of ***minimal** reproducible example*, in which case perhaps [codereview.se] would be a better place. – r2evans May 13 '21 at 13:08
1 Answer
Although the information provided is too limited to make a firm diagnosis, it could be a good idea to try increasing the limit via `memory.limit`:

> memory.limit() reports or increases the limit in force on the total allocation

Even with the RAM limitations of your Windows PC, it is possible to extend the effective memory with a page file, without buying more RAM. See how to set one up: "Introduction to page files" https://learn.microsoft.com/en-us/windows/client-management/introduction-page-file

Then try, for example:

memory.limit(size = 32768)

(it could be a good idea to start your CMD/RStudio as admin).

Also see the help for this function:

?memory.limit
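Since the error occurs inside a Docker container, the container's own memory cap may be the binding constraint rather than R's limit. A sketch of the relevant Docker CLI flags, assuming a hypothetical image/container name (`my-r-image`); the sizes shown are illustrative, not taken from your setup:

```shell
# Show total memory visible to the Docker daemon/VM:
docker info --format '{{.MemTotal}}'

# Start the container with an explicit memory cap (values illustrative):
docker run --memory=6g --memory-swap=8g my-r-image

# Or raise the limit on an existing container:
docker update --memory=6g --memory-swap=8g my-r-container
```

Note that on Docker Desktop for Windows, per-container limits are themselves bounded by the memory assigned to the Docker VM under Settings → Resources, so that setting may also need to be raised.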

cigien
- I tried this but it wasn't working in Docker. I ran it on a computer with 32 GB of RAM using the function you mentioned. – Daniel Ago Jul 25 '21 at 13:20