I am using the sbrl() function from the sbrl package. It does the job of any supervised statistical learning algorithm: it takes data and generates a predictive model.
I run into a memory leak when using it:
- If I run the function in a loop, RAM usage grows steadily, even though I always assign the result to the same object.
- Eventually, the machine reaches its RAM limit and crashes.
- Calling gc() never helps. Only closing the R session releases the memory.
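If the leak is in native code, R's own accounting should stay roughly flat while OS-level usage grows. A quick way to check this with base R only (this is a diagnostic sketch of mine, not something from the sbrl documentation; memory.size() is Windows-only):

```r
# Diagnostic sketch: compare R's heap accounting with the process-level
# allocation. If the gc() numbers stay flat while memory.size() (or the
# OS memory monitor) keeps growing, the leak lives in native C/C++
# allocations that R's garbage collector cannot see.
g <- gc()
cat("R heap in use (Mb):", sum(g[, 2]), "\n")         # Ncells + Vcells, in Mb
cat("Process allocation (Mb):", memory.size(), "\n")  # Windows-only
```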
Below is a minimal reproducible example. Keep an eye on the system's memory monitor (e.g. Task Manager on Windows) while it runs.
Importantly, as far as I can tell, the sbrl() function calls C code and also makes use of Rcpp. I suspect this is related to the memory leak.
Would you know how to force memory to be released?
Configuration: Windows 10, R 3.5.0 (RStudio or R.exe)
install.packages("sbrl")
library(sbrl)
# Getting / prepping data
data("tictactoe")
# Looping over sbrl
for (i in 1:1e3) {
  rules <- sbrl(
    tdata = tictactoe, iters = 30000, pos_sign = "1",
    neg_sign = "0", rule_minlen = 1, rule_maxlen = 3,
    minsupport_pos = 0.10, minsupport_neg = 0.10,
    lambda = 10.0, eta = 1.0, alpha = c(1, 1), nchain = 20
  )
  invisible(gc())  # has no visible effect on the leak
  cat("Rules object size:", format(object.size(rules), units = "Mb"), "\n")
}
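For completeness, one workaround that should guarantee the memory comes back, at the cost of process-startup overhead, is to run each fit in a throwaway child R process with the callr package, so the operating system reclaims everything (including native allocations) when that process exits. This is a sketch under that assumption; fit_once() is my own wrapper name, and I have not verified it is the cleanest fix:

```r
# Sketch: isolate each sbrl() call in a fresh R process via callr::r(),
# so any memory leaked by the underlying C/C++ code dies with the child
# process. Assumes the callr and sbrl packages are installed.
library(callr)

fit_once <- function() {
  callr::r(function() {
    library(sbrl)
    data("tictactoe")
    sbrl(
      tdata = tictactoe, iters = 30000, pos_sign = "1",
      neg_sign = "0", rule_minlen = 1, rule_maxlen = 3,
      minsupport_pos = 0.10, minsupport_neg = 0.10,
      lambda = 10.0, eta = 1.0, alpha = c(1, 1), nchain = 20
    )  # return value is serialized back to the parent session
  })
}

for (i in 1:1e3) {
  rules <- fit_once()  # child process exits after each fit, freeing its RAM
}
```

The trade-off is that every iteration pays the cost of spawning a new R session and re-loading the package and data.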