I'm having memory issues when generating many plots and writing them to png/jpeg/eps devices.
require(ggplot2)
require(plyr)    # d_ply() below comes from plyr
...
render <- function(x) {
  fileName <- paste(chartDir, "/", x$PACKED[1], ".png", sep = "")
  x <- x[, c("EFF_DATE", "variable", "value")]
  png(fileName, width = 1920, height = 1000, units = "px")
  print(qplot(EFF_DATE, value, data = x, facets = variable ~ ., geom = "line"))
  dev.off()
}
d_ply(molten, "PACKED", render, .progress = "tk")
The code progresses nicely for the first ~80 plots and then blows up, consuming close to 100% of RAM within a very short time. I've checked the sizes of the x data frames supplied to qplot and they're all roughly the same, so it isn't the data. The code runs fine when I comment out the png() call, and I get the same issue when I use ggsave() from ggplot2 instead.
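For what it's worth, here is an untested variant I plan to try next: it guarantees the device is closed even if print() fails, and forces a garbage collection after each plot. I'm not claiming it addresses the underlying leak.

# Untested sketch: defensive device handling plus an explicit gc() between plots.
render <- function(x) {
  fileName <- paste(chartDir, "/", x$PACKED[1], ".png", sep = "")
  x <- x[, c("EFF_DATE", "variable", "value")]
  png(fileName, width = 1920, height = 1000, units = "px")
  on.exit(dev.off(), add = TRUE)   # close the device even if print() errors
  print(qplot(EFF_DATE, value, data = x, facets = variable ~ ., geom = "line"))
  rm(x)
  gc()                             # ask R to return memory after each plot
  invisible(NULL)
}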
If anyone has an inkling as to why this is happening, I'd love to hear it. In case nobody does, can someone tell me whether there is a good heap-analysis tool I can run inside R to investigate where the memory is going, and whether there is anything I can do to clean up on the fly? I'd really rather not have to resort to debugging the binary.
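For reference, the only base-R instrumentation I know of is gc(), memory.profile(), and Rprof() with memory profiling enabled, roughly along the lines below (the output file name is just for illustration). I'm not sure it is fine-grained enough, hence the question about a proper heap tool.

Rprof("memprof.out", memory.profiling = TRUE)   # sample memory use alongside time
d_ply(molten, "PACKED", render, .progress = "tk")
Rprof(NULL)                                     # stop profiling
summaryRprof("memprof.out", memory = "both")    # timings plus memory columns

gc(verbose = TRUE)    # heap usage after a forced collection
memory.profile()      # counts of internal R object types currently allocated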
Best wishes, Graham.