> with(wnd[wnd$ARR_DELAY < 180,], smoothScatter(DISTANCE, ARR_DELAY))
Warning messages:
1: In FUN(newX[, i], ...) :
  Reached total allocation of 3981Mb: see help(memory.size)
2: In FUN(newX[, i], ...) :
  Reached total allocation of 3981Mb: see help(memory.size)
3: In KernSmooth::bkde2D(x, bandwidth = bandwidth, gridsize = nbin,  :
  Reached total allocation of 3981Mb: see help(memory.size)
4: In KernSmooth::bkde2D(x, bandwidth = bandwidth, gridsize = nbin,  :
  Reached total allocation of 3981Mb: see help(memory.size)
5: In smoothScatter(DISTANCE, ARR_DELAY) :
  Reached total allocation of 3981Mb: see help(memory.size)
6: In smoothScatter(DISTANCE, ARR_DELAY) :
  Reached total allocation of 3981Mb: see help(memory.size)

I have come across the "Reached total allocation" warning before; sometimes I restart R, and other times I just ignore it. I am still learning the basics and haven't paid much attention to this warning, but I would like to understand it a bit better. Should I bother to restart R, or can I ignore the warning and keep working? Thanks

Bhail
  • Basically, it means that your data is too big for the available memory and R isn't able to complete the requested processing using the full dataset. Restarting R wipes out the currently stored data (assuming you didn't save your R session), freeing up your RAM and allowing you to continue your work. Bottom line: you need more RAM. Being better about clearing out all unnecessary datasets/values may help in the short term, but if you need those datasets for additional processing in a pipeline, you're going to keep running into problems. – scribbles Aug 21 '15 at 19:29
  • It could also be that smoothScatter doesn't play well with the large number of points you're giving it. You could consider just using a sample of them. – Frank Aug 21 '15 at 19:32
  • @Frank - at its core, is smoothScatter not playing nice really any different? (not being snide) Is smoothScatter unable to accommodate large datasets for some reason (if so, I'm interested to learn more), or does it generate large amounts of intermediate data that grows with the size of the original dataset, causing R to bump up against RAM limitations that might not be seen with other methods? – scribbles Aug 21 '15 at 19:46
  • @scribbles I meant the "could also be" to mean that I don't know. I'm not intimately familiar with smoothScatter, but I see that it uses 2D kernel density estimation (which is what's throwing the warning), and I'm sure there are a wide variety of ways to do that memory-inefficiently. And yes, I think putting everything on hold to upgrade RAM would make quite a difference. Also, I just noticed that the 4 GB limit the OP is hitting could be coming from using 32-bit R, which has such a limit according to `help(memory.size)`... – Frank Aug 21 '15 at 20:00
  • Possible duplicate of [R memory limit warning vs "unable to allocate..."](http://stackoverflow.com/questions/15101045/r-memory-limit-warning-vs-unable-to-allocate) – Alex Jul 12 '16 at 04:50
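
Building on scribbles' and Frank's comments above, a rough sketch of how to check whether 32-bit R or a low allocation limit is behind the warning, and how to free memory without a full restart. The memory.size()/memory.limit() calls are Windows-only, and big_intermediate is a placeholder name, not an object from the question:

R.version$arch         # "i386" indicates 32-bit R, which caps usable memory (the ~4 GB ceiling Frank mentions)
memory.limit()         # Windows only: current allocation limit in MB (cf. help(memory.size))
memory.size()          # Windows only: memory currently used by R, in MB

rm(big_intermediate)   # placeholder: drop objects you no longer need
gc()                   # then run garbage collection so R releases the memory they held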
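
Frank's sampling suggestion in code form, as a minimal sketch: it assumes wnd is the data frame from the question, and the 100,000-point cap is an arbitrary choice rather than a value from the original post:

sub <- wnd[wnd$ARR_DELAY < 180, ]               # same filter as the failing call
n   <- min(nrow(sub), 100000)                   # cap the number of plotted points
sub <- sub[sample(nrow(sub), n), ]              # random subset of rows
with(sub, smoothScatter(DISTANCE, ARR_DELAY))

For a density plot like smoothScatter, a uniform random sample usually preserves the overall shape of the data, so the result should look close to the full-data version while using far less memory.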

0 Answers