
I have a Node process that I use to add key-value pairs to an object. When I get to roughly 9.88 million keys added, the process appears to hang. I assumed an out-of-memory issue, so I turned on --trace_gc and also put a check in the code that adds the keys:

// Bail out if the heap appears nearly full
const { heapTotal, heapUsed } = process.memoryUsage()
if ((heapUsed / heapTotal) > 0.99) {
  throw new Error('Too much memory')
}

That condition was never met, and the error was never thrown. As for the --trace_gc output, the last scavenge entry logged was:

[21544:0x104000000]  2153122 ms: Scavenge 830.0 (889.8) -> 814.3 (889.8) MB, 1.0 / 0.0 ms  allocation failure

Mark-sweep, however, continues logging this:

[21544:0x104000000]  3472253 ms: Mark-sweep 1261.7 (1326.9) -> 813.4 (878.8) MB, 92.3 / 0.1 ms  (+ 1880.1 ms in 986 steps since start of marking, biggest step 5.6 ms, walltime since start of marking 12649 ms) finalize incremental marking via task GC in old space requested

Is this output consistent with memory issues?

I should note that having to add this many keys to the object is an edge case; the count is normally more likely in the thousands. In addition, the keys are added during a streaming process, so I don't know at the outset how many will need to be added. So besides trying to figure out what the specific problem is, I'm also looking for a way to detect that the problem is likely to occur before the process hangs.
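One check I'm considering (a rough sketch only; nearHeapLimit is just a name I made up) compares heapUsed to V8's heap_size_limit from v8.getHeapStatistics(), since heapTotal itself grows as the heap expands, which may be why the ratio above never approaches 1:

// Rough sketch: compare live usage to V8's configured heap limit,
// which stays fixed, instead of heapTotal, which grows with the heap.
const v8 = require('v8')

function nearHeapLimit (threshold = 0.9) {
  const { used_heap_size, heap_size_limit } = v8.getHeapStatistics()
  return (used_heap_size / heap_size_limit) > threshold
}

// In the code that adds keys during the stream:
if (nearHeapLimit()) {
  throw new Error('Approaching the V8 heap limit')
}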

rgwozdz
  • It sounds like you're running into v8's default max memory size. See https://stackoverflow.com/questions/7193959/memory-limit-in-node-js-and-chrome-v8 – generalhenry May 03 '18 at 20:21
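If that is the cause, one possible workaround (assuming the machine has RAM to spare; the entry-point filename below is a placeholder) would be to raise V8's old-space limit when starting the process:

# Start Node with a 4 GB old-space limit instead of the default
node --max-old-space-size=4096 index.js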

0 Answers