I've run into a memory leak in a project and I've managed to recreate the problem in a simple example:
```js
const { Readable } = require("stream");

for (let i = 0; i < 10000000; i++) {
  const r = new Readable();
  r.push(Buffer.from("a"));
  if (i % 10000 === 0) {
    const memory = process.memoryUsage();
    console.log(memory.heapUsed, memory.heapTotal);
  }
}
```
Run it with `node --max-old-space-size=1024 test.js` so it doesn't eat all your RAM and crashes consistently once the heap limit is hit.
This is leaking memory on every iteration, but I have no idea why. Destroying the stream seemingly does nothing. I'm not storing any references to the data, so the GC should pick that up and clean up after each iteration, but it isn't?
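For reference, here's roughly what I mean by destroying the stream (a minimal sketch of what I tried, with an explicit `destroy()` after each push):

```js
const { Readable } = require("stream");

for (let i = 0; i < 10000000; i++) {
  const r = new Readable();
  r.push(Buffer.from("a"));
  r.push(null); // signal end-of-stream
  r.destroy();  // explicitly destroy the stream -- heap usage still climbs
  if (i % 10000 === 0) {
    const memory = process.memoryUsage();
    console.log(memory.heapUsed, memory.heapTotal);
  }
}
```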
Node 12.18.3
Update: This is the area of my actual project that has the issue. The project is a parser for replay files from an RTS game, intended to extract meaningful data from the replays, which I then store in a database. It's not time-critical, but I'd like it to be reasonably fast so it doesn't get backed up. There have been one or two problem replays from much longer games, containing far more data; that's when I noticed my parser has this memory problem.
Using the `--expose-gc` flag and calling `global.gc()` doesn't seem to help. Making the loop async and awaiting a small delay of ~10 ms every now and then completely sorts out the issue without holding it up too much, but it feels like a bad solution:
```js
const { Readable } = require("stream");

(async () => {
  for (let i = 0; i < 10000000; i++) {
    const r = new Readable();
    r.push(Buffer.from("a"));
    r.push(null);
    if (i % 10000 === 0) {
      const memory = process.memoryUsage();
      console.log(memory.heapUsed, memory.heapTotal);
      // global.gc(); // no luck
      await delay(10);
    }
  }
})();

function delay(ms) {
  return new Promise(resolve => {
    setTimeout(() => resolve(), ms);
  });
}
```
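Presumably it's the yield back to the event loop that matters rather than the 10 ms wait itself; here's the same workaround sketched with `setImmediate` instead of a timer (an assumption on my part, not something I've verified):

```js
const { Readable } = require("stream");

// yield to the event loop without a fixed timer delay
const tick = () => new Promise(resolve => setImmediate(resolve));

(async () => {
  for (let i = 0; i < 10000000; i++) {
    const r = new Readable();
    r.push(Buffer.from("a"));
    r.push(null);
    if (i % 10000 === 0) {
      const memory = process.memoryUsage();
      console.log(memory.heapUsed, memory.heapTotal);
      await tick();
    }
  }
})();
```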