
I have code that records camera input from the client on the server, like so:

const fs = require("fs");

async function saveIntoMp4(chunks) {
    const options = {
        type: "video/webm"
    };
    const blob = new Blob(chunks, options);
    chunks.length = 0; // drop the chunk references so they can be collected

    // Copy the blob's contents into a Node.js Buffer.
    const buffer = Buffer.from(await blob.arrayBuffer());

    // fs.writeFile is asynchronous: errors arrive in the callback rather
    // than as exceptions, so a try/catch around the call would never fire.
    fs.writeFile(
        `./videos/1.mp4`, // note: the data is WebM despite the .mp4 extension
        buffer,
        (error) => {
            if (error) console.log(error);
            else console.log("video is saved!");
        }
    );
}
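
For context, this is roughly how the function is invoked (a hypothetical call site; the socket event names are assumptions, not part of the original code):

const chunks = [];
socket.on("video-chunk", (chunk) => chunks.push(chunk)); // client sends WebM chunks
socket.on("video-stop", () => saveIntoMp4(chunks));      // finalize on stop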

However, the buffer takes up memory and doesn't let go of it on my Ubuntu 20.04.6 LTS server. Every time I run this function it occupies more and more memory. I have no lingering reference to the buffer, the chunks, or the blob anywhere (no console.log of them or anything of that kind). I am not sure why the memory is not released or what I should do about it. I have used Node.js versions 20 and 18.9 so far without any luck. Why doesn't blob.arrayBuffer() release the memory it used?
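
One way to check whether the memory really survives a full collection is a diagnostic like the following (a sketch, assuming the process is started with node --expose-gc; the call site is hypothetical):

const mib = (bytes) => (bytes / 1024 / 1024).toFixed(1);

global.gc(); // only defined when node runs with --expose-gc
console.log(`rss before: ${mib(process.memoryUsage().rss)} MiB`);

saveIntoMp4(chunks).then(() => {
    global.gc();
    console.log(`rss after: ${mib(process.memoryUsage().rss)} MiB`);
});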

Hypothesis
  • As you use chunks, you're repeatedly calling the saveIntoMp4() function, I believe, and each time it creates a new buffer, so memory usage will keep increasing until the garbage collector is triggered. The garbage collector will not release the memory immediately; it might take some time. By the way, rather than buffering the entire file in memory, you could consider a streaming approach to process and save the video data (see the sketch after these comments). – OrkhanGG Jun 21 '23 at 14:10
  • Have you redacted anything, such as saving the blob reference in the writeFile callback or a console.log of the buffer? I'd also recommend you inspect in Chrome devtools and take heap snapshots, since that should GC between snapshots, or force GC (https://stackoverflow.com/a/30654451/414062), but Chrome will be better for exploring the memory (you need to run the node process with the --inspect flag). – Dominic Jun 21 '23 at 14:13
  • No, the function that calls saveIntoMp4() runs only once each time. Chunks are pushed into a chunk array separately. The only way I see no leakage is by removing await blob.arrayBuffer() from the code, which is debilitating. – Hypothesis Jun 21 '23 at 14:13
  • @Dominic the same code works fine with no leakage on Windows. On Ubuntu I have tried servers from two different providers and a personal virtual machine; on all of them I see the massive leak. The entire ArrayBuffer remains in memory. – Hypothesis Jun 21 '23 at 14:14
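
A minimal sketch of the streaming approach OrkhanGG suggests (the ws source and file path are hypothetical; any chunk-emitting source would do):

const fs = require("fs");

// Append each incoming chunk to disk as it arrives instead of
// accumulating the whole recording in memory first.
const out = fs.createWriteStream("./videos/1.webm");
ws.on("message", (chunk) => out.write(chunk));
ws.on("close", () => out.end(() => console.log("video is saved!")));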

1 Answer


The issue you're experiencing with the memory not being released is likely related to the use of the Buffer.from() method. In Node.js, the Buffer class represents binary data held in memory, and when you use Buffer.from() on the blob's ArrayBuffer, memory is allocated to hold that data.

async function saveIntoMp4(chunks) {
  const options = {
    type: "video/webm"
  };
  const blob = new Blob(chunks, options);
  chunks.length = 0; // to stop any memory leaks

  const arrayBuffer = await blob.arrayBuffer();
  const buffer = Buffer.from(arrayBuffer);

  // Handle write errors in the callback; fs.writeFile does not throw.
  fs.writeFile(
    `./videos/1.mp4`,
    buffer,
    (error) => {
      if (error) {
        console.log(error);
        return;
      }
      console.log("video is saved!");
      buffer.buffer = null; // Explicitly release the memory
    }
  );
}

By clearing the reference with buffer.buffer = null, you ensure the memory can be reclaimed by Node.js's garbage collector.

Ramesh Kumar
  • Did you write this yourself? https://meta.stackoverflow.com/questions/421831/temporary-policy-generative-ai-e-g-chatgpt-is-banned?cb=1 – AKX Jun 21 '23 at 14:14
  • For the content only, not for the code. – Ramesh Kumar Jun 21 '23 at 14:18
  • I cannot use allocUnsafe instead of from because I get TypeError [ERR_INVALID_ARG_TYPE]: The "size" argument must be of type number. Received an instance of ArrayBuffer (see the sketch after these comments). – Hypothesis Jun 21 '23 at 20:27
  • Use buffer.buffer = null; to release the memory occupied by the buffer in your code, in place of buffer.fill(0);. – Ramesh Kumar Jun 22 '23 at 02:25
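
On the allocUnsafe error above: Buffer.allocUnsafe() takes a byte length, not an ArrayBuffer, so the bytes have to be copied in separately. A minimal sketch (an illustration, not code from the question or answer):

// Inside an async function with access to the blob:
const arrayBuffer = await blob.arrayBuffer();
const buffer = Buffer.allocUnsafe(arrayBuffer.byteLength); // size, not data
buffer.set(new Uint8Array(arrayBuffer)); // Buffer is a Uint8Array subclass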