
I have the following code:

    let idx = 0;
    for (const e of parsedData) {
      try {
        const datastore = new Datastore({
          namespace: 'bdlight',
          projectId: e.credential.project_id,
          credentials: {
            type: e.credential.type,
            private_key: e.credential.private_key,
            token_url: e.credential.token_uri,
            client_email: e.credential.client_email,
          },
        });

        this.logger.log(
          `Getting Registration - CNS: ${
            e.cns
          } - CNPJCPF: ${documentNumber} - ${idx + 1}/${parsedData.length}`,
        );

        const query = datastore
          .createQuery('geral')
          .filter('CNPJCPF', '=', documentNumber);

        const [result] = await datastore.runQuery(query);

        registrations.push(...(result ? result : []));
      } catch {
        this.logger.log('Error CNS: ' + e.cns);

        errors.push('Erro no CNS: ' + e.cns);
      } finally {
        idx++;
      }
    }

`parsedData` contains more than 300 credentials. When I run this in a Kubernetes pod with 4096 MiB of RAM, the pod fails with a memory error. Can I run the garbage collector manually after each iteration?

I tried to set datastore as null after each iteration.
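For context, V8 only exposes a manual collection hook when Node is started with the `--expose-gc` flag; without it, `global.gc` is `undefined`. A guarded call might look like this (the `maybeGc` helper name is illustrative, not part of the code above):

```javascript
// Run the process with: node --expose-gc app.js
// Without that flag, global.gc is undefined and V8 collects on its own schedule.
function maybeGc() {
  if (typeof global.gc === 'function') {
    global.gc(); // triggers a synchronous full collection
    return true;
  }
  return false; // flag not set; nothing to do
}
```

Note that forcing collection only helps if the objects are actually unreachable; it cannot free an array that is still being appended to.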

  • What is the actual error you're getting word for word because I don't see any actual leak here? Can you show more context for how this function is used? – jfriend00 Mar 19 '23 at 21:38
  • This statement `const [result] = await datastore.runQuery(query);` should allow the garbage collector to run if it decides that it needs to because that pauses the execution of this code and returns control to the event loop. – jfriend00 Mar 19 '23 at 21:44
  • You'll probably be better off creating the datastore object outside the loop. However, given that you are storing all the results in your registration array, that's likely your largest source of memory use. – Jim Morrison Mar 19 '23 at 21:54
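A minimal sketch of the second suggestion above, assuming each batch of rows can be handled as it arrives instead of being accumulated (`processBatches`, `handleRow`, and the `batches` shape are all illustrative stand-ins for the real `runQuery` results):

```javascript
// Illustrative sketch: consume each query result immediately rather than
// pushing everything into one long-lived `registrations` array, so each
// batch becomes unreachable (and collectable) before the next credential
// is processed.
async function processBatches(batches, handleRow) {
  let processed = 0;
  for (const rows of batches) {
    for (const row of rows) {
      handleRow(row); // e.g. write to a stream or database right away
      processed++;
    }
    // `rows` is no longer referenced past this point, so V8 can reclaim it
  }
  return processed;
}
```

With 300+ result sets, peak memory then scales with the largest single batch instead of the sum of all of them.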

0 Answers