
I have a CSV file with more than 600K records and I need to write them to Cloud Firestore.

I tried with:

// Starts all 600K writes at once; every pending promise (and its result)
// stays in memory until Promise.all settles.
Promise.all(data.map(docId => {
  return db
    .collection(PROMO_CODES)
    .doc(docId)
    .set({
      name: RULE_NAME
    })
    .then(() => docId)     // resolve with the ID on success
    .catch(() => [docId]); // resolve with [ID] to mark a failure
}))

But I got a memory leak. Any idea what the best way to do this is?

  • I am not sure what the question is here; are you asking about the memory leak? Or are you asking if it's wise to load 600k worth of data into your device's memory at once (no), what is causing the memory leak, or how to write that many documents in spite of the memory leak? Is there an option to simply divide the data into smaller chunks and upload those? – Jay Feb 04 '23 at 17:00

1 Answer


Trying to write 600K documents at once in Firestore will most likely not succeed. Mapping over the entire array with Promise.all starts all 600K writes simultaneously and keeps every pending promise in memory, which is the most likely cause of the memory problem you're seeing. To solve that, you should write the data in smaller chunks. Also keep in mind that a single document is limited to a maximum of 1 MiB of storage, so packing all the records into one document is not an option either.
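
For example, here is a minimal sketch of that chunked approach using batched writes, assuming data is the array of document IDs parsed from the CSV and db, PROMO_CODES, and RULE_NAME are the same as in your question. Note that a Firestore batch is capped at 500 operations:

// Minimal sketch: commit the writes in chunks of 500 (Firestore's
// per-batch limit), waiting for each chunk before starting the next,
// so only one chunk's worth of pending writes is in memory at a time.
const BATCH_SIZE = 500;

async function writeInChunks(db, data) {
  for (let i = 0; i < data.length; i += BATCH_SIZE) {
    const batch = db.batch();
    for (const docId of data.slice(i, i + BATCH_SIZE)) {
      batch.set(db.collection(PROMO_CODES).doc(docId), { name: RULE_NAME });
    }
    await batch.commit(); // resolves once the whole chunk is written
  }
}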

So the best option you have is to write each record in the CSV file as a separate document in a Firestore collection. Besides that, since you're using Node.js, you should also consider using BulkWriter, which the documentation describes as:

A Firestore BulkWriter that can be used to perform a large number of writes in parallel.

So I think that you should take advantage of that.
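
A minimal sketch of that approach could look like this, under the same assumptions about data, PROMO_CODES, and RULE_NAME as above. BulkWriter throttles the writes and retries failures for you:

async function writeWithBulkWriter(db, data) {
  const bulkWriter = db.bulkWriter();

  for (const docId of data) {
    bulkWriter
      .set(db.collection(PROMO_CODES).doc(docId), { name: RULE_NAME })
      .catch(err => console.error(`Failed to write ${docId}:`, err));
  }

  // close() flushes everything still queued and resolves once all
  // pending writes have completed.
  await bulkWriter.close();
}

With 600K records, you may still want to stream the CSV row by row instead of loading it all into an array first, but the idea is the same.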

– Alex Mamo