
I want to create a new collection and add thousands of documents, each sized ~1-2 KB, to it. I already have the data in JSON, so I thought this would be easy.

I understand that a batch can have 500 writes at a time, so I wrote the following code to break the data into chunks of 500. For testing purposes I am running it with chunks of 20, and my test JSON has 72 objects.

But I keep getting the following error:

node_modules\@google-cloud\firestore\src\write-batch.js:148
  throw new Error('Cannot modify a WriteBatch that has been committed.');
  ^

Error: Cannot modify a WriteBatch that has been committed.

My code is as follows:

var dataObj = JSON.parse(fs.readFileSync('./bigt.json'))
var tmpdd = dataObj.slice(0, 72)
var batch = db.batch();

console.log(tmpdd.length)

let tc = tmpdd.length
let lc = 0
let upperLimit = 20, dd = null

while(lc<=tc){

    dd = tmpdd.slice(lc, upperLimit )

    console.log(lc, upperLimit)
    dd.map(
        o => batch.set(db.collection('nseStocks').doc(o.Date + o.variable), o)
    )

    batch.commit().then(function () {
        console.log('Written to firestore', lc, lc + upperLimit)
    })
    .catch(
        (err) => console.log('Fail', err)
    )

    lc = upperLimit
    upperLimit = upperLimit + 20 

}

Also, it's weird that the batch doesn't seem to be committed in every iteration of the loop. Ideally I would let Firestore determine document IDs, but apparently a batch does not have an add function.

I have tried adding documents in a loop instead of doing batch writes, but that gives me a timeout error after adding a few documents. And of course it's not practical for a large number of documents.

As you can probably tell, I am very new to Firestore; it's my second day playing with it.

Please let me know if there are any obvious mistakes or better ways of achieving this seemingly simple task.

Thanks

kaustubh shinde

2 Answers


You're creating a single batch at the top level of your program, before the loop. That one batch is then reused for all the calls to batch.set() across every iteration, and it's committed on the first pass, which is why later iterations fail with "Cannot modify a WriteBatch that has been committed."

var batch = db.batch();

Instead, create a new batch for each set of writes. You can do that at the top of your while loop:

while(lc<=tc) {
    var batch = db.batch();
    // use the new batch here
}
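Applied to the question's setup, the chunking itself can be factored into a small helper. This is a sketch assuming the question's `db` handle, `tmpdd` array, and `nseStocks` collection; the `chunk` function below is a generic utility, not part of the Firestore API:

```javascript
// Split an array into chunks of at most `size` elements.
function chunk(arr, size) {
    const out = [];
    for (let i = 0; i < arr.length; i += size) {
        out.push(arr.slice(i, i + size));
    }
    return out;
}

// Hypothetical usage with the question's data (`db` and `tmpdd` as in the
// question). Note the fresh batch created per chunk:
//
// const commits = chunk(tmpdd, 500).map(group => {
//     const batch = db.batch(); // new WriteBatch for each chunk
//     group.forEach(o =>
//         batch.set(db.collection('nseStocks').doc(o.Date + o.variable), o)
//     );
//     return batch.commit();
// });
//
// Promise.all(commits)
//     .then(() => console.log('All batches written'))
//     .catch(err => console.error('Batch write failed', err));
```

Collecting the `commit()` promises and waiting on `Promise.all` also avoids the question's problem of not knowing when (or whether) each chunk finished writing.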
Doug Stevenson
  • 297,357
  • 32
  • 422
  • 441

This works for me:

function loadJson() {
    var db = firebase.firestore();
    var ref = db.collection("my-parent-collection-name");
    getJSON("https://my-target-domain.com/my-500-plus-objects-file.json").then(function (data) {
        var counter = 0;
        var commitCounter = 0;
        var batches = [];
        batches[commitCounter] = db.batch();
        Object.keys(data).forEach(function (k) {
            // Start a fresh batch every 500 writes (Firestore's per-batch limit).
            if (counter === 500) {
                counter = 0;
                commitCounter = commitCounter + 1;
                batches[commitCounter] = db.batch();
            }
            batches[commitCounter].set(ref.doc(k), data[k]);
            counter = counter + 1;
        });
        for (var i = 0; i < batches.length; i++) {
            batches[i].commit().then(function () {
                console.count('wrote batch');
            });
        }
    }, function (status) {
        console.log('failed', status);
    });
}
loadJson();
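On the question's side note that WriteBatch has no add(): you can still let Firestore generate document IDs by calling doc() with no arguments, which returns a reference with a random auto-generated ID that batch.set() can write to. For illustration only, here is a rough local imitation of what such an ID looks like (a 20-character alphanumeric string); real code should simply call `collection.doc()`:

```javascript
// Rough imitation of a Firestore-style auto ID: 20 random characters
// drawn from an alphanumeric alphabet. Illustrative only.
function autoId() {
    const chars =
        'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
    let id = '';
    for (let i = 0; i < 20; i++) {
        id += chars.charAt(Math.floor(Math.random() * chars.length));
    }
    return id;
}

// In actual Firestore code, auto IDs come from doc() with no arguments:
//
// const batch = db.batch();
// tmpdd.forEach(o => {
//     const ref = db.collection('nseStocks').doc(); // auto-generated ID
//     batch.set(ref, o);
// });
// batch.commit();
```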
Ronnie Royston