
I'm using Android (Java) and Firestore. I want to run multiple batched writes asynchronously, so that if one batch write fails, all the other write batches fail as well. This is what I did:

btn_batch.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        CollectionReference facultyRef = db.collection("Faculty");
        List<WriteBatch> writeBatches = new ArrayList<>();
        writeBatches.add(db.batch());
        int counter = 0, batchIndex = 0;

        for (int i = 1; i <= 10000; i++) {
            String username = "user" + i;
            String password = UUID.randomUUID().toString().substring(0, 7);
            User user = new User(username, password);

            writeBatches.get(batchIndex).set(facultyRef.document(), user);
            counter++;

            // Start a new batch before reaching the 500-operation limit
            if (counter == 499) {
                writeBatches.add(db.batch());
                counter = 0;
                batchIndex++;
            }
        }

        commitBatches(writeBatches);
    }
});

private void commitBatches(List<WriteBatch> writeBatches) {
    if (writeBatches.size() > 0) {
        WriteBatch writeBatch = writeBatches.get(0);
        writeBatch.commit()
                .addOnCompleteListener(new OnCompleteListener<Void>() {
                    @Override
                    public void onComplete(@NonNull Task<Void> task) {
                        if (task.isSuccessful()) {
                            // Commit the next batch only after the previous one succeeds
                            writeBatches.remove(0);
                            commitBatches(writeBatches);
                        } else {
                            Toast.makeText(MainActivity.this, task.getException().getMessage(), Toast.LENGTH_SHORT).show();
                        }
                    }
                });
    } else {
        Toast.makeText(this, "Batched Write Success", Toast.LENGTH_SHORT).show();
    }
}

My problem is that this code commits the batches sequentially, one after another, so it is very slow and not reliable. Is there any way to run these commits asynchronously? Please help me.

1 Answer

The batch operation is by definition atomic. This means that either all operations in the batch succeed, or none of them are applied.
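
For illustration, here is a minimal sketch of a single atomic batch (the document IDs and field values below are just placeholders); either all three writes are applied, or none of them are:

WriteBatch batch = db.batch();

// These writes succeed or fail together
batch.set(db.collection("Faculty").document("doc1"), new User("user1", "pass1"));
batch.update(db.collection("Faculty").document("doc2"), "password", "newPassword");
batch.delete(db.collection("Faculty").document("doc3"));

batch.commit()
        .addOnSuccessListener(aVoid -> Log.d("TAG", "Batch committed"))
        .addOnFailureListener(e -> Log.w("TAG", "Batch failed, nothing was written", e));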

As I understand from your question, you need a batch of batches, which is actually not possible. If you have 500 operations in one batch and another 500 operations in another batch, each batch is atomic on its own, but only separately. You cannot merge 1,000 operations into a single batch.

Edit:

The WriteBatch#commit() method returns an object of type Task<Void>. This means that you can do something like this:

Task<Void> firstTask = firstWriteBatch.commit();
Task<Void> secondTask = secondWriteBatch.commit();

Task<List<Object>> combinedTask = Tasks.whenAllSuccess(firstTask, secondTask)
        .addOnSuccessListener(/* ... */);

Or you can pass a list of Task objects.
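
For example, assuming the writeBatches list is filled the same way as in your question, a sketch of committing all of them at once could look like this:

List<Task<Void>> commitTasks = new ArrayList<>();
for (WriteBatch writeBatch : writeBatches) {
    commitTasks.add(writeBatch.commit()); // every commit starts immediately, in parallel
}

Tasks.whenAllSuccess(commitTasks)
        .addOnSuccessListener(results ->
                Toast.makeText(MainActivity.this, "Batched Write Success", Toast.LENGTH_SHORT).show())
        .addOnFailureListener(e ->
                Toast.makeText(MainActivity.this, e.getMessage(), Toast.LENGTH_SHORT).show());

This way all commits run in parallel rather than one after another, which addresses the speed concern. It only reports whether every commit succeeded, though; it does not roll back batches that were already committed.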

  • So how can I write to more than 500 documents? – Vince Ybañez Sep 19 '22 at 13:51
  • You cannot add more than 500 operations to a single batch. If you need more than that, you have to create multiple distinct batches; for example, 10k operations require 20 batches. But in that case, if an operation fails in the 20th batch, the other 19 are already committed and there is nothing you can do about them. That's why you should most likely do this in a trusted environment, which is why I recommended Cloud Functions. – Alex Mamo Sep 19 '22 at 14:00
  • Let's say I have 10k operations, divided them into 500 operations per batch, and stored each batch in a list of WriteBatch objects. How do I commit those batches at the same time? – Vince Ybañez Sep 19 '22 at 14:59
  • In that case, please check my updated answer. – Alex Mamo Sep 19 '22 at 15:05
  • I implemented your code and it's working better than my recursive approach. Will it guarantee that all batched writes fail if at least one batched write fails? – Vince Ybañez Sep 19 '22 at 15:29
  • With the recursive approach, yes, the batches are committed one after another; this way, they are all committed at once. But no, it does not guarantee that all batched writes fail if one of them fails; it's just another way of performing multiple operations at once. – Alex Mamo Sep 19 '22 at 15:31
  • Can I roll back all batched writes if one batched write fails? How? – Vince Ybañez Sep 19 '22 at 15:34
  • You cannot do that. – Alex Mamo Sep 19 '22 at 15:40