I'm trying to figure out if there's a reasonable way of doing this:
My problem:
I'm exceeding my daily quota for reads in Firestore pretty fast.
My database and what I do:
My database looks like this (simplified):
sessions: { // collection
  sessionId: { // document
    users: { // collection
      userId: { // document
        id: string
        items: { // collection
          itemId: trackObject
        }
      }
    }
  }
}
Now I want to retrieve, for one session, all users and their items. Most sessions have 2-3 users, but some users have around 3,000 items. I basically want to end up with an array like this:
[
  {
    userId,
    items: [
      ...items
    ],
  },
  ...users
]
How I go about it currently:
So I get all users:
const usersRef = db.collection(`sessions/${sessionId}/users`);
const userSnapshots = await usersRef.get();
const userDocs = userSnapshots.docs;
Then, for each user, I retrieve their items (I use a for-loop for this, which can be discussed, but anyhow):
const user = userDocs[i].data();
const itemsRef = usersRef.doc(user.id).collection('items');
const itemSnapshots = await itemsRef.get();
const items = itemSnapshots.docs;
Finally I retrieve the actual items through a map:
user.items = items.map(doc => doc.data());
return user;
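Put together, the whole fetch looks roughly like this (a simplified sketch of what I'm doing; the wrapping function and the lack of error handling are just for illustration, and db is an already-initialized Firestore instance):

// Simplified sketch of the fetch described above.
async function getSessionUsers(sessionId) {
  const usersRef = db.collection(`sessions/${sessionId}/users`);
  const userSnapshots = await usersRef.get();

  const users = [];
  for (const userDoc of userSnapshots.docs) {
    const user = userDoc.data();
    // Fetching the subcollection reads every item document individually.
    const itemSnapshots = await usersRef.doc(user.id).collection('items').get();
    user.items = itemSnapshots.docs.map(doc => doc.data());
    users.push(user);
  }
  return users;
}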
My theory:
So it looks like if I do this on a session where a user has 3,000 items, the code performs roughly 3,000 read operations on Firestore (one per item document, plus a few for the user documents). After just 17 runs (17 × 3,000 = 51,000 reads) I eat up my 50,000 free read operations per day.
This reasoning is somewhat based on this answer.
My question:
Is there any other way of doing this? Like getting all the items (tracks) in one read call? Should I try to fit all the items into an array field on the user document instead of storing them as a subcollection (sketched below)? Or is the free tier of Firestore simply not designed for retrieving this many documents in one go?
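To make the array idea concrete, this is roughly what I have in mind (a hypothetical sketch, not something I've tested; I'm also not sure 3,000 track objects would stay under Firestore's 1 MiB document size limit):

// Hypothetical alternative: store the items as an array field on the user document,
// so reading a user costs one document read instead of one read per item.
await db.doc(`sessions/${sessionId}/users/${userId}`).set(
  {
    id: userId,
    items: [/* ...trackObjects... */],
  },
  { merge: true }
);

// Reading all users (with their embedded items) is then one read per user document.
const userSnapshots = await db.collection(`sessions/${sessionId}/users`).get();
const users = userSnapshots.docs.map(doc => doc.data());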