I run a query like this:
var snapshots = await this.db
    .collection(AppInit.chatsref)
    .where('members', arrayContains: uid)
    .orderBy('updated_at', descending: true)
    .get();
// Check cache.
print(snapshots.metadata.isFromCache ? "Cached" : "Not Cached");
This always prints 'Not Cached'. The cache is only used if I explicitly set the source to Source.cache:
var snapshots = await this.db
    .collection(AppInit.chatsref)
    .where('members', arrayContains: uid)
    .orderBy('updated_at', descending: true)
    .get(const GetOptions(source: Source.cache));
// Check cache.
print(snapshots.metadata.isFromCache ? "Cached" : "Not Cached");
Then I do get content from the cache. I thought Firestore handled this automatically; do I always need to set the source myself? The same thing happens for both the Firestore future (get()) and the stream (snapshots()).
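For the stream side, here is a minimal sketch of how I listen, assuming the same this.db, AppInit.chatsref, and uid as above; passing includeMetadataChanges: true to snapshots() makes the cache-to-server transition visible:

final sub = this.db
    .collection(AppInit.chatsref)
    .where('members', arrayContains: uid)
    .orderBy('updated_at', descending: true)
    .snapshots(includeMetadataChanges: true)
    .listen((snapshot) {
  // With offline persistence enabled, this typically fires first
  // with isFromCache == true, then again once the server responds.
  print(snapshot.metadata.isFromCache ? 'Cached' : 'Not Cached');
});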
I want Firestore to use the cache as much as possible. It's a chat app, so I hate that the UI sits in a loading state even for a second.
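As a workaround I'm considering a cache-first read that falls back to the server, roughly like this (a sketch: getChats is a hypothetical helper on the same class as this.db, and it assumes an empty cached result means nothing is cached yet):

Future<QuerySnapshot<Map<String, dynamic>>> getChats() async {
  final query = this.db
      .collection(AppInit.chatsref)
      .where('members', arrayContains: uid)
      .orderBy('updated_at', descending: true);
  // Try the local cache first so the UI can render immediately.
  final cached = await query.get(const GetOptions(source: Source.cache));
  if (cached.docs.isNotEmpty) {
    return cached;
  }
  // Nothing cached yet; fall back to the network.
  return query.get(const GetOptions(source: Source.server));
}

Is something like this necessary, or should the default Source.serverAndCache already behave this way?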