I have an Angular application connected to a Firebase database. The application will display markers on a Google Map, and the goal is to have it working as a Progressive Web App with offline capabilities.
I was counting on AngularFirestore to handle the offline part.
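For context, offline persistence with AngularFirestore is enabled at module setup; here is a minimal sketch, assuming the module-based @angular/fire API and a standard environment.firebase config object:

// app.module.ts — minimal sketch, assuming the module-based @angular/fire
// setup and a standard environment.firebase config object.
import { NgModule } from '@angular/core';
import { AngularFireModule } from '@angular/fire';
import { AngularFirestoreModule } from '@angular/fire/firestore';
import { environment } from '../environments/environment';

@NgModule({
  imports: [
    AngularFireModule.initializeApp(environment.firebase),
    // Enables Firestore's IndexedDB-backed offline cache
    // (it defaults to about 40 MB unless configured otherwise).
    AngularFirestoreModule.enablePersistence()
  ]
})
export class AppModule {}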
My first test worked nicely, but when I imported my real data, Chrome stopped with the warning "Paused before potential out-of-memory crash".
Here is my current service that fetches my Spots:
import { Injectable } from '@angular/core';
import { Store } from '@ngrx/store';
import { Subscription } from 'rxjs';
import { map } from 'rxjs/operators';
import { AngularFirestore } from '@angular/fire/firestore';

import * as fromSpots from './spots.reducer';
import { Spot } from './spot.model';
import { SetSpots } from './spots.actions';

@Injectable({
  providedIn: 'root'
})
export class SpotsService {
  private firebaseSubs: Subscription[] = [];

  constructor(
    private db: AngularFirestore,
    private store: Store<fromSpots.State>
  ) {}

  fetchSpots() {
    this.firebaseSubs.push(
      this.db
        .collection('spots')
        .snapshotChanges()
        .pipe(
          // Map each document snapshot to a Spot, keeping the document id.
          map(docArray =>
            docArray.map(doc => ({
              id: doc.payload.doc.id,
              ...doc.payload.doc.data()
            } as Spot))
          )
        )
        .subscribe(
          (spots: Spot[]) => {
            console.log(spots);
            this.store.dispatch(new SetSpots(spots));
          },
          error => {
            // TODO: handle the error properly
            console.error(error);
          }
        )
    );
  }
}
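Note that collection() also accepts a query function, so the snapshot could be capped rather than streaming the whole collection; a minimal sketch (the limit of 500 is an arbitrary placeholder, not a tuned value):

// Sketch: same query as above, but with a bounded result set.
// The limit of 500 is a placeholder, not a tuned value.
this.db
  .collection('spots', ref => ref.limit(500))
  .snapshotChanges()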
I think it crashes while doing the map, because the console.error in the error handler is never reached.
Some metrics: ~30,000 documents, for a total size of 185 MB (I know that's a lot; I didn't think it would get that big initially).
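One direction that might be relevant is cursor-based paging instead of one giant snapshot; a rough sketch, where the ordering field 'name', the page size, and the untyped cursor are all hypothetical:

// Sketch: walk the collection in pages instead of one big snapshot.
// The ordering field 'name' and the page size are hypothetical.
private lastVisible: any; // cursor into the previous page (untyped for brevity)

fetchSpotsPage() {
  return this.db
    .collection<Spot>('spots', ref => {
      let query = ref.orderBy('name').limit(500);
      if (this.lastVisible) {
        // Resume after the last document of the previous page.
        query = query.startAfter(this.lastVisible);
      }
      return query;
    })
    .snapshotChanges();
}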
So my question is: how do you handle offline data coming from a big database such as Firestore?