Was wondering if there are any common practices for backing up a Firebase DB. My concern is some process accidentally wiping out our database.
Thanks!
As of the time of this question, Firebase backs up all instances daily. So while keeping your own backups may still be useful, it's not essential.
To create your own backups, you can simply curl the data:
curl 'https://<instance>.firebaseio.com/.json?format=export'
Note that for multiple gigabytes of data, this will slow things down and lock read access for a short period. It would be better in this case to chunk the backups and work with smaller portions. The shallow parameter can help here by providing a list of keys for any given path in Firebase, without having to fetch the data first.
curl 'https://<instance>.firebaseio.com/.json?shallow=true'
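For example, here is a minimal sketch of a chunked backup built on the shallow parameter: list the top-level keys, then export each child path on its own. It assumes jq is installed, that keys contain no whitespace, and that AUTH holds a valid database secret or token (all of these are my placeholders, not part of the original answer):
#!/bin/bash
# Chunked-backup sketch: fetch top-level keys via ?shallow=true,
# then export each child path separately.
DB="https://<instance>.firebaseio.com"
AUTH="<database-secret>"   # hypothetical placeholder

for key in $(curl -s "$DB/.json?shallow=true&auth=$AUTH" | jq -r 'keys[]'); do
  curl -s "$DB/$key.json?format=export&auth=$AUTH" > "backup_$key.json"
done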
As previously mentioned, there are also several GitHub libs available for this, and incremental backups are practical with some creativity and a worker thread on the real-time SDK.
There are now "Import Data" and "Export Data" buttons on the data page of the web interface for every project, so you can back up your data with a button click!
I just yesterday wrote a shell script, which utilizes firebase-tools (npm install -g firebase-tools), in order to have these database dumps contained within my regular backup cronjob:
#!/bin/bash
# $1 is the Firebase projectId.
# $2 is the destination directory.
# example usage: cron_firebase.sh project-12345 /home/backups/firebase
# currently being triggered by /etc/cron.hourly/firebase-hourly.cron
PROJECTID="$1"
DESTINATION="$2"
FIREBASE="$(which firebase)"
NOW="$(date +"%Y-%m-%d_%H%M")"
# quoting guards against paths with spaces; bail out if the directory is missing
cd "$DESTINATION" || exit 1
"$FIREBASE" --project "$PROJECTID" database:get / > "./$PROJECTID.$NOW.json"
tar -pczf "$PROJECTID.$NOW.tar.gz" "./$PROJECTID.$NOW.json" && rm "./$PROJECTID.$NOW.json"
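For completeness, the hourly trigger can be a one-line wrapper like this sketch (the script path is my assumption; the project ID and destination are just the ones from the example usage above):
#!/bin/sh
# /etc/cron.hourly/firebase-hourly.cron -- calls the backup script above
/home/backups/cron_firebase.sh project-12345 /home/backups/firebase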
Update: in the meantime, one can auto backup to a Google Cloud Storage bucket: go to the Firebase Console -> Realtime Database -> and click the Backups tab.
It is now possible to back up and restore Firebase Firestore using the Cloud Firestore managed export and import service.
You do it as follows:
Set up gcloud for your project using gcloud config set project [PROJECT_ID]
EXPORT
Export all by calling
gcloud alpha firestore export gs://[BUCKET_NAME]
Or export a specific collection using
gcloud alpha firestore export gs://[BUCKET_NAME] --collection-ids='[COLLECTION_ID_1]','[COLLECTION_ID_2]'
IMPORT
Import all by calling
gcloud alpha firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/
where [BUCKET_NAME] and [EXPORT_PREFIX] point to the location of your export files. For example - gcloud alpha firestore import gs://exports-bucket/2017-05-25T23:54:39_76544/
Import a specific collection by calling:
gcloud alpha firestore import --collection-ids='[COLLECTION_ID_1]','[COLLECTION_ID_2]' gs://[BUCKET_NAME]/[EXPORT_PREFIX]/
Full instructions are available here: https://firebase.google.com/docs/firestore/manage-data/export-import
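To run the export on a schedule, a nightly cron job could look like this sketch (the project ID, bucket name, and schedule below are placeholders of mine, not from the docs):
#!/bin/bash
# firestore_export.sh -- nightly managed export, e.g. via cron:
# 0 23 * * * /home/backups/firestore_export.sh
gcloud config set project my-project-id
gcloud alpha firestore export "gs://my-exports-bucket/$(date +%Y-%m-%d)"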
Just to expand on @kato's answer using curl.
I was looking for ways to run the command every night. My solution:
1) Created a Compute Engine instance (basically a VM) in Google Cloud. You might be familiar with EC2 if you come from the AWS world.
2) Wrote a simple cronjob, something like this:
0 23 * * * /usr/bin/curl https://yourdatabaseurl.com/.json?format=export -o /tmp/backuptest_`date +\%d\%m\%y`.bk
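Backups will pile up in /tmp over time; a companion cleanup entry could look like this (the 30-day retention is an arbitrary assumption of mine):
0 1 * * * /usr/bin/find /tmp -name 'backuptest_*.bk' -mtime +30 -delete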
I am sure there might be a simpler way to do this within the free tier itself, like using Cloud Functions.