Does the Google Firestore database service provide a backup? If so, how do I back up the database and how do I restore it in case of an error?
-
It's a beta product at the moment and does not yet offer any backup tool. So you'll have to write one yourself. Ref: https://groups.google.com/forum/#!topic/firebase-talk/5r3xeda07ek – Brahma Dev Oct 14 '17 at 16:17
-
It's been GA for a few years now, but there is still no backup option. This makes me skeptical of using Firestore in production apps. – Siddharth Kamaria Mar 14 '21 at 06:51
10 Answers
Update: It is now possible to backup and restore Firebase Firestore using Cloud Firestore managed export and import service
You do it by:
Create a Cloud Storage bucket for your project - make sure it's either a regional bucket in us-central1 or us-central2, or a multi-regional bucket
Set up gcloud for your project using
gcloud config set project [PROJECT_ID]
EXPORT
Export all by calling
gcloud firestore export gs://[BUCKET_NAME]
Or export a specific collection using
gcloud firestore export gs://[BUCKET_NAME] --collection-ids='[COLLECTION_ID_1]','[COLLECTION_ID_2]'
IMPORT
Import all by calling
gcloud firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/
where [BUCKET_NAME] and [EXPORT_PREFIX] point to the location of your export files. For example - gcloud firestore import gs://exports-bucket/2017-05-25T23:54:39_76544/
Import a specific collection by calling:
gcloud firestore import --collection-ids='[COLLECTION_ID_1]','[COLLECTION_ID_2]' gs://[BUCKET_NAME]/[EXPORT_PREFIX]/
Full instructions are available here: https://firebase.google.com/docs/firestore/manage-data/export-import
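If you run these exports on a schedule, it helps to date-stamp the destination prefix so that each run doesn't overwrite the previous one. A minimal sketch in Python (the bucket name is a placeholder; actually running the command requires an installed and authenticated gcloud CLI):

```python
import datetime
import subprocess

def export_command(bucket):
    """Build a date-stamped `gcloud firestore export` invocation.

    Stamping the destination prefix with the current UTC time keeps
    repeated runs from overwriting earlier exports.
    """
    prefix = datetime.datetime.now(datetime.timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")
    return ["gcloud", "firestore", "export", f"gs://{bucket}/{prefix}"]

# Run the export (hypothetical bucket name; needs an authenticated gcloud CLI):
# subprocess.run(export_command("my-exports-bucket"), check=True)
```

The same date-stamped prefix is what you then pass to `gcloud firestore import` when restoring.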

-
Is there a way to backup the database automatically every X hours? Basically, how do I hook it up with a cron job? – codefx Apr 24 '19 at 00:10
-
@codefx take a look at this https://firebase.google.com/docs/functions/schedule-functions – Bartek Pacia Apr 27 '19 at 16:42
-
I wrapped this into a npm package to be deployed to Firebase Cloud Functions. Check my answer out for details: https://stackoverflow.com/a/56618618/3763626 – crysxd Jun 16 '19 at 12:08
-
This is now available in gcloud beta components. Documentation here: https://firebase.google.com/docs/firestore/manage-data/export-import. – RDRR Jul 08 '19 at 09:19
-
@codefx you can write a bash script to execute the gcloud command, then schedule it from your crontab on Ubuntu. Just be sure to set the PATH variable at the top of your crontab to allow it to execute gcloud commands. – RDRR Jul 08 '19 at 09:43
-
As per what @RDRR mentioned, please update your answer to prevent confusion for new visitors. – Kelvin Low Jul 29 '19 at 08:39
-
There's a page in the docs with explicit instructions for scheduling periodic exports of Firestore data https://firebase.google.com/docs/firestore/solutions/schedule-export – johnnycopes Nov 25 '21 at 14:51
Update July 2018: Cloud Firestore now supports managed import and export of data. See the documentation for more details:
https://firebase.google.com/docs/firestore/manage-data/export-import
[Googler here] No, right now we do not offer a managed backup or import/export service. This is something we will definitely offer in the future, we just did not get it ready for the initial Beta release.
The best way to back up right now is to write your own script using our Java/Python/Node.js/Go server SDKs; it should be fairly straightforward to download all documents from each collection and write them back if you need to.
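As a sketch of that roll-your-own approach (not an official tool): the download/restore loops below are duck-typed, so `db` can be the client returned by `firebase_admin.firestore.client()` or anything with the same `collection(...).stream()` shape. Note this only covers top-level collections; subcollections would need their own traversal.

```python
def dump_collections(db, collection_names):
    """Download every document in the named top-level collections.

    `db` is assumed to be a Firestore client (e.g. the one returned by
    firebase_admin.firestore.client()); subcollections are NOT traversed.
    """
    backup = {}
    for name in collection_names:
        backup[name] = {doc.id: doc.to_dict() for doc in db.collection(name).stream()}
    return backup


def restore_collections(db, backup):
    """Write a dict produced by dump_collections back, overwriting by document id."""
    for name, docs in backup.items():
        for doc_id, data in docs.items():
            db.collection(name).document(doc_id).set(data)
```

From there, `json.dumps(dump_collections(db, [...]))` gives you a single file you can store anywhere.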

-
That's great. Thanks for commenting - Bonus points If someone could share their code (Node.js for extra bonus :) ) – Gal Bracha Oct 15 '17 at 14:15
-
I have run into rate limits trying to restore large data sets. Is there a timeline for the official backup and restore tool that ideally doesn't require an API request for each document write operation? – ForrestLyman Jun 30 '18 at 03:15
-
@ForrestLyman Have you tried [batched writes](https://firebase.google.com/docs/firestore/manage-data/transactions#batched-writes)? – hakatashi Jul 17 '18 at 13:37
-
Is it possible to do daily backups, like available in Firebase RT DB? the firebase guide attached here describe how to do single backup,I would like to configure daily backup. – ykorach Feb 24 '19 at 09:24
-
@ykorach see this guide: https://firebase.google.com/docs/firestore/solutions/schedule-export – Sam Stern Feb 25 '19 at 16:14
-
Any news on supporting this natively, like the way Firebase Realtime Database does? – Jus10 Mar 10 '19 at 20:25
-
The backup counts as reads. I'm seeing a 500-600% read count increase when scheduling it. There must be a better way of doing this... it's expensive. – Oliver Dixon Nov 17 '19 at 14:42
https://www.npmjs.com/package/firestore-backup
Is a tool that has been created to do just this.
(I did not create it, just adding it here as people will find this question)

-
That tool doesn't seem to have any functionality to restore the Firestore DB though – martin36 Jul 30 '20 at 17:51
Local backups
This is the one I use for one-off, local backups, and what I generally recommend (most straightforward if you want a single JSON file).
Drawbacks:
- Hasn't been updated in a long time.
Additional options: (not recommended)
Drawbacks:
- Backup only; cannot restore from the backups it creates.
- Hasn't been updated in a long time.
Drawbacks:
- Backup only; cannot restore from the backups it creates.
Cloud backups
- The official gcloud backup commands.
Drawbacks:
- The backup files are difficult/infeasible to parse. (update: how to convert to a json file)
- You have to set up the gcloud cli. (update: or use the cloud shell to run the commands)
- It doesn't backup locally; instead, it backs up to the cloud, which you can then download. (could also be considered an advantage, depending on what you want)
Note that for the gcloud backup commands, you have multiple options on how to schedule them to run automatically. A few options are shown here.

I am using the following work-around in order to have daily firestore backups:
I installed this globally: https://www.npmjs.com/package/firestore-backup-restore
I have a cron job that looks like this:
0 12 * * * cd ~/my/backup/script/folder && ./backup-script.sh
And my backup-script.sh looks like this:
#!/bin/sh
. ~/.bash_profile
# Append rather than overwrite, so other commands stay reachable
export PATH="$PATH:/usr/local/bin"
dt=$(/bin/date '+%d-%m-%Y %H:%M:%S');
echo "starting backup for $dt"
firestore-backup-restore -a ~/path/to/account/credentials/file.json -B ./backups/"$dt"

I've written a tool that traverses the collections/documents of the database and exports everything into a single JSON file. It will also import that same structure (helpful for cloning/moving Firestore databases). It's published as an NPM package; feel free to try it and give some feedback.

I had the same issue and created a small npm package which allows you to create a scheduled backup with Cloud Functions. It uses the new import/export feature of Firestore.
const firestoreBackup = require('simple-firestore-backup')
exports.firestore_backup = functions.pubsub.schedule('every 24 hours').onRun(firestoreBackup.createBackupHandler())
Check out the full readme on how to set it up; it's super simple!

A solution using Python 2.
Fork it on https://github.com/RobinManoli/python-firebase-admin-firestore-backup
First install and setup Firebase Admin Python SDK: https://firebase.google.com/docs/admin/setup
Then install it in your python environment:
pip install firebase-admin
Install the Firestore module:
pip install google-cloud-core
pip install google-cloud-firestore
(from ImportError: Failed to import the Cloud Firestore library for Python)
Python Code
# -*- coding: UTF-8 -*-
import firebase_admin
from firebase_admin import credentials, firestore
import json

cred = credentials.Certificate('xxxxx-adminsdk-xxxxx-xxxxxxx.json') # from firebase project settings
default_app = firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://xxxxx.firebaseio.com'
})

db = firebase_admin.firestore.client()

# add your collections manually
collection_names = ['myFirstCollection', 'mySecondCollection']

collections = dict()
dict4json = dict()
n_documents = 0

for collection in collection_names:
    collections[collection] = db.collection(collection).get()
    dict4json[collection] = {}
    for document in collections[collection]:
        docdict = document.to_dict()
        dict4json[collection][document.id] = docdict
        n_documents += 1

jsonfromdict = json.dumps(dict4json)

path_filename = "/mypath/databases/firestore.json"
print "Downloaded %d collections, %d documents and now writing %d json characters to %s" % ( len(collection_names), n_documents, len(jsonfromdict), path_filename )

with open(path_filename, 'w') as the_file:
    the_file.write(jsonfromdict)

-
There is already a built-in native solution for it that can backup / restore from the command line. – Gal Bracha Nov 09 '18 at 13:18
-
What is that built in native solution? Some people might not want to mess around with a cloud storage bucket, especially if they want to process the data. Don't see the reason for a downvote. In my case when I came here I would have preferred a python solution. – Robin Manoli Nov 18 '18 at 19:30
Here is my Android Java code to easily back up any Firestore collection.
First use this method to read the collection data and store it in a serialized file in the device's storage:
private void readCollection() {
    ServerSide.db.collection("Collection_name")
            .get()
            .addOnCompleteListener(new OnCompleteListener<QuerySnapshot>() {
                @Override
                public void onComplete(@NonNull Task<QuerySnapshot> task) {
                    if (task.isSuccessful()) {
                        HashMap alldata = new HashMap();
                        for (QueryDocumentSnapshot document : task.getResult()) {
                            alldata.put(document.getId(), document.getData());
                        }
                        try {
                            // Serialize the collection to a file in app-private storage
                            FileOutputStream fos = openFileOutput("filename.txt", Context.MODE_PRIVATE);
                            ObjectOutputStream os = new ObjectOutputStream(fos);
                            os.writeObject(alldata);
                            os.close();
                            fos.close();
                            Toast.makeText(MainActivity.this, "Stored", Toast.LENGTH_SHORT).show();
                            // Read it back to verify the data round-trips correctly
                            FileInputStream fis = openFileInput("filename.txt");
                            ObjectInputStream is = new ObjectInputStream(fis);
                            HashMap ad = (HashMap) is.readObject();
                            is.close();
                            fis.close();
                            Log.w("readCollection", ad + " ");
                        } catch (Exception e) {
                            Log.w("readCollection", e + "");
                        }
                    } else {
                        Log.d("readCollection", "Error getting documents: ", task.getException());
                    }
                }
            });
}
After that, check Logcat to verify that the data was serialized correctly. Here is the restore code:
private void writeData() {
    try {
        // Deserialize the previously saved backup file
        FileInputStream fis = openFileInput("filename.txt");
        ObjectInputStream is = new ObjectInputStream(fis);
        HashMap ad = (HashMap) is.readObject();
        is.close();
        fis.close();
        // Write every document back under its original id
        for (Object s : ad.keySet()) {
            ServerSide.db.collection("Collection_name").document(s.toString())
                    .set(ad.get(s));
        }
        Log.w("writeData", ad + " ");
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Hope this helps.

The question is old and the projects listed are nice, but I have some concerns about backups:
1. For free (non-Blaze) plan users, the official solution is off-limits.
2. Free users have a 50k read quota per day, so that limit could be a problem for live, large databases.
3. As far as I examined, most of the projects have no notion of incremental backup, so they download the same data every time they run.
4. Wouldn't it be better to save collections as folders and every document as a separate file, fetch only updated documents, and replace the files directly?
I will probably implement my own solution but just wondering your thoughts :)
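The incremental idea in points 3 and 4 can be sketched like this, assuming you maintain an `updatedAt` timestamp field on every document yourself (a convention, not something Firestore provides as a queryable field out of the box):

```python
def changed_since(documents, last_backup_time):
    """Filter a {doc_id: data} mapping down to documents modified after
    last_backup_time, using a caller-maintained 'updatedAt' field.

    Documents without the field are kept, to be safe: better to back up
    too much than to silently skip them.
    """
    return {
        doc_id: data
        for doc_id, data in documents.items()
        if data.get("updatedAt", float("inf")) > last_backup_time
    }
```

On the server side the same predicate can be pushed into the query itself with `.where('updatedAt', '>', last_backup_time)`, so only changed documents are read at all, which also addresses the read-quota concern in point 2.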
