
I am successfully exporting all needed collections from Firestore daily to a storage bucket using a scheduled Cloud Function. I can manually import the collection data into BigQuery using Create Table, choosing Google Cloud Storage as the data source and Cloud Datastore Backup as the file format. What I can't figure out is how to create a scheduled version of this job (I can re-run it manually from the job history). Any help on automating these "create table" jobs would be appreciated!
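For reference, a minimal sketch of what such an automated load might look like as a Pub/Sub-triggered Cloud Function using the `google-cloud-bigquery` client. All bucket, dataset, table, kind, and export-prefix names below are placeholders, not values from my project:

```python
# Sketch of a Pub/Sub-triggered Cloud Function that loads a Firestore
# export from Cloud Storage into BigQuery. Every name here (bucket,
# export prefix, kind, dataset, table) is a placeholder.

def export_uri(bucket: str, export_prefix: str, kind: str) -> str:
    """Build the gs:// path to the export_metadata file that the
    BigQuery Datastore-backup loader expects for a single kind."""
    return (
        f"gs://{bucket}/{export_prefix}/all_namespaces/"
        f"kind_{kind}/all_namespaces_kind_{kind}.export_metadata"
    )

def load_firestore_export(event, context):
    # Imported lazily inside the handler so the module imports even
    # where the client library is absent (and cold starts stay light).
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.DATASTORE_BACKUP,
        # Replace the table on each daily run; use WRITE_APPEND instead
        # if you want to accumulate snapshots.
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    uri = export_uri("my-export-bucket", "2019-11-27T00:00:00_12345", "users")
    job = client.load_table_from_uri(uri, "my_dataset.users", job_config=job_config)
    job.result()  # block until the load job finishes; raises on failure
```

The function would be deployed with a Pub/Sub trigger and fired on a schedule by Cloud Scheduler, along the lines of the scheduling tutorial robsiemb linked above.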

  • Possible duplicate of [Cloud Functions for Firebase trigger on time?](https://stackoverflow.com/questions/42790735/cloud-functions-for-firebase-trigger-on-time) – robsiemb Nov 27 '19 at 16:38
  • You can also [schedule functions without firebase](https://cloud.google.com/scheduler/docs/tut-pub-sub) – robsiemb Nov 27 '19 at 16:39
  • Do you want to trigger this integration in a function? If yes, in which language? – guillaume blaquiere Nov 28 '19 at 08:02
  • Have you tried to load the firestore data through the CLI as mentioned [here](https://cloud.google.com/bigquery/docs/loading-data-cloud-firestore#loading_cloud_firestore_export_service_data)? You could create a little cron that imports it through this in linux – rsalinas Dec 04 '19 at 12:22
  • Yes, I am loading the firestore exported data into Big Query using that example. It is working fine. I just can't seem to figure out how to schedule it. It seems like a one off operation (although I can manually re-run it from the Job History list). – Kent Spadzinski Dec 30 '19 at 17:11
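The CLI route rsalinas mentions could also be wired up with a plain cron entry; a sketch, where the bucket, dataset, export prefix, and schedule are all placeholders:

```shell
# Crontab sketch for the bq CLI approach: re-run the Datastore-backup
# load daily at 03:00. All names below are placeholders; the URI must
# point at the export_metadata file for a single exported kind.
0 3 * * * bq load --replace --source_format=DATASTORE_BACKUP \
    my_dataset.users \
    gs://my-export-bucket/2019-11-27T00:00:00_12345/all_namespaces/kind_users/all_namespaces_kind_users.export_metadata
```

Note that Firestore export prefixes are timestamped, so a real cron job would need to resolve the latest export path before running the load.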

0 Answers