
I see a lot of tutorials and samples for loading Cloud Storage files into BigQuery; however, I need to read from BigQuery once a day and save the output to a new CSV file in Cloud Storage using Dataflow. Could someone point me in the right direction?

I'm using Apache Beam with Python in a Jupyter notebook. So far, I can run a SQL query to get the data I need, but I'm not sure what the best approach is to write that out to a CSV in Cloud Storage.
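One way to approach this is a minimal Beam pipeline sketch: run the query with `ReadFromBigQuery`, serialize each row dict to a CSV line, and write the lines to Cloud Storage with `WriteToText`. All project, bucket, table, and column names below (`my-project`, `my-bucket`, `user_id`, `score`, etc.) are hypothetical placeholders, not values from the question.

```python
import csv
import io


def row_to_csv_line(row, fieldnames):
    # Serialize one BigQuery result row (a dict) as a single CSV line,
    # letting the csv module handle quoting and escaping of commas/quotes.
    buf = io.StringIO()
    csv.DictWriter(buf, fieldnames=fieldnames).writerow(row)
    return buf.getvalue().rstrip("\r\n")


def run():
    # Beam is imported inside the function so the helper above stays
    # usable in environments where apache-beam is not installed.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    fieldnames = ["user_id", "score"]  # hypothetical column names
    options = PipelineOptions(
        runner="DataflowRunner",             # use "DirectRunner" to test locally
        project="my-project",                # hypothetical project id
        region="us-central1",
        temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromBQ" >> beam.io.ReadFromBigQuery(
                query="SELECT user_id, score "
                      "FROM `my-project.my_dataset.my_table`",
                use_standard_sql=True,
            )
            | "FormatCsv" >> beam.Map(row_to_csv_line, fieldnames)
            | "WriteToGCS" >> beam.io.WriteToText(
                "gs://my-bucket/exports/daily",  # hypothetical output prefix
                file_name_suffix=".csv",
                header=",".join(fieldnames),
            )
        )
```

`WriteToText` produces sharded output by default; if you need a single file, you can pass `num_shards=1` at the cost of write parallelism. For the once-a-day part, the usual options are triggering the pipeline from Cloud Scheduler or running it as a scheduled Dataflow template.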

Thanks in advance

AnalystIRL
    I think you will find this answer helpful: https://stackoverflow.com/a/52934251/7903159 However, if you need to save the whole table you can look into [BigQuery Export](https://cloud.google.com/bigquery/docs/exporting-data#exporting_table_data) and skip Apache Beam. – itroulli Jul 24 '20 at 16:13
  • Hi, it would be useful if you shared how you are trying to execute this task and whether you are getting any error messages. I recommend you take a look here: https://stackoverflow.com/help/how-to-ask – Harif Velarde Jul 24 '20 at 20:21
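The BigQuery Export route mentioned in the comments can be sketched as an extract job that copies a whole table straight to CSV in Cloud Storage, with no Beam pipeline at all. This assumes the `google-cloud-bigquery` client library; the project, bucket, and table names are hypothetical, and the dated URI helper is an illustration of keeping each daily run in its own folder.

```python
from datetime import date


def daily_destination_uri(bucket, prefix, day):
    # Build a dated GCS wildcard URI for one daily export, e.g.
    # gs://my-bucket/exports/2020-07-24/part-*.csv
    # (the * lets BigQuery split large tables across multiple files).
    return f"gs://{bucket}/{prefix}/{day.isoformat()}/part-*.csv"


def export_table(table_id="my-project.my_dataset.my_table"):  # hypothetical table
    # Imported lazily so the URI helper above runs without the library installed.
    from google.cloud import bigquery

    client = bigquery.Client()
    job = client.extract_table(
        table_id,
        daily_destination_uri("my-bucket", "exports", date.today()),
        location="US",  # must match the dataset's location
    )
    job.result()  # block until the extract job finishes
```

Note this exports an entire table, not the result of an arbitrary query; if you only need a query's output, you could first write the query result to a destination table and then extract that.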

0 Answers