I am running PySpark in local mode and need to connect to BigQuery. I have found this tutorial: https://cloud.google.com/dataproc/docs/tutorials/bigquery-connector-spark-example, but it focuses on Dataproc, while my Spark is set up on a local machine.
Could someone please explain, at a high level and in points, exactly what I need to set up in order to make the connection and query the data into DataFrames?
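For reference, here is a minimal sketch of what I have pieced together so far from searching around. I am not sure any of it is right: the connector coordinate (`com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:0.36.1`) and its version are my guesses, and the table is just a BigQuery public dataset I picked for testing.

```python
from pyspark.sql import SparkSession

# Assumption: the spark-bigquery connector can be pulled in via
# spark.jars.packages even in local mode. The artifact version below is a
# guess -- it has to match the Spark/Scala build being used.
spark = (
    SparkSession.builder
    .appName("local-bigquery-test")
    .master("local[*]")
    .config(
        "spark.jars.packages",
        "com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:0.36.1",
    )
    .getOrCreate()
)

# Assumption: authentication happens through Application Default
# Credentials, e.g. by pointing GOOGLE_APPLICATION_CREDENTIALS at a
# service-account key file before starting Spark.
df = (
    spark.read.format("bigquery")
    .option("table", "bigquery-public-data.samples.shakespeare")
    .load()
)

df.printSchema()
df.show(5)
```

Is this roughly the right approach for local mode, or are there extra pieces (credentials, GCS staging bucket, connector jar handling) that Dataproc normally takes care of and that I need to configure manually?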
Thank you