I want to create a BigQuery table from some files that live in our organization's secured SharePoint space. New files are added on a weekly basis, and I need to set up a pipeline to ingest the data into BigQuery. So far I have been following a manual process of downloading the files and uploading them to a GCS bucket, but that no longer seems feasible. Any help will be appreciated.
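One way to automate the manual flow described in the question (SharePoint download → GCS upload → BigQuery load) is a small script run on a weekly schedule, e.g. from Cloud Scheduler / Cloud Functions or cron. The sketch below is only an outline under stated assumptions: `download_from_sharepoint` is a hypothetical placeholder for whatever SharePoint access your organization allows (Microsoft Graph API, the Office365-REST-Python-Client package, etc.), the files are assumed to be CSV, and the bucket, dataset, and table names are made up.

```python
# Minimal weekly-ingest sketch: SharePoint -> GCS -> BigQuery.
# Bucket, table, and file names below are placeholders, not values from the thread.
from google.cloud import bigquery, storage

BUCKET = "my-ingest-bucket"                      # placeholder bucket name
TABLE_ID = "my-project.my_dataset.weekly_data"   # placeholder table id


def download_from_sharepoint(file_name: str) -> bytes:
    """Hypothetical helper: fetch a file from the secured SharePoint space.

    Implement with whatever access method your org permits, e.g. the
    Microsoft Graph API or Office365-REST-Python-Client.
    """
    raise NotImplementedError


def ingest(file_name: str) -> None:
    # 1) Stage the file in GCS so BigQuery can load it by URI.
    data = download_from_sharepoint(file_name)
    storage.Client().bucket(BUCKET).blob(file_name).upload_from_string(data)

    # 2) Append the staged CSV into the BigQuery table.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    uri = f"gs://{BUCKET}/{file_name}"
    job = bigquery.Client().load_table_from_uri(uri, TABLE_ID, job_config=job_config)
    job.result()  # wait for the load job to finish


if __name__ == "__main__":
    ingest("weekly_export.csv")  # placeholder file name
```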
- I believe one possible solution could be using [`BQ Client`](https://googleapis.dev/python/bigquery/latest/generated/google.cloud.bigquery.client.Client.html). Currently, I have a pipeline set up to read daily-updated Excel sheets and load them directly to BigQuery using `BQ Client` and [`pygsheets`](https://pygsheets.readthedocs.io/en/stable/). – Jiho Choi May 23 '22 at 02:59
- @Amit can you please post the solution that helped you out here? – Shashank Shekher Nov 11 '22 at 08:35
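For reference, here is a minimal sketch of the kind of pipeline mentioned in the first comment: reading a Google Sheet with `pygsheets` and loading it into BigQuery with the BQ client. The service-account file, spreadsheet key, and table name are assumptions rather than values from the thread, and `load_table_from_dataframe` additionally needs `pyarrow` installed.

```python
# Sketch of the pygsheets + BigQuery client approach described in the comment above.
# Credentials file, sheet key, and table id are placeholders.
import pygsheets
from google.cloud import bigquery

# Authorize against the Sheets API with a service-account key file.
gc = pygsheets.authorize(service_file="service-account.json")

# Open the spreadsheet by key and read its first worksheet into a DataFrame.
wks = gc.open_by_key("SPREADSHEET_KEY").sheet1
df = wks.get_as_df()

# Append the DataFrame to a BigQuery table (requires pyarrow).
client = bigquery.Client()
job = client.load_table_from_dataframe(
    df,
    "my-project.my_dataset.daily_sheet",  # placeholder table id
    job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
)
job.result()
```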