I am trying to access my own data tables stored on Google BigQuery from a Google Colab notebook (with an R runtime) by running the following code:

# install.packages("bigrquery")
library(bigrquery)

# Authenticate with the service account key file
bq_auth(path = "mykeyfile.json")

projectid <- "work-366734"
sql <- "SELECT * FROM `Output.prepared_data`"

Running

tb <- bq_project_query(projectid, sql)

results in the following access denied error:

Access Denied: BigQuery BigQuery: Permission denied while globbing file pattern. [accessDenied]
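
In case it helps to narrow things down, here is a minimal visibility check (a sketch using bigrquery's bq_dataset()/bq_table() constructors and the matching *_exists() helpers, with the same names as above):

# Can the authenticated principal see the dataset and the table at all?
ds <- bq_dataset("work-366734", "Output")
bq_dataset_exists(ds)   # FALSE or an error would point to a visibility/permission problem

tbl <- bq_table("work-366734", "Output", "prepared_data")
bq_table_exists(tbl)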

For clarification: I already created a service account (under Google Cloud IAM & Admin), gave it the roles ‘BigQuery Admin’ and ‘BigQuery Data Owner’, and downloaded the above-mentioned JSON key file ‘mykeyfile.json’ (as suggested here).

Additionally, I added the service account as a principal on the dataset (BigQuery – Sharing – Permissions – Add Principal), but the same error still shows up.

Of course, I have already reset/deleted and reinitialized the runtime.

Am I missing additional permissions somewhere else? Thanks!


Not sure if it is relevant, but I will add it just in case: I also tried the authentication process via

bq_auth(use_oob = TRUE, cache = FALSE)

which opens an additional window where I have to allow access (using my Google account, which is also the data owner) and enter an authorization code. While this step works, bq_project_query(projectid, sql) still gives the same Access Denied error.
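
To rule out authenticating as a different identity than I expect, a quick check could look like this (a sketch; bq_user() and bq_projects() are bigrquery helpers that report the authenticated e-mail address and the projects visible to it):

bq_user()      # should print the e-mail address of the authenticated account
bq_projects()  # should include "work-366734" if the account can see the project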

Authorizing access to Google BigQuery from Python with the following commands works flawlessly (using the same account/credentials):

from google.colab import auth
from google.cloud import bigquery  # needed for bigquery.Client

auth.authenticate_user()
project_id = "work-366734"
client = bigquery.Client(project=project_id)
df = client.query('''
    SELECT *
    FROM `work-366734.Output.prepared_data`
''').to_dataframe()
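
For comparison, a direct R translation of that Python query would look like the following (a sketch; note that it fully qualifies the table as `work-366734.Output.prepared_data`, whereas my R query above referenced only `Output.prepared_data`):

sql <- "SELECT * FROM `work-366734.Output.prepared_data`"
tb <- bq_project_query(projectid, sql)   # run the query in the billing project
df <- bq_table_download(tb)              # fetch the results as a data frame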