19

I am new to Google Cloud and BigQuery

I have a Google Cloud account with two projects: a Compute Engine project and a BigQuery project. I can successfully get an access token and a refresh token from the Compute Engine project. However, when I attempt to access the BigQuery project, I get the error below:

xxxx@gmail.com does not have bigquery.jobs.create permission in project yyyy

I've created datasets in BigQuery and copied public data into each. I also confirmed that xxxx@gmail.com is the owner of each dataset, and I am able to query the datasets successfully.

In IAM, the xxxx@gmail.com has the following:
roles/bigquery.user
roles/bigquery.admin
roles/bigquery.owner
roles/bigquery.viewer

I've reviewed the following links but the suggestions have not helped. Any ideas on how to resolve the error?

https://cloud.google.com/bigquery/docs/access-control

Google Big-query api 403-Forbidden Exception

Google API OAuth2, Service Account, "error" : "invalid_grant"

The user does not have bigquery.jobs.create permission in project gdelt-bq

Running queries in BigQuery without being a project User

https://github.com/getredash/redash/issues/1684


Error :service account does not have bigquery.jobs.create permission

Although my error is not on a service account, I also reviewed the last link above.

The service accounts for both the Compute Engine project and the BigQuery project have the following roles:
roles/bigquery.user
roles/bigquery.admin
roles/bigquery.dataViewer
roles/bigquery.dataEditor

Thanks!

m federico
  • Welcome to StackOverflow! Rather than make this correction in the comments, there is an "edit" button just below your post which you can use. – ricky3350 Jul 16 '18 at 17:17
  • 1
    @ricky3350 Thank you. I've removed the correction and updated the original post with that information. I've also deleted the correction. – m federico Jul 16 '18 at 17:51
  • Please add permissions on the resource level first, then if this is not sufficient, on the project level. – Mr.TK Jul 26 '22 at 10:42

7 Answers

17

The bigquery.user and bigquery.admin roles both contain the bigquery.jobs.create permission, so either should be sufficient.

Please make sure that user xxxx@gmail.com has those permissions on the project in which the job is being run (yyyy), which can differ from the project containing the data itself.

In particular, you can check the IAM policy on project yyyy by going here:

https://console.cloud.google.com/iam-admin/iam?project=yyyy

Once there, make sure that xxxx@gmail.com has one of the two roles in that project.
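To make the distinction concrete, here is a minimal sketch (the project, dataset, and table names are hypothetical) of running a query in one project against data stored in another:

```python
# Hypothetical names: the job runs (and is billed) in JOB_PROJECT, while the
# table lives in DATA_PROJECT. The user needs bigquery.jobs.create in
# JOB_PROJECT, granted there via the BigQuery User or Admin role.
JOB_PROJECT = "yyyy"
DATA_PROJECT = "my-data-project"

# Fully qualify the table so the query works regardless of the client's project.
sql = f"SELECT COUNT(*) AS n FROM `{DATA_PROJECT}.my_dataset.my_table`"

# With google-cloud-bigquery installed and credentials configured:
#   from google.cloud import bigquery
#   client = bigquery.Client(project=JOB_PROJECT)  # the job is created here
#   rows = client.query(sql).result()
print(sql)
```

The point is that the client's project (not the project named in the table reference) determines where the query job is created, and therefore where the jobs.create permission is checked.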

Jeremy Condit
  • Thank you. The project the job is being run in is a Compute Engine project. I've added the following in IAM to user xxxx@gmail.com: Compute Admin, Compute Instance Admin (v1), Compute Network Admin; Owner was present previously. The error is still the same. – m federico Jul 17 '18 at 20:09
  • That's not needed. You can get the job's project ID from the error message (what you've written as "yyyy"). Substitute the real project ID in the link above to see the IAM settings for that project, and make sure user xxxx@gmail.com has the BQ user or admin role. – Jeremy Condit Jul 17 '18 at 23:49
  • Thank you. I will remove those if they are not needed. The yyyy project is the BigQuery project; we can call it Bigquery2. The xxxx@gmail.com account has the following roles assigned in the Bigquery2 IAM settings: BigQuery Admin, BigQuery Data Editor, BigQuery Data Viewer, BigQuery User, Owner. – m federico Jul 18 '18 at 01:46
8

You need the roles/bigquery.jobUser role at the project level.

The BigQuery Job User role "provides permissions to run jobs, including queries, within the project".

See the BigQuery access-control documentation for more on the predefined roles.
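As a quick illustration of the check being discussed (the role list reflects the access-control docs at the time; verify it in the console), you can think of it as: does any of the user's project-level roles carry bigquery.jobs.create?

```python
# Predefined BigQuery roles that include the bigquery.jobs.create permission
# (jobUser, user, and admin; the dataViewer/dataEditor/dataOwner roles do not).
ROLES_WITH_JOBS_CREATE = {
    "roles/bigquery.jobUser",
    "roles/bigquery.user",
    "roles/bigquery.admin",
}

def can_create_jobs(granted_roles):
    """Client-side sanity check: does any granted role carry jobs.create?"""
    return bool(ROLES_WITH_JOBS_CREATE.intersection(granted_roles))

print(can_create_jobs({"roles/bigquery.dataViewer"}))  # False
print(can_create_jobs({"roles/bigquery.jobUser"}))     # True
```

Data roles alone let you read or edit tables, which is why the asker could query datasets they owned but still could not create a query job in the other project.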

Sal
2

If you create a service account in project A and add it to project B as an IAM member (e.g. with the role bigquery.admin), then when you execute an insert query job on project B you have to pass the project parameter, either on the query call (client.query(..., project='B')) or when you create the client, like this: client = bigquery.Client.from_service_account_json(SERVICE_ACCOUNT_JSON_PATH, project='B')

EX1)

client = bigquery.Client.from_service_account_json(SERVICE_ACCOUNT_JSON_PATH)

job_config = bigquery.QueryJobConfig(dry_run=False, use_query_cache=True)

# Start the query, passing in the extra configuration.

query_job = client.query(
    (
        "INSERT INTO `{your_project}.{your_dataset}.{your_table}` (name, gender, count) "
        "values('test', 'F', 190), ('test', 'F', 190), ('test', 'F', 190) "
    ),
    job_config=job_config,
    project='B'
)

EX2)

client = bigquery.Client.from_service_account_json(SERVICE_ACCOUNT_JSON_PATH,project='B')

job_config = bigquery.QueryJobConfig(dry_run=False, use_query_cache=True)

# Start the query, passing in the extra configuration.

query_job = client.query(
    (
        "INSERT INTO `{your_project}.{your_dataset}.{your_table}` (name, gender, count) "
        "values('test', 'F', 190), ('test', 'F', 190), ('test', 'F', 190) "
    ),
    job_config=job_config,
)
Doosik Bae
0

There are up to three separate service accounts involved: the default Dataflow service account, the controller service account, and the user account or service account with which you trigger the job.

In the above case, it is important to grant the required permissions to the user account or service account triggering the job, and also to the controller service account. If you don't pass a controller service account to the Dataflow job, the default Compute Engine service account and its permissions are used.

Roshan Fernando
0

Just got caught by this today... if your code doesn't compile, Cloud Functions v1 (not sure about v2) will fall back to the last "good" version along with its last good configuration. So if the function was running under the App Engine default service account when you first deployed it, it will fall back to that account, and that service account doesn't have permissions on BigQuery.

Rob Hawthorne
0

First, don't try giving permissions to the Gmail/Workspace account you're signed in to Firebase or BigQuery with. Permissions should be given to the service account, since that's what will be connecting to the BigQuery server.

If you're using a service account from Firebase, follow the steps below.

  1. Search for Service Accounts in your Google Cloud console.
  2. Select the service account e-mail from the list shown. The service account should be the same as what you see under Firebase > Service Account > Firebase service account.
  3. Edit the roles for the service account and grant it the BigQuery Job User role.

That should solve your issue.

Adebola
-3

Make sure that you set the environment variable GOOGLE_APPLICATION_CREDENTIALS to point to the key file that you downloaded (on Windows 10, use the "Advanced system settings" tab to add it), and use the same key file path in your program.

    spark.sqlContext.setGcpJsonKeyFile(jsonKeyFile)
    // Set up BigQuery project and bucket
    spark.sqlContext.setBigQueryProjectId("inductive-cocoa-250507")
    // Set up BigQuery dataset location, default is US
    spark.sqlContext.setBigQueryDatasetLocation("US")
    spark.sqlContext.setBigQueryGcsBucket("bucket-for-practice")
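A small sanity check along these lines (a sketch; the key path is whatever you set the variable to) is to confirm, before creating any client, that the variable is set and points at a readable key file:

```python
import json
import os

def adc_client_email():
    """Return the service account e-mail from the key file pointed to by
    GOOGLE_APPLICATION_CREDENTIALS, or None if the variable is not set."""
    key_path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not key_path:
        return None
    with open(key_path) as f:
        return json.load(f).get("client_email")

# Usage: print(adc_client_email()) to confirm which identity will be used,
# and therefore which account needs the BigQuery permissions.
```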
Tutu Kumari