
I am considering using Google BigQuery as a back-end for Django but cannot be certain if this is possible, and if it is, what settings would apply.

Currently, my Django application uses PostgreSQL, and the code in settings.py is as follows:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mydatabase',
        'USER': 'mydatabaseuser',
        'PASSWORD': 'mypassword',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}

Ideally, I'd like to set up a database connection to Google BigQuery through settings.py and then use views and models as usual.

Amit

4 Answers


It's not possible, or at least not supported. You could use the API directly, but obviously you won't get any of the advantages of the ORM.
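
For illustration, a minimal sketch of querying the BigQuery API directly from a Django view, assuming the google-cloud-bigquery package is installed and credentials are configured; the project, dataset, and table names are placeholders:

from django.http import JsonResponse
from google.cloud import bigquery

def sales_summary(request):
    # The client picks up credentials from GOOGLE_APPLICATION_CREDENTIALS.
    client = bigquery.Client()
    query = """
        SELECT region, SUM(amount) AS total
        FROM `my-project.my_dataset.sales`
        GROUP BY region
    """
    # result() blocks until the query job finishes and returns the rows.
    rows = client.query(query).result()
    return JsonResponse({row.region: row.total for row in rows})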

Tom Carrick
  • Django's ORM supports PostgreSQL, MySQL, SQLite and Oracle. There is third party support via `pyodbc` for SQL Server. – FlipperPA Sep 26 '17 at 21:18
  • Thanks Tom for the info. The purpose is to display aggregated summary data in form of charts for analytical purposes. – Amit Sep 29 '17 at 12:22
0

It is possible by using SQLAlchemy with Django.

SQLAlchemy can connect to BigQuery with the pybigquery driver.

See the following guide on configuring Django to use SQLAlchemy.
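
As a rough sketch (not taken from that guide), a SQLAlchemy engine backed by BigQuery through the pybigquery dialect might look like this; the project and dataset names are placeholders:

from sqlalchemy import create_engine, text

# pybigquery registers the bigquery:// dialect with SQLAlchemy.
engine = create_engine('bigquery://my-project/my_dataset')

with engine.connect() as conn:
    # Plain SQL runs against BigQuery through the dialect.
    for row in conn.execute(text('SELECT name, total FROM sales LIMIT 10')):
        print(row)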

Alonme

Everything is possible, for sure. Creating an interface would not be such a big job. But I would add one note:

BigQuery is not intended to be a backend database; it is more of a data warehouse, as defined within the business intelligence discipline. This means Google will make it very hard for you to perform fast multi-user operations. As far as I can recall, update statements, for example, have some thresholds.

On the other hand, if this is purely for data input or visualisation of data, then why not. But then again, I think Azure Power Apps is the kind of product for that.

Donatas Svilpa

You still need a database engine like PostgreSQL, MySQL, or similar, because Django's structure works that way.

But of course, you can invoke Google Cloud from libraries in Django and use it like this:

from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession
from google.cloud import datastore
from google.cloud import bigquery

In my case, I connected with:

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = 'OAuth2Credential.json'

To generate your .json key, follow the documentation at: https://cloud.google.com/iam/docs/creating-managing-service-account-keys
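
As an aside, a minimal sketch of passing the key file to the client explicitly instead of via the environment variable; the file name follows the answer above, and the rest assumes only the google-cloud-bigquery and google-auth packages:

from google.cloud import bigquery
from google.oauth2 import service_account

# Load the service-account key generated in the Cloud Console.
credentials = service_account.Credentials.from_service_account_file(
    'OAuth2Credential.json'
)
client = bigquery.Client(
    credentials=credentials,
    project=credentials.project_id,
)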

  • Hi, I know this is an old post, I just want to ask: I'm using the same thing as you, the JSON API key, in order to use the BQ API directly. However, my question is, during deployment to GCP App Engine, do you deploy the JSON together with all your other source code? Or does GCP App Engine automatically detect it for you according to the service account you use to deploy? – Owenn Jan 17 '22 at 04:36
  • Well, that depends. If your deployment tool is different from the GCP tools, or you are not using the SDK in your development environment (for example), then yes, you need to deploy your application with the .json file. – Andres Rave May 09 '22 at 22:42