I cloned a Django+Node.js open-source project whose goal is to upload and annotate text documents and save the annotations in a Postgres database. The project ships docker-compose stack files for both a Django dev and a production setup, and both work completely fine out of the box with a Postgres container.

Now I would like to deploy this project to Google Cloud - as my first ever containerized application. As a first step, I simply want to move the persistent storage to Cloud SQL instead of the Postgres image included in the stack file. My stack file (Django dev) looks as follows:

version: "3.7"
services:

  backend:
    image: python:3.6
    volumes:
      - .:/src
      - venv:/src/venv
    command: ["/src/app/tools/dev-django.sh", "0.0.0.0:8000"]
    environment:
      ADMIN_USERNAME: "admin"
      ADMIN_PASSWORD: "${DJANGO_ADMIN_PASSWORD}"
      ADMIN_EMAIL: "admin@example.com"
      # DATABASE_URL: "postgres://doccano:doccano@postgres:5432/doccano?sslmode=disable"
      DATABASE_URL: "postgres://${CLOUDSQL_USER}:${CLOUDSQL_PASSWORD}@sql_proxy:5432/postgres?sslmode=disable"
      ALLOW_SIGNUP: "False"
      DEBUG: "True"
    ports:
      - 8000:8000
    depends_on:
      - sql_proxy
    networks:
      - network-overall

  frontend:
    image: node:13.7.0
    command: ["/src/frontend/dev-nuxt.sh"]
    volumes:
      - .:/src
      - node_modules:/src/frontend/node_modules
    ports:
      - 3000:3000
    depends_on:
      - backend
    networks:
      - network-overall

  sql_proxy:
    image: gcr.io/cloudsql-docker/gce-proxy:1.16
    command:
      - "/cloud_sql_proxy"
      - "-dir=/cloudsql"
      - "-instances=${CLOUDSQL_CONNECTION_NAME}=tcp:0.0.0.0:5432"
      - "-credential_file=/root/keys/keyfile.json"
    volumes:
      - ${GCP_KEY_PATH}:/root/keys/keyfile.json:ro
      - cloudsql:/cloudsql
    networks:
      - network-overall

volumes:
  node_modules:
  venv:
  cloudsql:

networks:
  network-overall:
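A side note on the changed DATABASE_URL: the hostname in it is the compose service name (sql_proxy), which Docker's network resolves to the proxy container. A quick stdlib sketch of how that URL breaks down (the credentials are placeholders for the ${CLOUDSQL_USER}/${CLOUDSQL_PASSWORD} env vars):

```python
from urllib.parse import urlparse

# Placeholder credentials stand in for ${CLOUDSQL_USER}/${CLOUDSQL_PASSWORD}
url = "postgres://user:secret@sql_proxy:5432/postgres?sslmode=disable"
parts = urlparse(url)

print(parts.hostname)          # sql_proxy -> the compose service name
print(parts.port)              # 5432
print(parts.path.lstrip("/"))  # postgres -> database name
print(parts.query)             # sslmode=disable
```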

I have a bunch of models, e.g. project, in the Django backend, which I can view, modify, add and delete through the Django admin interface, but when I try to access them through the Node.js views I get a 403 Forbidden error. This is the case for all my Django models.
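To illustrate how I read the error: a 403 is an HTTP response, so the requests are reaching the backend and being refused at the application level (authentication/permissions), rather than failing on the database connection. A minimal stdlib sketch of that distinction, with a stand-in server and an illustrative endpoint path:

```python
import threading
import urllib.request
import urllib.error
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in server that always answers 403, mimicking a backend that
# refuses the request via permissions rather than failing on the DB.
class Forbidden(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(403)
        self.end_headers()
    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), Forbidden)
threading.Thread(target=server.serve_forever, daemon=True).start()

try:
    urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/v1/projects")
except urllib.error.HTTPError as e:
    print(e.code)  # 403: the request reached the app; auth/permissions refused it
server.shutdown()
```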

For reference, in the above stack file I have marked the only difference from the originally cloned docker-compose stack file: the DATABASE_URL used to point to a local Postgres container, defined as follows:

  postgres:
    image: postgres:12.0-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      POSTGRES_USER: "doccano"
      POSTGRES_PASSWORD: "${POSTGRES_PASSWORD}"
      POSTGRES_DB: "doccano"
    networks:
      - network-backend

To check whether my GCP keys are correct, I deployed the Cloud SQL Proxy container on its own and interacted with the database (adding, removing and updating rows in the included tables), which worked. In any case, the fact that the Django admin interface works in the deployed docker-compose stack should already prove that the Cloud SQL proxy is fine.
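Roughly how I ran the proxy standalone for that check (same env vars as in the stack file; publishing the port lets a local psql connect to 127.0.0.1:5432):

```yaml
version: "3.7"
services:
  sql_proxy:
    image: gcr.io/cloudsql-docker/gce-proxy:1.16
    command:
      - "/cloud_sql_proxy"
      - "-instances=${CLOUDSQL_CONNECTION_NAME}=tcp:0.0.0.0:5432"
      - "-credential_file=/root/keys/keyfile.json"
    volumes:
      - ${GCP_KEY_PATH}:/root/keys/keyfile.json:ro
    ports:
      - 5432:5432
```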

I'm not an experienced Node.js developer by any means, and I have only a little experience with Django and the Django admin. My intention behind using the docker-compose setup was that I would not have to bother with the intricacies of the Node.js views and could deal only with the Python business logic.

Saahil