
I am trying to connect to a PostgreSQL database initialized within a Dockerized Django project. I am currently using the Python package psycopg2 inside a Jupyter notebook to connect to and add/manipulate data inside the db.

With the code:

connector = psycopg2.connect(
    database="postgres",
    user="postgres",
    password="postgres",
    host="postgres",
    port="5432")

It raises the following error:

OperationalError: could not translate host name "postgres" to address: Unknown host

Meanwhile, it connects correctly to the local db named postgres with host set to localhost or 127.0.0.1, but that is not the db I want to access. How can I connect to the db from Python? Should I change something in the project setup?
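For context on the two hostnames involved: the compose file publishes the container's port with `ports: - "5432:5432"`, so the database is reachable from the host machine at `localhost:5432`, while the service name `postgres` only resolves inside the Compose network. A minimal sketch of the two connection strings (the `build_dsn` helper is hypothetical; the credentials come from the compose file's `POSTGRES_*` variables):

```python
# Sketch: build libpq connection URIs for the two vantage points.
# The helper name build_dsn is hypothetical; credentials mirror the
# compose file's POSTGRES_* environment variables.
def build_dsn(host: str, port: int = 5432, db: str = "postgres",
              user: str = "postgres", password: str = "postgres") -> str:
    return f"postgresql://{user}:{password}@{host}:{port}/{db}"

# Inside the Compose network (e.g. from the `web` service),
# the service name resolves via Docker's embedded DNS:
in_network = build_dsn("postgres")

# From the host machine (a Jupyter notebook outside Docker),
# use the published port on localhost instead:
from_host = build_dsn("localhost")

# Then, for example:
#   import psycopg2
#   conn = psycopg2.connect(from_host)
print(in_network)
print(from_host)
```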

You can find the Github repository here. Many thanks!

docker-compose.yml:

version: '3.8'

services:
  web:
    restart: always
    build: ./web
    expose:
      - "8000"
    links:
      - postgres:postgres
      - redis:redis
    volumes:
      - web-django:/usr/src/app
      - web-static:/usr/src/app/static
    env_file: .env
    environment:
      DEBUG: 'true'
    command: /usr/local/bin/gunicorn docker_django.wsgi:application -w 2 -b :8000

  nginx:
    restart: always
    build: ./nginx/
    ports:
      - "80:80"
    volumes:
      - web-static:/www/static
    links:
      - web:web

  postgres:
    restart: always
    image: postgres:latest
    hostname: postgres
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data/
    environment:
      POSTGRES_DB: postgres
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres

  pgadmin:
    image: dpage/pgadmin4
    depends_on:
      - postgres
    ports:
      - "5050:80"
    environment:
      PGADMIN_DEFAULT_EMAIL: pgadmin4@pgadmin.org
      PGADMIN_DEFAULT_PASSWORD: admin
    restart: unless-stopped

  redis:
    restart: always
    image: redis:latest
    ports:
      - "6379:6379"
    volumes:
      - redisdata:/data

volumes:
  web-django:
  web-static:
  pgdata:
  redisdata:

Dockerfile:

FROM python:3.7-slim

RUN python -m pip install --upgrade pip

COPY requirements.txt requirements.txt
RUN python -m pip install -r requirements.txt

COPY . .

Edit

To verify that localhost is not the correct hostname, I tried to visualize the tables inside pgAdmin (which connects to the correct host) and with psycopg2:

The (correct) tables in pgAdmin: [screenshot]

The (incorrect) tables from psycopg2: [screenshot]

BloomShell
  • `host` is either an IP address or a DNS name that can be resolved to an IP address. So if you haven't configured "postgres" in your own personal DNS, it's not going to find it. – Shmack Jun 16 '22 at 18:49
  • @Shmack I also tried to `docker inspect` and tried to connect both with `Gateway` and `IPAddress`. Nothing seems to work. – BloomShell Jun 16 '22 at 18:55
  • I'm assuming the Django project and the database are in the same Docker container? – Shmack Jun 16 '22 at 18:58
  • Yes, different services in the same container. – BloomShell Jun 16 '22 at 19:00
  • So you're trying to connect to the database inside a docker container from a jupyter notebook, presumably on a different machine? – Shmack Jun 16 '22 at 19:03
  • Currently, the Jupyter notebook is on the same machine (in the same directory as the docker_app project). Then, when I deploy the website, I will (maybe?) need to expose and publish the PostgreSQL port at the same time in order to connect from outside. – BloomShell Jun 16 '22 at 19:06
  • I think Docker creates a private IP address for the container. So what you'll want to do is find that container's IP address and use it as the host in your notebook. – Shmack Jun 16 '22 at 19:08
  • If you are using the Jupyter notebook on your host machine, and it's not part of the network that your docker-compose is creating, and you have `ports: - "5432:5432"` declared on your postgres service, then the postgres instance you are connecting to via localhost:5432 in Jupyter IS the postgres service you have defined in your compose file. The `ports` configuration exposes the service's ports on your local network. See this [helpful graphic](https://stackoverflow.com/a/54883938/878272) for the difference between `ports` and `expose`. – THX1138 Jun 16 '22 at 19:32
  • Then you are suggesting that with `host=localhost` psycopg2 will access the db inside the postgres image within the container? – BloomShell Jun 16 '22 at 19:58
  • Yes, because you are connecting to it on your host network. The hostname `postgres` only applies inside the network that `docker-compose up` [generates by default](https://docs.docker.com/compose/networking/). As a side note, you can get rid of all the `links` properties because all your services are already accessible to each other on that default network. – THX1138 Jun 16 '22 at 21:35
  • @THX1138 I added an edit to give more info about the case – BloomShell Jun 17 '22 at 08:00
  • Also, as suggested [here](https://github.com/tiangolo/uwsgi-nginx-flask-docker/issues/46) by tiangolo: "you shouldn't connect to the Postgres DB with localhost but with the name of the Postgres container". But when I do it, it raises the error `OperationalError: could not translate host name "postgres" to address: Unknown host` – BloomShell Jun 17 '22 at 12:49
  • @THX1138 You are indeed correct. After a while, I decided to delete my local PostgreSQL 14 installation from my machine. Then I tried again to connect with `psycopg2.connect('postgres://postgres:postgres@localhost:5432/postgres')` and now it shows me the tables of the PostgreSQL within Docker correctly. I suppose that both databases were on localhost:5432 (with the same conf) and were in conflict. But, supposing that one day I will deploy this Docker app, can you give me some insights on how to modify the `docker-compose.yml` in order to connect from outside with psycopg2? – BloomShell Jun 17 '22 at 13:34
  • Deploying it depends on where you are deploying it (AWS, Google Cloud, etc.). If you just mean connecting from the local machine, you should be able to use localhost:5432 already. But the best way is to just add your Jupyter notebook server as a service in your docker-compose. That way you don't even need to connect to postgres on localhost, because the notebook is part of the network and thus you can connect like `psycopg2.connect('postgres://postgres:postgres@postgres:5432/postgres')` – THX1138 Jun 17 '22 at 19:59
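Following the last comment's suggestion, running the notebook as a Compose service puts it on the same default network as `postgres`, so the service name resolves directly. A minimal sketch of such a service (the `jupyter/base-notebook` image and the `./notebooks` mount are assumptions, not part of the original project):

```yaml
  jupyter:
    image: jupyter/base-notebook   # assumed image; any Jupyter image works
    depends_on:
      - postgres
    ports:
      - "8888:8888"
    volumes:
      - ./notebooks:/home/jovyan/work   # assumed host directory for notebooks
```

Inside this service, `psycopg2.connect(host="postgres", ...)` works as-is, because the notebook container shares the Compose default network with the `postgres` service.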

0 Answers