
I am using Docker to run my Django app, with several services: PostgreSQL, celery_worker, celery_beat, and django_app. All of them except celery_beat work fine. I build and start the containers with this command:

docker-compose -f docker-compose.yml up -d --build

When I open Docker Desktop I see that all containers restart and everything works fine for a few seconds, but then celery-beat exits with the following error in its logs:

[2022-10-13 15:34:59,440: WARNING/MainProcess] could not connect to server: Connection refused
    Is the server running on host "127.0.0.1" and accepting
    TCP/IP connections on port 5432?

I tried many solutions that I found on Stack Overflow and other pages, but with no success. 4 out of 5 containers are working fine. When I click on the PostgreSQL container I see the following status:

2022-10-13 13:27:04.195 UTC [1] LOG:  database system is ready to accept connections

docker-compose.yml

version: '3.8'

services:
    db:
        image: postgres:14.0-alpine
        volumes:
            - postgres_data:/var/lib/postgresql/data/
        environment:
          - POSTGRES_USER=postgres
          - POSTGRES_PASSWORD=PF23_admin
          - POSTGRES_DB=postgres
        container_name: docerized_app_db_postgresql_dev
  
    app:
      build:
        context: ./backend
        dockerfile: Dockerfile
      restart: always
      command: python manage.py runserver 0.0.0.0:8000
      volumes:
        - ./backend/:/usr/src/backend/
      ports:
        - 8000:8000
      env_file:
        - ./.env
      depends_on:
        - db
      container_name: docerized_app_django_app_dev
    
    redis:
      image: redis:7-alpine
      ports:
        - "6379:6379" 
      container_name: docerized_app_redis_dev
        
    celery_worker:
      restart: always
      build:
        context: ./backend
      command: celery -A docerized_app_settings worker --loglevel=info --logfile=logs/celery.log
      volumes:
        - ./backend:/usr/src/backend
      env_file:
        - ./.env
      depends_on:
        - db
        - redis
        - app
      container_name: docerized_app_celery_worker_dev
  
    celery-beat:
      build: ./backend
      command: celery -A docerized_app_settings beat -l info
      volumes:
        - ./backend:/usr/src/backend
      env_file:
        - ./.env
      depends_on:
        - db
        - redis
        - app
      container_name: docerized_app_celery_beat_dev

volumes:
    postgres_data:

Docker is new to me, so I am not sure how to fix this. When I run the app directly in the terminal (in a conda environment on Ubuntu), everything works and I have no issues with the PostgreSQL connection.

Maybe it is a silly mistake and I am not aware of it. Any help will be appreciated.

Ismaili Mohamedi
Adrian

1 Answer


There are two possible problems.

  1. The PostgreSQL connection string inside your Docker containers should use the Compose service name of your Postgres instance (`db`) as the hostname, not `127.0.0.1`. (It may also resolve as `docerized_app_db_postgresql_dev`; I've never explicitly specified a container name, so I'm not 100% sure it will resolve that way.)
  2. By default, a PostgreSQL configuration listens only on localhost and refuses external connections, so in your Postgres container you may have to set `listen_addresses` in `postgresql.conf` to `'*'` instead of `localhost`.

https://www.bigbinary.com/blog/configure-postgresql-to-allow-remote-connection
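To make point 1 concrete, here is a sketch of what the Django `DATABASES` setting could look like. The exact setting names in your `settings.py` and `.env` may differ (the `POSTGRES_HOST`/`POSTGRES_PORT` variables here are assumptions, not something shown in the question); the key detail is that `HOST` defaults to the Compose service name `db` rather than `127.0.0.1`:

```python
# settings.py (sketch) -- inside the Compose network, the Postgres server is
# reachable by its service name "db", not by 127.0.0.1/localhost.
# Credentials mirror the environment block in docker-compose.yml; the
# POSTGRES_HOST / POSTGRES_PORT variable names are illustrative.
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB", "postgres"),
        "USER": os.environ.get("POSTGRES_USER", "postgres"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD", ""),
        "HOST": os.environ.get("POSTGRES_HOST", "db"),  # service name, not localhost
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    }
}
```

Since the Celery worker and beat containers share the same `.env` and settings module, fixing the host once fixes it for all of them.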

Eric Yang
  • 2,678
  • 1
  • 12
  • 18
  • I'd recommend using the short Compose service name `db` as a host name. The `container_name:` will also work, but there's no particular reason to specify it (and the default name Compose will pick will be shorter than what's in the question). – David Maze Oct 13 '22 at 16:02
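  • Separately from the hostname issue, `depends_on` alone only orders container startup; it does not wait for Postgres to be ready to accept connections, which can also produce "Connection refused" on the first attempts. A sketch (assuming a recent docker-compose that supports `condition: service_healthy` with a healthcheck; the interval/retry values are arbitrary):

    ```yaml
    services:
        db:
            image: postgres:14.0-alpine
            healthcheck:
                test: ["CMD-SHELL", "pg_isready -U postgres"]
                interval: 5s
                timeout: 5s
                retries: 5

        celery-beat:
            depends_on:
                db:
                    condition: service_healthy
    ```

    With this, celery-beat is only started once `pg_isready` succeeds inside the db container.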