I'm trying to connect to Postgres running locally from a Python script that's running in a container. I'm setting it up like this because once I deploy, I'll be using a managed database service that provides Postgres.
Ideally I'd define a Postgres URL in the .env file, docker-compose would use the .env to set the environment variables, and the application would read the Postgres URL from its environment.

Is this the best approach, or should I run a Postgres container instead? What's the best way to achieve the flow I just described?
I tried passing POSTGRES_URL in the .env using localhost, my public IP, and my local IP, e.g. postgresql://username:password@localhost:5432/dbname
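For context, here's a minimal sketch of what the app side sees (the example URL is the one from above, and the driver is left out since the post doesn't name one). The key point is which hostname the container will try to dial; inside a container, "localhost" resolves to the container itself, not the host machine:

```python
import os
from urllib.parse import urlparse

# Pretend docker-compose injected POSTGRES_URL from the .env
# (placeholder value for illustration)
os.environ.setdefault(
    "POSTGRES_URL",
    "postgresql://username:password@localhost:5432/dbname",
)

url = urlparse(os.environ["POSTGRES_URL"])

# This hostname is what the container tries to resolve and connect to.
# From inside a container, "localhost" is the container's own loopback,
# not the machine Postgres is running on.
print(url.hostname, url.port, url.path.lstrip("/"))
# → localhost 5432 dbname
```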
version: "3.7"
services:
  app:
    build:
      context: ./
      dockerfile: Dockerfile
    volumes:
      - "./:/usr/src/app"
    environment:
      - ENV=${ENV}
      - LOGS=${LOGS}
      - AWS_REGION=${AWS_REGION}
      - AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
      - AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
      - POSTGRES_URL=${POSTGRES_URL}
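The .env file this compose file reads would look roughly like this (every value here is a placeholder I've made up for illustration):

```
ENV=dev
LOGS=true
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=placeholder
AWS_SECRET_ACCESS_KEY=placeholder
POSTGRES_URL=postgresql://username:password@localhost:5432/dbname
```

docker-compose automatically interpolates `${VAR}` references from a `.env` file sitting next to the compose file.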
Expected: connection to the local database succeeds.
Actual: the connection fails with: Is the server running on host <"ip"> and accepting TCP/IP connections on port 5432?