I cannot provide a better answer than the excellent one provided by @larsks but please, let me try giving you some ideas.
As @larsks also pointed out, any shell environment variable will take precedence over those defined in your docker-compose .env file.
This fact is stated as well in the docker-compose documentation when talking about environment variables, emphasis mine:
You can set default values for environment variables using a .env file, which Compose automatically looks for in project directory (parent folder of your Compose file). Values set in the shell environment override those set in the .env file.
This means that, for example, providing a shell variable like this:
DB_USER=tommyboy docker-compose up
will definitely override any variable you could have defined in your .env file.
One possible solution to the problem is to use the .env file directly, instead of the environment variables.
Searching for information about your problem I came across this great article.
Among other things, in addition to explaining your problem too, it mentions as a note at the end of the post an alternative approach based on the use of the django-environ package.
I was unaware of the library, but it seems it provides an alternative way of configuring your application, reading the configuration directly from a configuration file:
import environ
import os

env = environ.Env(
    # set casting, default value
    DEBUG=(bool, False)
)

# Set the project base directory
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# Take environment variables from .env file
environ.Env.read_env(os.path.join(BASE_DIR, '.env'))

# False if not in os.environ because of casting above
DEBUG = env('DEBUG')

# Raises Django's ImproperlyConfigured
# exception if SECRET_KEY not in os.environ
SECRET_KEY = env('SECRET_KEY')

# Parse database connection url strings
# like psql://user:pass@127.0.0.1:8458/db
DATABASES = {
    # read os.environ['DATABASE_URL'] and raises
    # ImproperlyConfigured exception if not found
    #
    # The db() method is an alias for db_url().
    'default': env.db(),

    # read os.environ['SQLITE_URL']
    'extra': env.db_url(
        'SQLITE_URL',
        default='sqlite:////tmp/my-tmp-sqlite.db'
    )
}
#...
If required, it seems you could mix in variables defined in the environment as well.
Probably python-dotenv would allow you to follow a similar approach.
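For example, a minimal sketch with python-dotenv could look like the following; the DB_* variable names are taken from your compose file, and the .env path is only an assumption you would need to adapt:
import os
from dotenv import dotenv_values

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# dotenv_values reads the file directly and returns a dict,
# without consulting or modifying os.environ, so shell variables
# cannot override the values defined in the file
config = dotenv_values(os.path.join(BASE_DIR, '.env'))

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': config['DB_NAME'],
        'USER': config['DB_USER'],
        'PASSWORD': config['DB_PASS'],
        'HOST': config['DB_SERVICE'],
        'PORT': config['DB_PORT']
    }
}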
Of course, it is worth mentioning that if you decide to use this approach you need to make the .env file accessible to your docker-compose web service and associated container, perhaps mounting an additional volume or copying the .env file to the web directory you already mounted as a volume.
You still need to cope with the PostgreSQL container configuration, but in a certain way it could help you achieve the objective you pointed out in your comment, because you could use the same .env file (certainly, a duplicated one).
According to your comment as well, another possible solution could be using Docker secrets.
In a similar way to how secrets work in Kubernetes, for example, as explained in the official documentation:
In terms of Docker Swarm services, a secret is a blob of data, such
as a password, SSH private key, SSL certificate, or another piece
of data that should not be transmitted over a network or stored
unencrypted in a Dockerfile or in your application’s source code.
You can use Docker secrets to centrally manage this data and
securely transmit it to only those containers that need access to
it. Secrets are encrypted during transit and at rest in a Docker
swarm. A given secret is only accessible to those services which
have been granted explicit access to it, and only while those
service tasks are running.
In a nutshell, it provides a convenient way for storing sensitive data across Docker Swarm services.
It is important to understand that Docker secrets are only available when using Docker Swarm mode.
Docker Swarm is an orchestration service offered by Docker, similar again to Kubernetes, with its differences of course.
Assuming you are running Docker in Swarm mode, you could deploy your compose services in a way similar to the following, based on the official docker-compose docker secrets example:
version: '3'

services:

  postgres:
    image: postgres:10.5
    ports:
      - 5105:5432
    environment:
      POSTGRES_DB: directory_data
      POSTGRES_USER_FILE: /run/secrets/db_user
      POSTGRES_PASSWORD: password
    secrets:
      - db_user

  web:
    restart: always
    build: ./web
    ports: # to access the container from outside
      - "8000:8000"
    environment:
      DEBUG: 'true'
      SERVICE_CREDS_JSON_FILE: '/my-app/credentials.json'
      DB_SERVICE: host.docker.internal
      DB_NAME: directory_data
      DB_USER_FILE: /run/secrets/db_user
      DB_PASS: password
      DB_PORT: 5432
    command: /usr/local/bin/gunicorn directory.wsgi:application --reload -w 2 -b :8000
    volumes:
      - ./web/:/app
    depends_on:
      - postgres
    secrets:
      - db_user

secrets:
  db_user:
    external: true
Please, note the following.
We are defining a secret named db_user in a secrets section.
This secret could be based on a file or computed from standard input, for example:
echo "tommyboy" | docker secret create db_user -
The secret should be exposed to every container in which it is required.
In the case of Postgres, as explained in the section Docker secrets in the official Postgres docker image description, you can use Docker secrets to define the value of POSTGRES_INITDB_ARGS, POSTGRES_PASSWORD, POSTGRES_USER, and POSTGRES_DB: the name of the variable for the secret is the same as the normal one with the suffix _FILE.
In our use case we defined:
POSTGRES_USER_FILE: /run/secrets/db_user
In the case of the Django container, this functionality is not supported out of the box but, since you can edit your settings.py as you need to, you can use a helper function to read the required value in your settings.py file, as suggested for example in this simple but great article, something like:
import os

def get_secret(key, default):
    value = os.getenv(key, default)
    if os.path.isfile(value):
        with open(value) as f:
            # Strip the trailing newline added when the secret was created
            return f.read().strip()
    return value

DB_USER = get_secret("DB_USER_FILE", "")

# Use the value to configure your database connection parameters
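For instance, a minimal sketch of how the helper could be wired into your database settings; DB_USER_FILE matches the compose file above, while DB_PASS_FILE is only a hypothetical name in case you decide to store the password as a secret too:
# DB_PASS_FILE is hypothetical: it assumes you also expose the password as a secret
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.getenv('DB_NAME', ''),
        'USER': get_secret('DB_USER_FILE', ''),
        'PASSWORD': get_secret('DB_PASS_FILE', os.getenv('DB_PASS', '')),
        'HOST': os.getenv('DB_SERVICE', ''),
        'PORT': os.getenv('DB_PORT', '')
    }
}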
Probably this would make more sense for storing the database password, but it could be a valid solution for the database user as well.
Please, consider reviewing this excellent article too.
Based on the fact that the problem seems to be caused by the change in your environment variables in the Django container, one last thing you could try is the following.
The only requirement for your settings.py file is to declare different global variables with your configuration. But nothing is said about how to read them: in fact, I exposed different approaches in this answer and, after all, it is Python and you can use the language to fit your needs.
In addition, it is important to understand that, unless you change any variables in your Dockerfile, when both the Postgres and Django containers are created they will receive exactly the same .env file with exactly the same configuration.
With these two things in mind, you could try creating in your settings.py file a Django container local copy of the provided environment, and reuse it across restarts or across whatever is causing the variables to change.
In your settings.py (please, forgive me for the simplicity of the code, I hope you get the idea):
import os

env_vars = ['DB_NAME', 'DB_USER', 'DB_PASS', 'DB_SERVICE', 'DB_PORT']

# Cache the variables provided in the environment the first time the container starts
if not os.path.exists('/tmp/.env'):
    with open('/tmp/.env', 'w') as f:
        for env_var in env_vars:
            f.write(env_var)
            f.write('=')
            f.write(os.environ[env_var])
            f.write('\n')

# Read the cached values back, parsing every KEY=VALUE line into a dict
with open('/tmp/.env') as f:
    cached_env_vars_dict = dict(
        line.rstrip('\n').split('=', 1) for line in f if line.strip()
    )

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': cached_env_vars_dict['DB_NAME'],
        'USER': cached_env_vars_dict['DB_USER'],
        'PASSWORD': cached_env_vars_dict['DB_PASS'],
        'HOST': cached_env_vars_dict['DB_SERVICE'],
        'PORT': cached_env_vars_dict['DB_PORT']
    }
    #...
}
I think any of the aforementioned approaches is better, but certainly it will ensure environment variable consistency across changes in the environment and container restarts.