238

I'd like to run, in a local environment, a Python script which is normally run in a Docker container. The docker-compose.yml specifies an env_file which looks (partially) like the following:

DB_ADDR=rethinkdb
DB_PORT=28015
DB_NAME=ipercron

In order to run this locally, I would like these lines to be converted to

os.environ['DB_ADDR'] = 'rethinkdb'
os.environ['DB_PORT'] = '28015'
os.environ['DB_NAME'] = 'ipercron'

I could write my own parser, but I was wondering if there are any existing modules/tools to read in environment variables from configuration files?

Islam
Kurt Peek

13 Answers

354

I use the Python Dotenv library. Just install the library with pip install python-dotenv, create a .env file with your environment variables, and import the environment variables in your code like this:

import os
from dotenv import load_dotenv

load_dotenv()

MY_ENV_VAR = os.getenv('MY_ENV_VAR')

From the .env file:

MY_ENV_VAR="This is my env var content."

This is how I do it when I need to test code outside my Docker system and prepare it to go back into Docker again.
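
If the file isn't literally named .env, or if values from the file should win over variables already set in the shell, load_dotenv also accepts an explicit path and an override flag, and dotenv_values parses the file into a dict without touching the environment. A minimal sketch (the filename config.env is just an assumed example):

import os
from dotenv import load_dotenv, dotenv_values

# Load from an explicit path; override=True lets the file's values replace existing env vars
load_dotenv(dotenv_path='config.env', override=True)

# Or parse the file into a plain dict without modifying os.environ at all
config = dotenv_values('config.env')

print(os.getenv('MY_ENV_VAR'))
print(config.get('MY_ENV_VAR'))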

Timbergus
ParisNakitaKejser
  • 4
    I also use `dotenv`. When you come from a JS backend environment, there are such nice similarities that the learning curve is almost flat! – swiss_knight Apr 10 '20 at 11:54
  • if I have `MY_ENV_VAR=""` in the `.bashrc` file this does not work and returns an empty string; is this normal behavior? – alper Nov 18 '20 at 21:59
  • 3
    @alper yes it is. It'll only override environment variables already set in `.bashrc` if you use `load_dotenv(override=True)`, as described in their [GitHub readme file](https://github.com/theskumar/python-dotenv#variable-expansion). – Daniel Lavedonio de Lima Mar 14 '21 at 02:46
  • 3
    Generally in .env files quotes aren't required for the variables. – Alex May 18 '21 at 02:27
  • @DanielLavedoniodeLima Is `load_dotenv(override=True)` preferred over `load_dotenv()`? – alper Jul 02 '21 at 23:03
  • @alper It depends on whether you want to keep the value from `.bashrc` (set override to `False` or leave it at the default) or take the one from the `.env` file (set override to `True`). I don't know if there's a standard for that or not. – Daniel Lavedonio de Lima Jul 03 '21 at 04:56
  • 3
    I would like to add that according to the docs `Lines can start with the export directive, which has no effect on their interpretation.` Very nice. This means you can do `export MY_VAR="Env var"` and stay compatible with `source .env` from bash. Maybe it's useful to add this information to the answer. I would suggest an edit but the queue is full – CervEd Nov 21 '21 at 12:43
  • One-liner: `__import__('dotenv').load_dotenv()` – ahmelq Mar 23 '22 at 13:08
  • Flask inherently uses dotenv, so if you're moving away from Flask and `.env` loading is breaking that's why. See [this](https://stackoverflow.com/questions/58409892/automatically-loading-environment-variables-in-a-flask-app) for more details. – Pro Q May 09 '22 at 23:23
  • 4
    For other noobs like me, your file actually needs to be named `.env`. It won't read from `example.env` or anything like that. – blue_chip Sep 29 '22 at 15:34
  • 1
    @blue_chip From the dotenv docs, you can use `dotenv_values` to specify path to a different `.env` file: https://saurabh-kumar.com/python-dotenv/#other-use-cases. Take care of the difference with `load_dotenv`: "*The function `dotenv_values` works more or less the same way as `load_dotenv`, except it doesn't touch the environment, it just returns a dict with the values parsed from the .env file.*" – Gino Mempin Oct 04 '22 at 22:25
44

If your system/environment/workflow supports using shell scripts, you can create a script that wraps these 2 operations:

  1. Sourcing the .env file and exporting them as environment variables
    • Using the set -a option where "Each variable or function that is created or modified is given the export attribute and marked for export to the environment of subsequent commands".
  2. Calling your Python script/app that has plain os.environ.get code

Sample .env file (config.env):

TYPE=prod
PORT=5000

Sample Python code (test.py):

import os

print(os.environ.get('TYPE'))
print(os.environ.get('PORT'))

Sample bash script (run.sh):

#!/usr/bin/env bash

set -a
source config.env
set +a

python3 test.py

Sample run:

$ tree
.
├── config.env
├── run.sh
└── test.py

$ echo $TYPE

$ echo $PORT

$ python3 test.py
None
None

$ ./run.sh 
prod
5000

When you run the Python script directly (python3 test.py) without source-ing the .env file, all the environ.get calls return None.

But, when you wrap it in a shell script that first loads the .env file into environment variables, and then runs the Python script afterward, the Python script should now be able to read the environment variables correctly. It also ensures that the exported env vars only exist as part of the execution of your Python app/script.

As compared with the other popular answer, this doesn't need any external Python libraries.

Gino Mempin
  • 3
    This is a real linux solution for this problem I think. – Ehsan88 Mar 31 '21 at 12:17
  • 1
    This was exactly what I needed. The `set -a` trick is really elegant and will come in handy in a number of other scenarios. Thanks! – András Aszódi May 26 '21 at 14:53
  • 1
    Note that `export MY_VAR="Hello!"` is compatible with `dotenv` as well as `bash source`. Nice – CervEd Nov 21 '21 at 12:45
  • ...special characters like `!` in the exported env var value are compatible with `source` as long as they're properly quoted. See https://stackoverflow.com/q/55703950/2745495 – Gino Mempin Feb 21 '22 at 10:44
22

This could also work for you:

env_vars = [] # or dict {}
with open(env_file) as f:
    for line in f:
        if line.startswith('#') or not line.strip():
            continue
        # if 'export' not in line:
        #     continue
        # Remove leading `export `, if you have those
        # then, split name / value pair
        # key, value = line.replace('export ', '', 1).strip().split('=', 1)
        key, value = line.strip().split('=', 1)
        # os.environ[key] = value  # Load to local environ
        # env_vars[key] = value # Save to a dict, initialized env_vars = {}
        env_vars.append({'name': key, 'value': value}) # Save to a list

print(env_vars)

In the comments, you'll find a few different ways to save the env vars and also a few parsing options, e.g. how to get rid of the leading export keyword. Another way would be to use the python-dotenv library. Cheers.

UPDATE: I set up my own envvar_utils.py to handle conversion from strings, etc.

"""Utility functions for dealing with env variables and reading variables from env file"""
import os
import logging
import json

BOOLEAN_TYPE = 'boolean'
INT_TYPE = 'int'
FLOAT_TYPE = 'float'
STRING_TYPE = 'str'
LIST_TYPE = 'list'
DICT_TYPE = 'dict'


def get_envvars(env_file='.env', set_environ=True, ignore_not_found_error=False, exclude_override=()):
    """
    Set env vars from a file
    :param env_file:
    :param set_environ:
    :param ignore_not_found_error: ignore not found error
    :param exclude_override: if parameter found in this list, don't overwrite environment
    :return: list of tuples, env vars
    """
    env_vars = []
    try:

        with open(env_file) as f:
            for line in f:
                line = line.replace('\n', '')

                if not line or line.startswith('#'):
                    continue

                # Remove leading `export `
                if line.lower().startswith('export '):
                    key, value = line.replace('export ', '', 1).strip().split('=', 1)
                else:
                    try:
                        key, value = line.strip().split('=', 1)
                    except ValueError:
                        logging.error(f"envar_utils.get_envvars error parsing line: '{line}'")
                        raise

                if set_environ and key not in exclude_override:
                    os.environ[key] = value

                if key in exclude_override:
                    env_vars.append({'name': key, 'value': os.getenv(key)})
                else:
                    env_vars.append({'name': key, 'value': value})
    except FileNotFoundError:
        if not ignore_not_found_error:
            raise

    return env_vars


def create_envvar_file(env_file_path, envvars):
    """
    Writes envvar file using env var dict
    :param env_file_path: str, path to file to write to
    :param envvars: dict, env vars
    :return:
    """
    with open(env_file_path, "w+") as f:
        for key, value in envvars.items():
            f.write("{}={}\n".format(key, value))
    return True


def convert_env_var_flag_to(env_var_name, required_type, default_value):
    """
    Convert env variable string flag values to required_type
    :param env_var_name: str, environment variable name
    :param required_type: str, required type to cast the env var to
    :param default_value: boolean, default value to use if the environment variable is not available
    :return: environment variable value in required type
    """
    env_var_orginal_value = os.getenv(env_var_name, default_value)
    env_var_value = ""
    try:
        if required_type == INT_TYPE:
            env_var_value = int(env_var_orginal_value)
        elif required_type == FLOAT_TYPE:
            env_var_value = float(env_var_orginal_value)
        elif required_type == BOOLEAN_TYPE:
            env_var_value = bool(int(env_var_orginal_value))
        elif required_type == STRING_TYPE:
            env_var_value = str(env_var_orginal_value)
        elif required_type == LIST_TYPE:
            env_var_value = env_var_orginal_value.split(',') if len(env_var_orginal_value) > 0 else default_value
        elif required_type == DICT_TYPE:
            try:
                env_var_value = json.loads(env_var_orginal_value) if env_var_orginal_value else default_value
            except Exception as e:
                logging.error(f"convert_env_var_flag_to: failed loading {env_var_orginal_value} error {e}")
                env_var_value = default_value
        else:
            logging.error("Unrecognized type {} for env var {}".format(required_type, env_var_name))

    except ValueError:
        env_var_value = default_value
        logging.warning("{} is {}".format(env_var_name, env_var_orginal_value))

    return env_var_value
radtek
  • 2
    Wouldn't saving them into a `dict` be a better approach than into a list? // +1 to get rid of the leading `export` keyword – alper Nov 18 '20 at 22:21
  • This is really quite a dirty way of doing it and will require a lot of cleanup to actually work... strings from the dotenv will be read 'as is', i.e. they will keep their ' '. Also, if the dotenv has blank spaces around the =, these will be part of the key – KingOtto Dec 28 '21 at 13:02
  • It has worked well for me for years now. Env vars are strings, so yes, you would have to convert them as needed, as with any env var. – radtek Jan 04 '22 at 20:54
14

You can use ConfigParser. A sample example can be found here.

But this library expects your key=value data to be present under some [heading], for example:

[mysqld]
user = mysql  # Key with values
pid-file = /var/run/mysqld/mysqld.pid
skip-external-locking
old_passwords = 1
skip-bdb      # Key without value
skip-innodb
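
For a plain KEY=VALUE file like the one in the question, a common workaround is to prepend a dummy section header in memory before parsing, so the file on disk stays header-less and shell-sourceable. A minimal sketch (the .env filename is assumed):

import configparser
import os

parser = configparser.ConfigParser(interpolation=None)  # avoid surprises with '%' in values
parser.optionxform = str  # keep keys case-sensitive; ConfigParser lowercases them by default

with open('.env') as f:
    parser.read_string('[env]\n' + f.read())  # the [env] section only exists in memory

for key, value in parser['env'].items():
    os.environ[key] = value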
Moinuddin Quadri
  • 4
    The main drawback of this solution is that the file can't be parsed by a shell with `source` if a header is added. – Dereckson Jul 29 '19 at 12:35
14

Dewald Abrie posted a good solution.

Here's a slight modification that removes line breaks (\n):

def get_env_data_as_dict(path: str) -> dict:
    with open(path, 'r') as f:
       return dict(tuple(line.replace('\n', '').split('=')) for line
                in f.readlines() if not line.startswith('#'))

print(get_env_data_as_dict('../db.env'))
Gino Mempin
Martin Nowosad
  • I think this is a great answer: clean, practical, and without external dependencies. I haven't checked it against complex needs, but it is very convenient when one has a local repo and needs to provide all the code but not the variable content (a token, for example). Just follow these general steps: 1) create a .env file, 2) place it in your virtual environment folder, 3) include the virtualenv folder in .gitignore, 4) read the variable with the provided function in any script, and it won't be public in the repo but just on your local machine. – IF.Francisco.ME May 29 '21 at 00:24
  • 1
    This doesnt work for me, I get this error: `ValueError: dictionary update sequence element #2 has length 1; 2 is required` – Chud37 Sep 19 '22 at 20:33
  • Nice code, however it won't work with comments at the end of a line (after data), right? (This is probably also the case for other answers.) – jakob.j Dec 19 '22 at 21:15
  • @jakob.j no, but you should add the comments above the code anyways. If you want comments gone you could handle this with regex – Martin Nowosad Dec 21 '22 at 21:01
13

How about this for a more compact solution:

import os

with open('.docker-compose-env', 'r') as fh:
    vars_dict = dict(
        tuple(line.replace('\n', '').split('='))
        for line in fh.readlines() if not line.startswith('#')
    )

print(vars_dict)
os.environ.update(vars_dict)
Arash Hatami
Dewald Abrie
  • 4
    Nice. Your solution adds \n to the value though, as it is the end of the line. A little adjustment: `def get_env_data_as_dict(path: str) -> dict: with open(path, 'r') as f: return dict(tuple(line.replace('\n', '').split('=')) for line in f.readlines() if not line.startswith('#'))` – Martin Nowosad Oct 06 '20 at 16:03
10

Using only the Python standard library:

import re

envre = re.compile(r'''^([^=]+?)\s*=\s*(?:[\s"']*)(.+?)(?:[\s"']*)$''')  # key, optional spaces/quotes, value
result = {}
with open('/etc/os-release') as ins:
    for line in ins:
        match = envre.match(line)
        if match is not None:
            result[match.group(1)] = match.group(2)
h0tw1r3
2

I think you should leave it for external tools to manage the environment for you.

This way you can easily use a secret manager like the 1Password CLI to load the environment variables from an encrypted vault, like so:

op run --env-file=.env -- python your_script.py

Having said that, load_dotenv is smart enough not to override variables that are already set in the environment, but some of the other solutions here aren't.
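
A minimal sketch of that default behaviour (assuming a .env file that also defines DB_ADDR):

import os
from dotenv import load_dotenv

os.environ['DB_ADDR'] = 'localhost'  # already present in the real environment

load_dotenv()                # default override=False: DB_ADDR keeps 'localhost'
load_dotenv(override=True)   # now the value from .env replaces it

print(os.environ['DB_ADDR'])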

And if you don't have any external tool at your disposal, just use bash:

set -o allexport; source .env; set +o allexport

Solution taken from: Set environment variables from file of key/value pairs

Piotr Czapla
2

python-decouple is a good option for these types of files. While it is often used for Django applications, it also works with arbitrary environment files.

from decouple import Config, RepositoryEnv

env = Config(RepositoryEnv('.env'))

env.get('DB_ADDR') #=> 'rethinkdb'
env.get('DB_PORT') #=> '28015'
env.get('DB_NAME') #=> 'ipercron'
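
If needed, decouple can also cast the raw strings, e.g. turning the port into an int (a small usage sketch):

env.get('DB_PORT', cast=int) #=> 28015 as an int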
conmak
2

I would not recommend directly reading .env files from within your program. The idea behind the config part of the 12-factor app is that you read the config from environment variables, i.e. not from files.

.env files are just a convenient way to get those variables into the environment (see below). If you start reading the files directly in your code, you're shortcutting this step, bypassing the environment variables with all the consequences that brings, and you're basically back to square one: reading config from files.

So what you should do instead is use a tool such as dotenv-cli that reads .env files, exports the variables into the environment, and runs your app in the temporarily modified environment, like so:

$ dotenv yourapp
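
Inside the app itself, the code then reads only from the environment and doesn't care whether the values came from dotenv-cli, docker-compose's env_file, or the real deployment environment. A minimal sketch using the question's variables (the fallback defaults are just illustrative):

import os

# The app never opens .env itself; whatever launched it is responsible for the environment
DB_ADDR = os.environ['DB_ADDR']
DB_PORT = int(os.environ.get('DB_PORT', '28015'))
DB_NAME = os.environ.get('DB_NAME', 'ipercron')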
Bastian Venthur
1

In situations where using python-dotenv wasn't possible, I've used something like the following:

import os

def load_env_file(dotenv_path, override=False):
    with open(dotenv_path) as file_obj:
        lines = file_obj.read().splitlines()  # Removes \n from lines

    dotenv_vars = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue

        key, value = line.split("=", maxsplit=1)
        dotenv_vars.setdefault(key, value)

    if override:
        os.environ.update(dotenv_vars)
    else:
        for key, value in dotenv_vars.items():
            os.environ.setdefault(key, value)

It reads the given file and parses lines that have the "=" symbol in them. The part before the symbol becomes the key, and the part after it becomes the value.

Current environment variables with the same keys as in the env file can either be left untouched or overwritten with the override parameter.
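
For example (a small usage sketch of the function above):

load_env_file('.env')                  # keep any values already set in the environment
load_env_file('.env', override=True)   # let the file win over existing values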

Tom
  • Thanks for this, I adapted it for my answer (which aims to allow comments after data in the same line). I upvoted of course. :) – jakob.j Dec 19 '22 at 21:38
0

I'm not very proud of this; however, in contrast to the other answers, it seems to work when lines end with a comment (after the data in the same line). Adapted from @Tom's answer.

# have to parse manually since dotenv package is not available
def get_env_data_as_dict(dotenv_path):
    result = {}
    with open(dotenv_path) as file_obj:
        lines = file_obj.read().splitlines()  # Removes \n from lines

    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        if "#" in line:
            line = line.split("#")[0].strip()
        key, value = line.split("=", maxsplit=1)
        result[key] = value
    return result
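
Since the function only returns a dict, pushing the values into the environment is a separate step (a small usage sketch, assuming the file is named .env):

import os

os.environ.update(get_env_data_as_dict('.env'))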
slaesh
jakob.j
0

If you are using a Jupyter notebook, you can use this magic command and point it to the .env file.

%reload_ext dotenv
%dotenv path/to/env_vars.env

Documentation here.

ayush thakur