
I'm using a Docker container for local web development. I have a bash script I use for running Django's manage.py test, flake8, and coverage html, all within a pipenv environment within the Docker container.

A simple version is:

#!/bin/bash
set -e

docker exec hines_web /bin/sh -c "pipenv run coverage run --source=. manage.py test ; pipenv run flake8 ; pipenv run coverage html"

This works. However, because each command has to use pipenv run separately, it's slower than it needs to be.

Is there a way to either chain several commands together after a single pipenv run, or else to send multiple commands into pipenv shell for them to be run?
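For example, would something like this, where a single pipenv run hands one chained command to a shell, be a sensible direction? (Just a sketch based on the script above; I haven't tested it.)

docker exec hines_web /bin/sh -c "pipenv run sh -c 'coverage run --source=. manage.py test && flake8 && coverage html'"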

(Any thoughts on improving the general way of doing this appreciated!)

Phil Gyford
  • You should be able to use `pipenv` to set up a non-Docker host-based virtual environment. That would be generally faster and doesn't require administrator-equivalent permissions to use. – David Maze Dec 30 '20 at 21:52
  • I’m not 100% sure what you mean. Do you mean run pipenv directly on my Mac rather than in a Docker container? That’s what I’ve been doing previously and now I’m trying to use Docker so that I have a more easily replicable environment, aside from the pipenv aspect. – Phil Gyford Dec 30 '20 at 22:23
  • Do you need pipenv inside a docker container? you could directly install packages to the docker container `pipenv install --deploy --system` and then normally run `python3 filename.py` – Ishan Dec 31 '20 at 11:39
    @Ishan That's a good point, and I started out that way, but I read that using `--system` within Docker images is no longer recommended: https://stackoverflow.com/questions/46503947/how-to-get-pipenv-running-in-docker/55610857#55610857 – Phil Gyford Dec 31 '20 at 11:59

1 Answer


Here's my workaround, which is laborious enough to make me think I'm approaching this entirely wrongly, but it works. Three steps:

1. A shell script that will run our commands, designed to be run within the pipenv environment. This is scripts/pipenv-run-tests.sh:

#!/bin/sh
set -e

# Runs the Django test suite, flake8, and generates HTML coverage reports.

# This script is called by using the shortcut defined in Pipfile:
#   pipenv run tests

# You can optionally pass in a test, or test module or class, as an argument, e.g.
# ./pipenv-run-tests.sh tests.appname.test_models.TestClass.test_a_thing
TESTS_TO_RUN=${1:-tests}

coverage run --source=. ./manage.py test $TESTS_TO_RUN
flake8
coverage html

2. We define a Custom Script Shortcut in our Pipfile:

[scripts]
tests = "./scripts/pipenv-run-tests.sh"

This means that if we log into the shell on the Docker container we could do pipenv run tests and our scripts/pipenv-run-tests.sh would be run within the pipenv environment. Any arguments included after pipenv run tests are passed on to the script.
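For example, from a shell inside the container (using the example test path from the script's comments):

# Runs scripts/pipenv-run-tests.sh within the pipenv environment,
# forwarding the test path as its first argument:
pipenv run tests tests.appname.test_models.TestClass.test_a_thing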

3. Finally, we have a script designed to be run from the host machine, which runs our Custom Script Shortcut within Docker. This is scripts/run-tests.sh:

#!/bin/bash
set -e

# Call this from the host machine.
# It will call the `tests` shortcut defined in Pipfile, which will run
# a script within the pipenv environment.

# You can optionally pass in a test, or test module or class, as an argument, e.g.
# ./run-tests.sh tests.appname.test_models.TestClass.test_a_thing
TESTS_TO_RUN=${1:-tests}

docker exec hines_web /bin/sh -c "pipenv run tests $TESTS_TO_RUN"

So now, on the host machine, we can do: ./scripts/run-tests.sh and that will, within the Docker container, call the pipenv shortcut, which will run the scripts/pipenv-run-tests.sh script. Any arguments provided are passed on to the final script.
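To make the chain explicit, a call from the host expands roughly like this, using the names defined above:

# On the host:
./scripts/run-tests.sh tests.appname.test_models.TestClass.test_a_thing

# ...which runs inside the hines_web container:
#   pipenv run tests tests.appname.test_models.TestClass.test_a_thing

# ...which, via the Pipfile shortcut, runs within the pipenv environment:
#   ./scripts/pipenv-run-tests.sh tests.appname.test_models.TestClass.test_a_thing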

(Note that hines_web in the above is the name of my Docker container defined in my docker-compose.yml.)

Phil Gyford