
I have a Python script that takes two arguments, -input and -output, which are both directory paths. I would first like to know if this is a recommended use case of Docker, and I would also like to know how to run the Docker container while specifying custom input and output folders with the help of a Docker volume. My post is similar to this: Passing file as argument to Docker container. Still, I was not able to solve the problem.
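For reference, a script with that interface could look something like the following. This is a minimal sketch, not the asker's actual code: the single-dash option names come from the question, but the processing step (copying files) is purely illustrative.

```python
import argparse
from pathlib import Path


def main(argv=None):
    parser = argparse.ArgumentParser(
        description="Process files from an input directory into an output directory."
    )
    # Single-dash long options, matching the interface described in the question.
    parser.add_argument("-input", required=True, help="input directory path")
    parser.add_argument("-output", required=True, help="output directory path")
    args = parser.parse_args(argv)

    in_dir = Path(args.input)
    out_dir = Path(args.output)
    out_dir.mkdir(parents=True, exist_ok=True)

    # Placeholder processing step: copy each regular file as-is.
    for src in in_dir.iterdir():
        if src.is_file():
            (out_dir / src.name).write_bytes(src.read_bytes())
```

Invoked directly it would be run as `python script.py -input ./input -output ./output`; inside a container, those paths have to point at the mounted volume paths instead, which is what the answer below addresses.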

alpha027
  • I routinely advise _not_ using Docker for programs that are not long-running and whose primary goal is reading and writing host files. A Python virtual environment would be a better match here. – David Maze Mar 31 '22 at 10:54
  • You already link to a very similar question; how is this question different? What have you already tried and what happens when you run it? – David Maze Mar 31 '22 at 10:54

1 Answer


It's common practice to use volumes to persist data or to mount input data. See the postgres image, for example.

docker run -d \
  --name some-postgres \
  -e PGDATA=/var/lib/postgresql/data/pgdata \
  -v /custom/mount:/var/lib/postgresql/data \
  postgres

You can see how the path to the data directory is set via an environment variable, and a volume is mounted at that path, so the data the container produces ends up in the volume.

You can also see in the docs that there is a directory /docker-entrypoint-initdb.d/ where you can mount input scripts, which are run on first database setup.

In your case, it might look something like this:

docker run \
  -v "$PWD/input:/app/input" \
  -v "$PWD/output:/app/output" \
  myapp -input /app/input -output /app/output

Or you can use the same volume for both:

docker run \
  -v "$PWD/data:/app/data" \
  myapp -input /app/data/input -output /app/data/output

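Both commands assume a `myapp` image whose entrypoint is the script, so that the arguments after the image name are passed through to it. A Dockerfile along these lines would do that; the base image and script name are assumptions, not something from the question:

```dockerfile
FROM python:3.10-slim
WORKDIR /app
COPY script.py .
# Arguments given after the image name in `docker run` are
# appended to the ENTRYPOINT, so they reach the script directly.
ENTRYPOINT ["python", "script.py"]
```

Note that the container-side paths (/app/input, /app/output) are what the script sees; the host-side paths on the left of each -v mapping can be anything.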
The Fool