
So, I've got some Python code running inside a Docker container. I started up my local env using Google's gcloud script. I'm seeing basic access style logs and health check info, but I'm not sure how I can pass through log messages I'm writing from my Python app to the console. Is there a parameter I can set to accomplish this with my gcloud script or is there something I can set in the Dockerfile that can help?

Brandon
  • You can [attach](https://docs.docker.com/reference/commandline/cli/#attach) yourself to the running container, or you can use [docker logs](https://docs.docker.com/reference/commandline/cli/#logs). You can also attach yourself when starting a container with [docker run -a](https://docs.docker.com/reference/commandline/cli/#run). I hope this information helps you. – joh.scheuer Dec 04 '14 at 20:16
  • 1
  • Share the `Dockerfile` to get more support. Where are the logs inside the container now? Generally, print logs to the console (stdout/stderr) inside the container; then you can use `docker logs` outside. You can always use `docker exec` to jump inside and check logs like a normal app. – Larry Cai Dec 08 '14 at 00:52
  • Thanks for the help guys. "docker logs" was what I was looking for. I think the part I was missing was how to get the running docker process IDs (docker ps), so I could feed that to the logs command. If either of you can write out your answer, I'll mark it as correct. – Brandon Dec 08 '14 at 21:29
  • Possible duplicate of [Python app does not print anything when running detached in docker](http://stackoverflow.com/questions/29663459/python-app-does-not-print-anything-when-running-detached-in-docker) – user985366 Nov 24 '16 at 14:08
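The workflow described in the comments can be sketched as follows (the container ID placeholder must be replaced with a real value from `docker ps`):

```shell
# List running containers to find the container ID or name
docker ps

# Stream the container's stdout/stderr to your terminal
# (-f follows the log output, like tail -f)
docker logs -f <container-id>

# Or jump inside the running container to inspect log files directly
docker exec -it <container-id> /bin/sh
```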

2 Answers


For Python to log to your terminal/command line/console when run from a Docker container, you should have this variable set in your docker-compose.yml:

  environment:
    - PYTHONUNBUFFERED=0

(Python treats any non-empty value of `PYTHONUNBUFFERED` as "on", so `0` works here, though `1` is the conventional choice.) This is also a valid solution if you're using print to debug.

Sebastian Wozny
Sentient07

(Answer based on the comments)

You don't need to know the container ID if you wrap the app in docker-compose. Just add a docker-compose.yml alongside your Dockerfile. It might sound like an extra level of indirection, but for a simple app it's as trivial as this:

version: "3.3"

services:
  python_app:
    build: .
    environment:
      - PYTHONUNBUFFERED=1
That's it. The benefit is that you don't need to pass a lot of flags that Docker requires, because they are added automatically. It also simplifies working with volumes and environment variables if they become necessary later.

You can then view logs by service name:

docker-compose logs python_app

By the way, I'd set PYTHONUNBUFFERED=1 when testing something locally. It disables buffering, which makes logging more deterministic. I had a lot of logging problems, for example, when I tried to spin up a gRPC server in my Python app: the logs flushed before the server started were not all the init logs I wanted to see, and once the server starts, you won't see the init logs because logging reattaches to a different/spawned process.

yuranos