
I'm just playing with a simple example to get a basic understanding of Docker. Here is my Dockerfile:

FROM python:3.7-alpine

# copy all the files to the container
COPY . /test
WORKDIR /test

# install dependencies
RUN pip install pip_system_certs --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org
RUN pip install -r requirements.txt

# run the command
CMD ["python", "./test_script.py"]

The trusted-host options are what allow us to get around corporate network security settings and install packages internally on Windows, and they seem to work in Docker too, but only for some packages. For instance, if my requirements.txt includes flask and requests, everything is fine, but pandas and numpy give me

WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1076)'))': /simple/numpy/

and fails. I think it's weird that this is working for some packages but not others.
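
In case it matters, the trusted-host flags above are only on the pip_system_certs line; the requirements install with the same flags would look like this:

RUN pip install -r requirements.txt --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org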

Any help appreciated.

Using Docker Desktop in Windows 10.

Josh Gredvig

1 Answer


I know my company's big corporate proxy strips out (most) normal certificates and re-wraps the traffic in a self-signed cert. This caused lots of similar headaches for me. I resolved it by:

  1. Figuring out what our root cert was by visiting an internet site in Chrome, clicking the lock in the address bar, and viewing the certification path for the site's certificate. The root CA was our internal one.
  2. Going to certificate management in the Windows control panel, finding my company's internal root cert under "Trusted Root Certification Authorities", and exporting it as a "Base-64 encoded X.509" file.
  3. Copying that certificate file into my Docker container and adding it as a CA certificate to the "os" inside my container. After that, everything I ran in my container just worked.

The catch with step 3 is that exactly how you do it differs between Linux flavors. I don't know much about Alpine, but these links might get you pointed in roughly the right direction: https://blog.confirm.ch/adding-a-new-trusted-certificate-authority/
https://github.com/gliderlabs/docker-alpine/issues/260
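
As a rough sketch of step 3 on Alpine (untested on my end, and assuming you exported the cert as my_cert.pem next to your Dockerfile), something like this should work:

# copy the exported corporate root cert into the image
# (my_cert.pem is whatever you named your export)
COPY my_cert.pem /usr/local/share/ca-certificates/my_cert.crt

# register it with the system CA store; update-ca-certificates
# comes from the ca-certificates package
RUN apk add --no-cache ca-certificates && update-ca-certificates

After update-ca-certificates runs, the combined bundle lives at /etc/ssl/certs/ca-certificates.crt, which is what most tools that read the system store will pick up.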

Also, bonus catch - if you use Python's requests library in your application, it doesn't use the system CA certs by default (it ships its own bundle via certifi). If this is a problem for you, read about setting the REQUESTS_CA_BUNDLE environment variable in the accepted answer here: Python Requests - How to use system ca-certificates (debian/ubuntu)?
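
For example, assuming the system bundle from the step above ends up at the standard /etc/ssl/certs/ca-certificates.crt path, you could set the variable in your Dockerfile:

ENV REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt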

sql_knievel
  • What do you mean by "Copied that certificate file into my Docker container"? Did you add the installation of the certificate to the Dockerfile? – madhat1 May 12 '21 at 09:25
  • @madhat1 - The result of step 2 above is the certificate exported as a "Base-64 encoded X.509" file. Let's say you saved it as "my_cert.pem" on your local drive. Now you need some way to reference that "my_cert.pem" file from within the Docker container - the easiest way is to copy the file into your Docker container using the COPY command in your Dockerfile. Now that the cert file is inside your Docker container, you need to tell the "os" inside the container to trust it. That's beyond the scope of this answer, since it is somewhat OS-dependent. See the links I included as a start. – sql_knievel May 12 '21 at 17:08