
Background: I've deployed a Docker image (https://hub.docker.com/r/digitorus/eramba-db) to Kubernetes by pulling the image and using the .yaml file below to deploy it into a separate namespace (eramba-1):

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: eramba
      namespace: eramba-1
      labels:
        app: eramba
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: eramba
      template:
        metadata:
          labels:
            app: eramba
        spec:
          containers:
          - name: eramba
            image: docker.io/digitorus/eramba:latest
            ports:
            - containerPort: 80

PROBLEM - I can't access the application via its IP:port. I've checked the exposed port (80) in the Dockerfile, and the same port is shown in the describe pod output. I tried a port-forwarding rule to forward local port 8888 to the pod's port 80, but when I access localhost:8888 in the browser I get the error below. I still have to configure a Service to expose the application.

root@osboxes:/home/osboxes/manifests# kubectl port-forward --namespace eramba-1 pod/eramba-7455b5bb8-fnw7v 8888:80
Forwarding from 127.0.0.1:8888 -> 80
Forwarding from [::1]:8888 -> 80
Handling connection for 8888
E0106 22:31:17.651396   32811 portforward.go:406] an error occurred forwarding 8888 -> 80: error forwarding port 80 to pod 659bd926eb494d74aec7bd4b86b3d1f293e42aececff758f7a565d708078d0d6, uid : exit status 1: 2022/01/06 22:31:17 socat[32837] E connect(5, AF=2 127.0.0.1:80, 16): Connection refused
E0106 22:31:17.652649   32811 portforward.go:234] lost connection to pod
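
For reference, I've been inspecting the pod with commands along these lines (pod name is taken from the describe output further down):

    # Check pod status in the eramba-1 namespace
    kubectl get pods -n eramba-1
    # Full describe and logs output is pasted below
    kubectl describe pods eramba -n eramba-1
    kubectl logs eramba-7455b5bb8-fnw7v -n eramba-1
    # --previous shows logs from the last crashed container instance, if needed
    kubectl logs eramba-7455b5bb8-fnw7v -n eramba-1 --previous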

I also noticed the pod is crashing a lot (CrashLoopBackOff). Here's the describe pod output:

root@osboxes:/home/osboxes/manifests# kubectl describe pods eramba -n eramba-1
Name:         eramba-7455b5bb8-fnw7v
Namespace:    eramba-1
Priority:     0
Node:         osboxes/172.16.42.135
Start Time:   Thu, 06 Jan 2022 21:35:22 -0500
Labels:       app=eramba
              pod-template-hash=7455b5bb8
Annotations:  <none>
Status:       Running
IP:           10.20.0.55
IPs:
  IP:           10.20.0.55
Controlled By:  ReplicaSet/eramba-7455b5bb8
Containers:
  eramba:
    Container ID:   docker://ea6b299219ff4477e7a6992d36e9d7ca8a4fa4eb1aec83b31db93e4599f1f91d
    Image:          docker.io/digitorus/eramba:latest
    Image ID:       docker-pullable://digitorus/eramba@sha256:5f27f83fb46e4760f51bed9c8b8f8abecf1c11402595ca87ecee78ccc017a532
    Port:           80/TCP
    Host Port:      0/TCP
    State:          Waiting
      Reason:       CrashLoopBackOff
    Last State:     Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Thu, 06 Jan 2022 22:29:19 -0500
      Finished:     Thu, 06 Jan 2022 22:32:34 -0500
    Ready:          False
    Restart Count:  13
    Environment:    <none>
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-x5fnh (ro)
Conditions:
  Type              Status
  Initialized       True 
  Ready             False 
  ContainersReady   False 
  PodScheduled      True 
Volumes:
  kube-api-access-x5fnh:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    ConfigMapOptional:       <nil>
    DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              <none>
Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type     Reason            Age                  From               Message
  ----     ------            ----                 ----               -------
  Warning  FailedScheduling  59m (x31 over 90m)   default-scheduler  0/1 nodes are available: 1 node(s) had taint {node.kubernetes.io/not-ready: }, that the pod didn't tolerate.
  Normal   Scheduled         58m                  default-scheduler  Successfully assigned eramba-1/eramba-7455b5bb8-fnw7v to osboxes
  Normal   Pulled            58m                  kubelet            Successfully pulled image "docker.io/digitorus/eramba:latest" in 709.654139ms
  Normal   Pulled            55m                  kubelet            Successfully pulled image "docker.io/digitorus/eramba:latest" in 706.000948ms
  Normal   Pulled            52m                  kubelet            Successfully pulled image "docker.io/digitorus/eramba:latest" in 689.074695ms
  Normal   Pulled            49m                  kubelet            Successfully pulled image "docker.io/digitorus/eramba:latest" in 677.201277ms
  Normal   Started           49m (x4 over 58m)    kubelet            Started container eramba
  Normal   Killing           45m (x4 over 55m)    kubelet            Stopping container eramba
  Normal   Pulling           45m (x5 over 58m)    kubelet            Pulling image "docker.io/digitorus/eramba:latest"
  Normal   Pulled            45m                  kubelet            Successfully pulled image "docker.io/digitorus/eramba:latest" in 762.977828ms
  Normal   Created           45m (x5 over 58m)    kubelet            Created container eramba
  Warning  BackOff           18m (x48 over 32m)   kubelet            Back-off restarting failed container
  Normal   SandboxChanged    105s (x17 over 55m)  kubelet            Pod sandbox changed, it will be killed and re-created.

kubectl version output:

root@osboxes:/home/osboxes/manifests# kubectl version
Client Version: version.Info{Major:"1", Minor:"23", GitVersion:"v1.23.1", GitCommit:"86ec240af8cbd1b60bcc4c03c20da9b98005b92e", GitTreeState:"clean", BuildDate:"2021-12-16T11:41:01Z", GoVersion:"go1.17.5", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"23", GitVersion:"v1.23.1", GitCommit:"86ec240af8cbd1b60bcc4c03c20da9b98005b92e", GitTreeState:"clean", BuildDate:"2021-12-16T11:34:54Z", GoVersion:"go1.17.5", Compiler:"gc", Platform:"linux/amd64"}

The logs output:

root@osboxes:/home/osboxes/manifests# kubectl logs eramba-7455b5bb8-hw2k7 -n eramba-1
[i] pre-exec.d - processing /scripts/pre-exec.d/010-apache.sh
tail: can't open '/var/log/apache2/*log': No such file or directory
[i] pre-exec.d - processing /scripts/pre-exec.d/020-eramba-initdb.sh
[i] Waiting for database to setup...
[i] Trying to connect to database: try 1...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 2...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 3...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 4...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 5...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 6...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 7...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 8...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 9...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 10...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 11...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 12...
ERROR 2005 (HY000): Unknown MySQL server host 'db' (-3)
[i] Trying to connect to database: try 13...
  • check pod logs using `kubectl logs ` – Harsh Manvar Jan 07 '22 at 04:23
  • I've posted the logs output. It looks like the container may be broken? – Bryan Jan 07 '22 at 04:29
  • Yes, that could be possible. I'm not sure what you are running, but it's failing due to the connection with the db; maybe eramba is trying to connect to the db. – Harsh Manvar Jan 07 '22 at 04:45
  • Not familiar with docker or containers in general, but if an application needs a database for it to run (like in this case), would the container package all the required dependencies and the database runtime env (mysql)? Or would I have to provision another pod just for the database runtime? – Bryan Jan 07 '22 at 04:49
  • Yes, you need to have a database container if the application requires one. Alternatively, if you are learning k8s, you can first try running a simpler application like the Nginx docker image for testing. – Harsh Manvar Jan 07 '22 at 04:58
  • Were you trying to access the application from within the cluster or from outside the cluster? If it is from outside, the recommended design is to have a Service of type LoadBalancer for that pod, and the ports should be mapped. Regarding the database, the logs explicitly say that a database with the hostname "db" is missing, but you do not have any Pod set up in that deployment with that name. – Vitor Paulino Jan 07 '22 at 08:50

1 Answer


As noted on the Docker Hub page for your image, you need to provide database environment variables so it can connect to a separate MySQL db instance. It's a bit more complicated in Kubernetes, but you can use something like this instead:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: eramba
  namespace: eramba-1
  labels:
    app: eramba
spec:
  replicas: 1
  selector:
    matchLabels:
      app: eramba
  template:
    metadata:
      labels:
        app: eramba
    spec:
      containers:
      - name: eramba
        image: docker.io/digitorus/eramba:latest
        ports:
        - containerPort: 80
        env:
        - name: DB_ENV_MYSQL_DATABASE
          value: "eramba-db"
        - name: DB_ENV_MYSQL_USER
          value: "eramba"
        - name: DB_ENV_MYSQL_PASSWORD
          value: "password"
        - name: DB_ENV_MYSQL_ROOT_PASSWORD
          value: "password"
        - name: ERAMBA_HOSTNAME
          value: localhost
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: eramba-db
  namespace: eramba-1
  labels:
    app: eramba-db
spec:
  replicas: 1
  selector:
    matchLabels:
      app: eramba-db
  template:
    metadata:
      labels:
        app: eramba-db
    spec:
      containers:
      - name: eramba-db
        image: docker.io/digitorus/eramba-db:latest
        ports:
        - containerPort: 3306
        env:
        - name: MYSQL_DATABASE
          value: "eramba-db"
        - name: MYSQL_USER
          value: "eramba"
        - name: MYSQL_PASSWORD
          value: "password"
        - name: MYSQL_ROOT_PASSWORD
          value: "password"
---
apiVersion: v1
kind: Service
metadata:
  name: db
  namespace: eramba-1
spec:
  selector:
    app: eramba-db
  ports:
    - name: sql
      port: 3306
      targetPort: 3306
---
apiVersion: v1
kind: Service
metadata:
  name: eramba-np
  namespace: eramba-1
spec:
  type: NodePort
  selector:
    app: eramba
  ports:
    - name: http
      port: 80
      targetPort: 80
      nodePort: 30045

This creates the app and the database with the required environment variables, per the configuration files in the repositories. It then creates a ClusterIP Service named "db" in the same namespace (so the DNS name the app uses to reach it is "db"), which connects to port 3306 on the database.
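
Once both pods are up, you can sanity-check that the app can resolve and reach "db" with something along these lines (a sketch; the mysql client is already present in the eramba image, since its startup script uses it, and the credentials are the ones from the manifests above):

    # Confirm the "db" and "eramba-np" Services exist in the namespace
    kubectl get svc -n eramba-1
    # Run the same kind of connection test the init script performs, from inside the eramba pod
    kubectl exec -n eramba-1 deploy/eramba -- mysql -h db -u eramba -ppassword -e "SELECT 1;"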

The app will take a minute to start while it waits for the database to initialise. I've added a NodePort Service on port 30045, so you should be able to reach the UI at http://localhost:30045 and log in with admin/admin.
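
To verify the NodePort is reachable, something like this should do (172.16.42.135 is the node IP from your describe output; substitute your own):

    # Confirm the NodePort mapping (you should see 80:30045/TCP for eramba-np)
    kubectl get svc eramba-np -n eramba-1
    # Hit the UI through the node's IP
    curl -I http://172.16.42.135:30045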

I would recommend learning a bit more about how containers run and communicate with each other using Docker and Docker Compose, as Kubernetes is difficult to jump right into without that prior knowledge. For example here, the database and the app run in separate containers (eramba, eramba-db) and use a Service to connect with each other. You were running just the app, with no database and no configuration.
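
For comparison, a rough docker-compose.yml equivalent of the two Deployments above might look like this (a sketch only, reusing the image names and environment variables from the manifests above; not tested):

    version: "3"
    services:
      eramba:
        image: digitorus/eramba:latest
        ports:
          - "8080:80"   # host port 8080 is an arbitrary choice
        environment:
          DB_ENV_MYSQL_DATABASE: eramba-db
          DB_ENV_MYSQL_USER: eramba
          DB_ENV_MYSQL_PASSWORD: password
          DB_ENV_MYSQL_ROOT_PASSWORD: password
          ERAMBA_HOSTNAME: localhost
        depends_on:
          - db
      db:   # the service name doubles as the hostname "db" that the app expects
        image: digitorus/eramba-db:latest
        environment:
          MYSQL_DATABASE: eramba-db
          MYSQL_USER: eramba
          MYSQL_PASSWORD: password
          MYSQL_ROOT_PASSWORD: password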

  • Hey so the yaml file outputs . I figured the issue was with the hardcoded host in the repo, so I changed all instances of to $DB_ENV_MYSQL_HOST and defined that in the yaml file. It's still giving me . Any ideas? Thanks, – Bryan Jan 07 '22 at 21:06
  • I've raised it as a separate issue - https://stackoverflow.com/questions/70626198/multi-container-docker-app-in-kubernetes-fail-database-connection-unknown-hos?noredirect=1#comment124851424_70626198 – Bryan Jan 07 '22 at 21:09