
I'm trying to set up a Jenkins agent on Kubernetes using the Kubernetes plugin. My goal is to have the plugin spin up a GKE container each time a job needs to run.

I have done the below setup:

1 - create a new cluster

2 - create a service account with an admin role for the cluster (see the sketch after this list)

3 - configure the Jenkins Kubernetes plugin (it can connect to the GKE cluster)

4 - configure the pod template (using my custom Docker image, which is already pushed to Docker Hub)
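
For step 2, roughly the following (a sketch; the `jenkins` namespace and service-account name are placeholders for my own values):

# Namespace and service account for the Jenkins agent pods
kubectl create namespace jenkins
kubectl create serviceaccount jenkins --namespace jenkins

# Grant the account cluster-wide admin rights
kubectl create clusterrolebinding jenkins-admin \
    --clusterrole=cluster-admin \
    --serviceaccount=jenkins:jenkins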

When I build a project, the job runs roughly halfway and then stops at the point below:

Agent went offline during the build
ERROR: Connection was broken: java.nio.channels.ClosedChannelException

What is this error and how can I fix it? Is there any other setup needed?

Update: after some searching, I updated my Dockerfile. The container now seems to have write permission:

FROM jenkins/inbound-agent
USER root

# Install Build Essentials
RUN apt-get update \
    && apt-get upgrade -y \
    && apt-get dist-upgrade -y \
    && apt-get install build-essential -y
# Set Environment Variables
ENV SDK_URL="https://dl.google.com/android/repository/sdk-tools-linux-4333796.zip" \
    ANDROID_HOME="/home/jenkins/android-sdk" \
    ANDROID_VERSION=30 \
    BUILDTOOL_VERSION="30.0.2" \
    NDK_VERSION="22.0.7026061"

USER jenkins

# Download Android SDK
RUN mkdir "$ANDROID_HOME" .android \
    && cd "$ANDROID_HOME" \
    && curl -o sdk.zip "$SDK_URL" \
    && unzip sdk.zip \
    && rm sdk.zip \
    && mkdir "$ANDROID_HOME/licenses" || true \
    && echo "24333f8a63b6825ea9c5514f83c2829b004d1fee" > "$ANDROID_HOME/licenses/android-sdk-license" \
    && yes | $ANDROID_HOME/tools/bin/sdkmanager --licenses

# Install Android Build Tool and Libraries
RUN $ANDROID_HOME/tools/bin/sdkmanager --update
RUN $ANDROID_HOME/tools/bin/sdkmanager "build-tools;${BUILDTOOL_VERSION}" \
    "platforms;android-${ANDROID_VERSION}" \
    "platform-tools" \
    "ndk;${NDK_VERSION}"

I can see the workload created and the build running, but I don't see why it always stops in the middle with the error above. It seems my pod is disconnected before the build finishes.
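
To see why the agent pod went away, these diagnostics may help (a sketch; substitute the real namespace and pod name):

# Why did the last container terminate? (e.g. OOMKilled, Error)
kubectl describe pod <agent-pod-name> -n jenkins

# Recent cluster events, newest last
kubectl get events -n jenkins --sort-by=.lastTimestamp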

  • Could you share your pod template? Could you also run the `id` command in the container created from your custom Docker image? What's the result? – mario Jan 14 '21 at 16:59
  • Hi, I have updated the question. My container is created dynamically by GKE (when there is a job, a new container is created; I can see a different name in the Workloads menu). – Lê Khánh Vinh Jan 14 '21 at 17:16
  • OK, actually I wanted to point out that you may need to configure `securityContext`, but I see you've already done it. Did you first try to run this custom image locally on plain Docker? – mario Jan 16 '21 at 11:46
  • I updated my `Dockerfile` and can run the container, but it seems it always stops before completing the build. I've updated my question. – Lê Khánh Vinh Jan 18 '21 at 17:12
  • OK, so it seems the initial problem with write permissions has been solved by escalating privileges using `securityContext`. Unfortunately, I'm not able to tell you what to do next, as it seems to be a problem with the image itself. As far as I understand, you observe the same behaviour when you run a container from this image on plain Docker locally? According to [this answer](https://stackoverflow.com/a/45538297/11714114), updating Java from 7 to 8 may help to resolve this issue. Especially: _"Check the java version being used by your master, and the java version of your slave."_ – mario Jan 30 '21 at 14:20
  • Apart from updating Java on the **master** and **slave**, it may also be related to a lack of resources allocated for your pods. Take a look at the solution presented [here](https://medium.com/@garunski/closedchannelexception-in-jenkins-with-kubernetes-plugin-a7788f1c62a9). – mario Jan 30 '21 at 14:24
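
A quick way to compare the two Java versions mentioned in the comments (a sketch; the image tag is a placeholder):

# Java version inside the agent image
docker run --rm --entrypoint java my-dockerhub-user/jenkins-android-agent -version

# Java version on the Jenkins master host
java -version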
