
I have two docker compose files: one contains, among other things, a Kafka image; the other contains a process that consumes from Kafka.

kafka.yaml:

version: "3.4"
services:
  zookeeper:
    image: "wurstmeister/zookeeper:latest"
    hostname: zookeeper
    ports:
      - "2181:2181"
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
    restart: on-failure

  kafka:
    image: "wurstmeister/kafka:latest"
    hostname: kafka
    ports:
      - "9092:9092"
      - "9093:9093"
      - "19092:19092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"
      KAFKA_ADVERTISED_PORT: "9092"
      KAFKA_LISTENERS: "INTERNAL://kafka:9092,EXTERNAL://0.0.0.0:9093"
      KAFKA_ADVERTISED_LISTENERS: "INTERNAL://kafka:9092,EXTERNAL://localhost:9093"
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: "INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT"
      KAFKA_INTER_BROKER_LISTENER_NAME: "INTERNAL"
    depends_on:
      - zookeeper
    restart: on-failure

processor.yaml:

version: "3.4"
services:
  processor:
    build: .
    environment:
      - LOGURU_LEVEL=DEBUG
      - MQ_BOOTSTRAP_SERVER=kafka
      - MQ_BOOTSTRAP_PORT=9092
    ports:
      - "8801:80"
    restart: on-failure
    volumes:
      - ./service:/app/service
    entrypoint: [ '/bin/sh', '-c' ]
    command: |
      "
      kafka-console-consumer --bootstrap-server localhost:9092 --topic example_topic
      "

My command attempt at the bottom of processor.yaml gives the following error in docker: /bin/sh: 2: kafka-console-consumer: not found.

Is there a way I can set Consumer configs within processor.yaml?

Edit: my Dockerfile (the same one is used for both yaml files)

ENV APP_PATH=/app
ENV PYTHONPATH="${PYTHONPATH}:${APP_PATH}"

COPY requirements.txt "${APP_PATH}/requirements.txt"
RUN pip install --no-cache-dir  -r "${APP_PATH}/requirements.txt"

COPY . $APP_PATH
WORKDIR $APP_PATH

  • Without seeing your Dockerfile it is a bit hard, but I would try the full path to `kafka-console-consumer` and also make sure that `processor` runs in the same docker network, so you can use `kafka:9092` instead of `localhost:9092` (that will be much easier than trying to reach Kafka from a different docker network or through your host network) – Gerard Garcia Aug 02 '22 at 14:35
  • The full path I have for kafka (shown on docker desktop) is `/usr/local/openjdk-11/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/opt/kafka/bin` but this seems to give the same error. What might I be doing wrong? – James Ellis Aug 02 '22 at 14:51
  • can you paste your processor `Dockerfile` in the question? – Gerard Garcia Aug 02 '22 at 14:56
  • Of course! Both have the same docker file. – James Ellis Aug 02 '22 at 15:07

1 Answer


Your error is not related to "consumer configs". Your Dockerfile builds a Python container; the Kafka CLI tools are not installed in it, so no `kafka-console-consumer.sh` (or `kafka-console-consumer`) command is available on the PATH.

To have the Kafka CLI tools, you'd also need to install Java in that container, which would make your image larger than necessary. Instead, write a consumer with a Python client library, then run your own `python consumer.py` as the container command.
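Such a consumer might look like the following (a sketch, not the asker's code: the kafka-python package and the `group_id` / `auto_offset_reset` values are assumptions; the `MQ_*` variables and `example_topic` come from processor.yaml above):

```python
# Minimal consumer sketch to run as the container command instead of the
# Kafka CLI. Assumes the kafka-python package is in requirements.txt.
import os


def bootstrap_from_env(env=os.environ):
    """Build the bootstrap-server string from the MQ_* variables in processor.yaml."""
    host = env.get("MQ_BOOTSTRAP_SERVER", "kafka")
    port = env.get("MQ_BOOTSTRAP_PORT", "9092")
    return f"{host}:{port}"


def main():
    # Imported here so the env helper above works even without the package.
    from kafka import KafkaConsumer  # third-party: pip install kafka-python

    consumer = KafkaConsumer(
        "example_topic",
        bootstrap_servers=bootstrap_from_env(),
        auto_offset_reset="earliest",  # consumer configs are plain kwargs here
        group_id="processor",          # assumed group name
    )
    for message in consumer:
        print(message.topic, message.offset, message.value)


if __name__ == "__main__":
    main()
```

With something like this in place, the `entrypoint`/`command` lines in processor.yaml can be replaced with `command: ["python", "consumer.py"]`.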

Also, `localhost:9092` does not refer to the Kafka container from inside the processor container. Related: Connect to Kafka running in Docker

Otherwise, you don't need a separate container to run the console consumer at all; you can run it inside the broker container: docker-compose exec kafka bash -c "kafka-console-consumer.sh --bootstrap-server localhost:9093 ... "
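Separately, if `processor` must stay in its own compose file, the two projects need to share a Docker network before `kafka:9092` is reachable from it. A sketch (the network name `kafka_default` is an assumption; Compose derives it from the directory the kafka file lives in, so check `docker network ls`):

```yaml
# processor.yaml (fragment) -- join the network created by the kafka project
services:
  processor:
    # ...existing build/environment/ports/volumes...
    networks:
      - kafka-net

networks:
  kafka-net:
    external: true
    name: kafka_default  # assumed; the name Compose gave the kafka project's network
```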

OneCricketeer