I have a situation where I have two Docker Compose files: one contains, among other things, a Kafka image; the other contains some other process which consumes from Kafka.
kafka.yaml:
```yaml
version: "3.4"
services:
  zookeeper:
    image: "wurstmeister/zookeeper:latest"
    hostname: zookeeper
    ports:
      - "2181:2181"
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
    restart: on-failure
  kafka:
    image: "wurstmeister/kafka:latest"
    hostname: kafka
    ports:
      - "9092:9092"
      - "9093:9093"
      - "19092:19092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"
      KAFKA_ADVERTISED_PORT: "9092"
      KAFKA_LISTENERS: "INTERNAL://kafka:9092,EXTERNAL://0.0.0.0:9093"
      KAFKA_ADVERTISED_LISTENERS: "INTERNAL://kafka:9092,EXTERNAL://localhost:9093"
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: "INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT"
      KAFKA_INTER_BROKER_LISTENER_NAME: "INTERNAL"
    depends_on:
      - zookeeper
    restart: on-failure
```
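For context on the listener setup above: a client has to pick its bootstrap address based on where it runs. Containers on the same Compose network use the INTERNAL listener (`kafka:9092`), while processes on the Docker host use the advertised EXTERNAL one (`localhost:9093`). A minimal sketch of that rule (the helper name is made up):

```python
def bootstrap_address(inside_compose_network: bool) -> str:
    """Pick the Kafka bootstrap address matching the advertised listeners above.

    Containers on the Compose network reach the broker via the `kafka`
    hostname (INTERNAL listener); host processes must use the EXTERNAL
    listener that the broker advertises as localhost:9093.
    """
    return "kafka:9092" if inside_compose_network else "localhost:9093"


print(bootstrap_address(True))   # → kafka:9092 (another container)
print(bootstrap_address(False))  # → localhost:9093 (process on the host)
```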
processor.yaml:
```yaml
version: "3.4"
services:
  processor:
    build: .
    environment:
      - LOGURU_LEVEL=DEBUG
      - MQ_BOOTSTRAP_SERVER=kafka
      - MQ_BOOTSTRAP_PORT=9092
    ports:
      - "8801:80"
    restart: on-failure
    volumes:
      - ./service:/app/service
    entrypoint: [ '/bin/sh', '-c' ]
    command: |
      "
      kafka-console-consumer --bootstrap-server localhost:9092 --topic example_topic
      "
```
My `command` attempt at the bottom of processor.yaml gives the following error in Docker: `/bin/sh: 2: kafka-console-consumer: not found`.

Is there a way I can set consumer configs within processor.yaml?
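For reference, the `MQ_BOOTSTRAP_SERVER` / `MQ_BOOTSTRAP_PORT` environment variables in processor.yaml are meant to drive a consumer inside the service. A minimal sketch of what that consumer could look like, assuming the kafka-python package and an example topic name (both are assumptions, not part of the current setup):

```python
import os


def kafka_bootstrap() -> str:
    """Build the bootstrap address from the env vars set in processor.yaml."""
    host = os.environ.get("MQ_BOOTSTRAP_SERVER", "kafka")
    port = os.environ.get("MQ_BOOTSTRAP_PORT", "9092")
    return f"{host}:{port}"


def consume_forever(topic: str = "example_topic") -> None:
    """Consume and print messages; needs kafka-python and a reachable broker."""
    from kafka import KafkaConsumer  # deferred import: optional dependency

    consumer = KafkaConsumer(topic, bootstrap_servers=kafka_bootstrap())
    for message in consumer:
        print(message.value)
```

With something like this in the image, the Compose `command` could run a Python entrypoint instead of the `kafka-console-consumer` binary, which doesn't exist in a pip-based image.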
Edit: my Dockerfile for both YAML files:
```dockerfile
ENV APP_PATH=/app
ENV PYTHONPATH="${PYTHONPATH}:${APP_PATH}"
COPY requirements.txt "${APP_PATH}/requirements.txt"
RUN pip install --no-cache-dir -r "${APP_PATH}/requirements.txt"
COPY . $APP_PATH
WORKDIR $APP_PATH
```