
I have a problem writing a Dockerfile for my application. My code is below:

# Define the base image
FROM ubuntu:latest

# Define the image maintainer
LABEL maintainer="MyCompany"

# Update the image packages
RUN apt-get update && apt-get upgrade -y

# Expose port 8089
EXPOSE 8089

# Command to start my docker compose file
CMD ["docker-compose -f compose.yaml up -d"]

# Command to link KafkaConnect with MySql (images in docker compose file)
CMD ["curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" 
localhost:8083/connectors/ -d "
{ \"name\": \"inventory-connector\",
      \"config\": { 
            \"connector.class\": \"io.debezium.connector.mysql.MySqlConnector\",
            \"tasks.max\": \"1\", 
            \"database.hostname\": \"mysql\",
            \"database.port\": \"3306\",
            \"database.user\": \"debezium\",
            \"database.password\": \"dbz\",
            \"database.server.id\": \"184054\",
            \"database.server.name\": \"dbserver1\",
            \"database.include.list\": \"inventory\",
            \"database.history.kafka.bootstrap.servers\": \"kafka:9092\",
            \"database.history.kafka.topic\": \"dbhistory.inventory\"
      }
}"]

I know there can only be one CMD in a Dockerfile. How do I run my compose file and then make a cURL call?

helvete
  • It's a little bit unusual to run `docker-compose` as the main command in a container; usually you'd just run it on the host to start your other containers. You could run the `curl` command as part of the startup sequence in one of the other containers, or in principle you could include it as a "service" that Compose launches. – David Maze Jul 15 '21 at 04:21
  • #1 Did you solve your problem? #2 Do you need to set the connection values of one container in your docker-compose.yml ? – JRichardsz Jul 16 '21 at 15:04
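
Building on the comment above about running the `curl` as a "service" that Compose launches: below is a minimal sketch of that idea. It assumes your Compose file already defines a `kafka-connect` service listening on port 8083; the `connector-init` service name and the `curlimages/curl` image are illustrative choices, not from the question.

  connector-init:
    image: curlimages/curl:latest
    depends_on:
      - kafka-connect
    entrypoint:
      - sh
      - -c
      - |
        # Wait until the Kafka Connect REST API is reachable, then register the connector
        until curl -s -o /dev/null http://kafka-connect:8083/connectors; do
          echo "Waiting for Kafka Connect..."; sleep 5
        done
        curl -s -X POST -H "Content-Type:application/json" http://kafka-connect:8083/connectors -d '{
          "name": "inventory-connector",
          "config": {
            "connector.class": "io.debezium.connector.mysql.MySqlConnector",
            "tasks.max": "1",
            "database.hostname": "mysql",
            "database.port": "3306",
            "database.user": "debezium",
            "database.password": "dbz",
            "database.server.id": "184054",
            "database.server.name": "dbserver1",
            "database.include.list": "inventory",
            "database.history.kafka.bootstrap.servers": "kafka:9092",
            "database.history.kafka.topic": "dbhistory.inventory"
          }
        }'

This keeps the one-shot registration step out of your application image entirely; the init container simply exits after the POST succeeds.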

2 Answers


You need to use the RUN instruction for this. Check this answer for the difference between RUN and CMD.
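
For a quick illustration of that difference (a generic sketch, not tied to the question's image): RUN executes while the image is being built and its result is baked into a layer, while CMD only records the default command that runs when a container is started.

# Build time: this runs once during `docker build` and its result is stored in the image
FROM ubuntu:latest
RUN apt-get update && apt-get install -y curl

# Container start time: this is only the default command used by `docker run`
CMD ["curl", "--version"]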

If your second CMD is the final command in the Dockerfile, then just change the first one to a RUN:

# Command to start my docker compose file
RUN docker-compose -f compose.yaml up -d

If you have more commands to run after the CMDs you have now, try the following:

# Command to start my docker compose file
RUN docker-compose -f compose.yaml up -d

# Command to link KafkaConnect with MySql (images in docker compose file)
RUN curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" \
    localhost:8083/connectors/ -d " \
    { \"name\": \"inventory-connector\", \
          \"config\": { \
                \"connector.class\": \"io.debezium.connector.mysql.MySqlConnector\", \
                \"tasks.max\": \"1\", \
                \"database.hostname\": \"mysql\", \
                \"database.port\": \"3306\", \
                \"database.user\": \"debezium\", \
                \"database.password\": \"dbz\", \
                \"database.server.id\": \"184054\", \
                \"database.server.name\": \"dbserver1\", \
                \"database.include.list\": \"inventory\", \
                \"database.history.kafka.bootstrap.servers\": \"kafka:9092\", \
                \"database.history.kafka.topic\": \"dbhistory.inventory\" \
          } \
    }"

# To set your ENTRYPOINT at the end of the file, uncomment the following line
# ENTRYPOINT ["some-other-command-you-need", "arg1", "arg2"]
Eranga Heshan

Here's another option - run the curl from within the Kafka Connect container that you're creating. It looks something like this:

  kafka-connect:
    image: confluentinc/cp-kafka-connect-base:6.2.0
    container_name: kafka-connect
    depends_on:
      - broker
    ports:
      - 8083:8083
    environment:
      CONNECT_BOOTSTRAP_SERVERS: "kafka:9092"
      CONNECT_REST_ADVERTISED_HOST_NAME: "kafka-connect"
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: kafka-connect
      CONNECT_CONFIG_STORAGE_TOPIC: _kafka-connect-configs
      CONNECT_OFFSET_STORAGE_TOPIC: _kafka-connect-offsets
      CONNECT_STATUS_STORAGE_TOPIC: _kafka-connect-status
      CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: 'http://schema-registry:8081'
      CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: 'http://schema-registry:8081'
      CONNECT_LOG4J_ROOT_LOGLEVEL: "INFO"
      CONNECT_LOG4J_LOGGERS: "org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR"
      CONNECT_LOG4J_APPENDER_STDOUT_LAYOUT_CONVERSIONPATTERN: "[%d] %p %X{connector.context}%m (%c:%L)%n"
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: "1"
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: "1"
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: "1"
    command: 
      - bash 
      - -c 
      - |
        #
        echo "Installing connector plugins"
        confluent-hub install --no-prompt debezium/debezium-connector-mysql:1.5.0
        #
        echo "Launching Kafka Connect worker"
        /etc/confluent/docker/run & 
        #
        echo "Waiting for Kafka Connect to start listening on localhost ⏳"
        while : ; do
          curl_status=$$(curl -s -o /dev/null -w %{http_code} http://localhost:8083/connectors)
          echo -e $$(date) " Kafka Connect listener HTTP state: " $$curl_status " (waiting for 200)"
          if [ $$curl_status -eq 200 ] ; then
            break
          fi
          sleep 5 
        done
        echo -e "\n--\n+> Creating Data Generator source"
        curl -s -X PUT -H  "Content-Type:application/json" http://localhost:8083/connectors/inventory-connector/config \
            -d '{ 
                "connector.class": "io.debezium.connector.mysql.MySqlConnector",
                "tasks.max": "1", 
                "database.hostname": "mysql",
                "database.port": "3306",
                "database.user": "debezium",
                "database.password": "dbz",
                "database.server.id": "184054",
                "database.server.name": "dbserver1",
                "database.include.list": "inventory",
                "database.history.kafka.bootstrap.servers": "kafka:9092",
                "database.history.kafka.topic": "dbhistory.inventory"
        }'
        sleep infinity

You can see the full Docker Compose here.
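
Once the stack is up, you can check from the host that the connector was registered through Kafka Connect's REST API (assuming the 8083 port mapping above), for example:

# List registered connectors; the output should include "inventory-connector"
curl -s http://localhost:8083/connectors

# Inspect the connector's current status and its tasks
curl -s http://localhost:8083/connectors/inventory-connector/status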

Robin Moffatt