How to use Kafka with Docker and Docker Compose

Updated: February 3, 2024 By: Guest Contributor

Introduction

Apache Kafka is a distributed streaming platform that allows for high-throughput, fault-tolerant messaging between servers and applications. With the rise of containerization, running Kafka with Docker and Docker Compose has become a popular and efficient method for deploying Kafka clusters. In this tutorial, we will go step-by-step to set up Kafka alongside Zookeeper using Docker and Docker Compose, and then we’ll explore how to interact with your Kafka cluster.

Prerequisites

  • Docker installed on your machine.
  • Docker Compose installed on your machine.
  • Basic understanding of Kafka concepts.
  • Basic familiarity with Docker and Docker Compose.

Setting Up Kafka and Zookeeper with Docker Compose

To get started, you’ll need to define the services that will run your Kafka cluster and Zookeeper in a Docker Compose file. Below is a simple Compose file configuration to begin with:

version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - '2181:2181'

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      # Expose the PLAINTEXT_HOST listener (advertised as localhost:29092)
      # so that clients on the host machine can connect
      - '29092:29092'
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

Save this file as docker-compose.yml. Note that other containers on the Compose network reach the broker at kafka:9092, while clients on your host machine use the PLAINTEXT_HOST listener advertised as localhost:29092. Start your Kafka and Zookeeper containers by running:

docker-compose up -d
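Before moving on, it is worth checking that both containers came up cleanly. A quick way to do that (using the service names from the Compose file above) is:

```shell
# Show the state of the services defined in docker-compose.yml
docker-compose ps

# Tail the broker's log; once Kafka has finished starting you will see
# a "started" line from the Kafka server (exact wording varies by version)
docker-compose logs --tail 50 kafka
```

If the kafka container keeps restarting, its log usually shows why — most often it could not reach Zookeeper yet.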

Creating a Kafka Topic

Once your containers are up, you can create a Kafka topic. Kafka ships with a set of command-line tools, which you can run inside the broker container. Execute the following command to create a topic named ‘test-topic’:

docker-compose exec kafka kafka-topics --create --topic test-topic --partitions 1 --replication-factor 1 --if-not-exists --bootstrap-server localhost:9092

On Kafka versions before 3.0, the topic tools connected to Zookeeper instead, via ‘--zookeeper zookeeper:2181’ in place of ‘--bootstrap-server’; that flag has since been removed, so with a recent image use the form above.
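To confirm the topic was created and to inspect its partition layout, the same kafka-topics tool can list and describe topics:

```shell
# List all topics known to the broker
docker-compose exec kafka kafka-topics --list --bootstrap-server localhost:9092

# Show partition count, replication factor, and partition leaders for the topic
docker-compose exec kafka kafka-topics --describe --topic test-topic --bootstrap-server localhost:9092
```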

Producing Messages to a Kafka Topic

You can publish messages to a topic with the Kafka console producer. Run the command below, then type a message such as ‘Hello, Kafka!’ and press Enter; each line you type is sent as a separate message. Press Ctrl+C to exit.

docker-compose exec kafka kafka-console-producer --topic test-topic --bootstrap-server localhost:9092
> Hello, Kafka!
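The console producer above is interactive. For scripting, you can pipe a message in instead; note the -T flag, which tells docker-compose exec not to allocate a pseudo-terminal so that stdin can be piped through:

```shell
# Send a single message non-interactively
echo "Hello again, Kafka!" | docker-compose exec -T kafka kafka-console-producer --topic test-topic --bootstrap-server localhost:9092
```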

Consuming Messages from a Kafka Topic

To consume messages from the topic, you can use the Kafka console consumer. Run the following command:

docker-compose exec kafka kafka-console-consumer --topic test-topic --from-beginning --bootstrap-server localhost:9092

This command will output the messages sent to ‘test-topic’, including the message we sent earlier:

Hello, Kafka!
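The console consumer accepts further options. For example, you can join a named consumer group so that offsets are tracked for it, and cap how many messages are read before exiting (the group name ‘my-group’ here is just an example):

```shell
# Consume as part of a consumer group and stop after 10 messages
docker-compose exec kafka kafka-console-consumer --topic test-topic --from-beginning --bootstrap-server localhost:9092 --group my-group --max-messages 10
```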

Advanced Configurations

For more complex requirements, you can customize your Docker Compose setup with additional configurations such as:

  • Setting up multiple Kafka brokers.
  • Configuring Kafka for SSL/TLS.
  • Adding environment variables for better performance tuning.

Scaling Up Kafka Brokers

To scale your Kafka cluster, you can add more broker services to the Docker Compose file. Each broker needs a unique ‘KAFKA_BROKER_ID’, its own host port mapping, and its own ‘KAFKA_ADVERTISED_LISTENERS’ entry:

services:
  kafka:
    image: confluentinc/cp-kafka:latest
    # ... configuration as shown earlier
  kafka2:
    image: confluentinc/cp-kafka:latest
    # ... rest of the configuration similar to ‘kafka’
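As a fuller sketch of what ‘kafka2’ might look like (the broker ID 2 and host port 29093 are illustrative choices, not values prescribed by the setup above):

```yaml
  kafka2:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - '29093:29093'
    environment:
      KAFKA_BROKER_ID: 2   # must be unique per broker
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka2:9092,PLAINTEXT_HOST://localhost:29093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

With two brokers running, replication factors of up to 2 become possible when creating topics.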

Enabling SSL/TLS

You would have to create certificates, configure Kafka to use SSL/TLS for its listeners, and then map your SSL directory to the broker:

services:
  kafka:
    image: confluentinc/cp-kafka:latest
    volumes:
      - ./ssl:/etc/kafka/ssl
    environment:
      ... # Include SSL configurations
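With the Confluent images, the broker’s SSL settings are supplied as environment variables. A sketch of what those might look like (the keystore and credential file names are placeholders you would generate yourself; note that cp-kafka looks for the *_FILENAME files under /etc/kafka/secrets by default, so mount your certificates there):

```yaml
    environment:
      KAFKA_ADVERTISED_LISTENERS: SSL://kafka:9093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: SSL:SSL
      KAFKA_INTER_BROKER_LISTENER_NAME: SSL
      KAFKA_SSL_KEYSTORE_FILENAME: kafka.keystore.jks       # placeholder
      KAFKA_SSL_KEYSTORE_CREDENTIALS: keystore_creds        # file holding the keystore password
      KAFKA_SSL_KEY_CREDENTIALS: key_creds                  # file holding the key password
      KAFKA_SSL_TRUSTSTORE_FILENAME: kafka.truststore.jks   # placeholder
      KAFKA_SSL_TRUSTSTORE_CREDENTIALS: truststore_creds    # file holding the truststore password
```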

Managing Kafka Topics and Configurations

Kafka also provides tools to alter topics and change their configurations, which can be run through the containers. For example, to increase the partition count of ‘test-topic’ to 3 (partitions can be added but never removed):

docker-compose exec kafka kafka-topics --alter --topic test-topic --partitions 3 --bootstrap-server localhost:9092
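Topic-level configuration (retention, cleanup policy, and so on) is changed with the kafka-configs tool rather than kafka-topics. For example, to set a one-day retention on ‘test-topic’ and then read it back:

```shell
# Set per-topic retention to one day (86,400,000 ms)
docker-compose exec kafka kafka-configs --alter --entity-type topics --entity-name test-topic --add-config retention.ms=86400000 --bootstrap-server localhost:9092

# Show the topic's non-default configuration
docker-compose exec kafka kafka-configs --describe --entity-type topics --entity-name test-topic --bootstrap-server localhost:9092
```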

Monitoring and Logging

Docker Compose can facilitate monitoring and logging by integrating with tools like Prometheus and the ELK stack. You can also view a container’s logs with:

docker-compose logs kafka

Conclusion

In this tutorial, we’ve explored the basics of running Kafka within Docker containers managed by Docker Compose. Starting from a minimal setup, we expanded into more advanced concepts, enabling a robust development and testing environment for Kafka applications. Remember that moving to production will require more extensive configuration, especially for durability and security considerations.
