
How to use Kafka with Docker and Docker Compose

Last updated: February 03, 2024

Introduction

Apache Kafka is a distributed streaming platform that allows for high-throughput, fault-tolerant messaging between servers and applications. With the rise of containerization, running Kafka with Docker and Docker Compose has become a popular and efficient method for deploying Kafka clusters. In this tutorial, we will go step-by-step to set up Kafka alongside Zookeeper using Docker and Docker Compose, and then we’ll explore how to interact with your Kafka cluster.

Prerequisites

  • Docker installed on your machine.
  • Docker Compose installed on your machine.
  • Basic understanding of Kafka concepts.
  • Basic familiarity with Docker and Docker Compose.

Setting Up Kafka and Zookeeper with Docker Compose

To get started, you’ll need to define the services that will run your Kafka cluster and Zookeeper in a Docker Compose file. Below is a simple Compose file configuration to begin with:

version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - '2181:2181'

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - '29092:29092'
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

The ‘KAFKA_ADVERTISED_LISTENERS’ setting defines two listeners: PLAINTEXT://kafka:9092 for clients running inside the Compose network (which address the broker by its service name) and PLAINTEXT_HOST://localhost:29092 for clients on your host machine. Save this file as docker-compose.yml, and then you can start your Kafka and Zookeeper containers by running:

docker-compose up -d
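Before moving on, you can verify that both containers started correctly (this is a plain Docker Compose command, nothing Kafka-specific):

```shell
# Both the zookeeper and kafka services should be listed with state 'Up'
docker-compose ps
```

If a container keeps restarting, its logs (covered later in this tutorial) are the first place to look.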

Creating a Kafka Topic

Once your containers are up, you can create a Kafka topic. Kafka ships with a set of command-line tools, which you can run inside the broker container. Execute the following command to create a topic named ‘test-topic’:

docker-compose exec kafka kafka-topics --create --topic test-topic --partitions 1 --replication-factor 1 --if-not-exists --bootstrap-server localhost:9092

Note: on Kafka versions before 3.0, the tooling connected through Zookeeper instead, using --zookeeper zookeeper:2181 in place of --bootstrap-server; that flag has since been removed, so prefer --bootstrap-server on current images.
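To confirm that the topic was created with the expected settings, you can describe it (this assumes the ‘test-topic’ created above):

```shell
# Prints the topic's partition count, replication factor, and leader assignments
docker-compose exec kafka kafka-topics --describe --topic test-topic --bootstrap-server localhost:9092
```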

Producing Messages to a Kafka Topic

You can publish messages to a topic using the Kafka console producer, which reads messages from standard input, one per line. Run the command below, then type ‘Hello, Kafka!’ and press Enter to send it (press Ctrl+C to exit the producer).

docker-compose exec kafka kafka-console-producer --topic test-topic --bootstrap-server localhost:9092
> Hello, Kafka!
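The console producer can also send keyed messages, which matters when you care about which partition a message lands on. This sketch uses the producer's parse.key and key.separator properties:

```shell
# Each input line is split on the first ':' into key and value,
# e.g. typing 'user1:Hello' sends key 'user1' with value 'Hello'
docker-compose exec kafka kafka-console-producer --topic test-topic \
  --bootstrap-server localhost:9092 \
  --property parse.key=true \
  --property key.separator=:
```

Messages with the same key are always routed to the same partition, which preserves their ordering.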

Consuming Messages from a Kafka Topic

To consume messages from the topic, you can use the Kafka console consumer. Run the following command:

docker-compose exec kafka kafka-console-consumer --topic test-topic --from-beginning --bootstrap-server localhost:9092

This command will output the messages sent to ‘test-topic’, including the message we sent earlier:

Hello, Kafka!
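In real applications, consumers usually run as part of a consumer group so Kafka can balance partitions across group members. You can pass a group id to the console consumer as well (the group name ‘my-test-group’ here is an arbitrary example):

```shell
# Consumers sharing the same --group value split the topic's partitions between them
docker-compose exec kafka kafka-console-consumer --topic test-topic \
  --bootstrap-server localhost:9092 \
  --group my-test-group
```

Since ‘test-topic’ has only one partition, a second consumer in the same group would sit idle until more partitions are added.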

Advanced Configurations

For more complex requirements, you can customize your Docker Compose setup with additional configurations such as:

  • Setting up multiple Kafka brokers.
  • Configuring Kafka for SSL/TLS.
  • Adding environment variables for better performance tuning.

Scaling Up Kafka Brokers

To scale your Kafka cluster, you can add more broker services to the Docker Compose file; each broker needs a unique ‘KAFKA_BROKER_ID’, its own published port, and its own ‘KAFKA_ADVERTISED_LISTENERS’ configuration:

services:
  kafka:
    image: confluentinc/cp-kafka:latest
    ......
    ......
  kafka2:
    image: confluentinc/cp-kafka:latest
    ...... # Rest of the configurations similar to ‘kafka’
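As a concrete sketch, a second broker might look like the following — the service name kafka2, broker id 2, and host port 29093 are illustrative choices, not values from the configuration above:

```yaml
  kafka2:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - '29093:29093'
    environment:
      KAFKA_BROKER_ID: 2  # must be unique per broker
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Internal clients use kafka2:9092; host clients use localhost:29093
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka2:9092,PLAINTEXT_HOST://localhost:29093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

With two brokers running, you can create topics with --replication-factor 2 so that each partition has a backup copy.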

Enabling SSL/TLS

You would have to create certificates, configure Kafka to use SSL/TLS for its listeners, and then map your SSL directory to the broker:

services:
  kafka:
    image: confluentinc/cp-kafka:latest
    volumes:
      - ./ssl:/etc/kafka/ssl
    environment:
      ... # Include SSL configurations
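The exact settings depend on how your keystores are generated. A minimal sketch using the Confluent image's SSL environment variables might look like the following — it assumes you have already created the keystore, truststore, and credential files under ./ssl (the filenames shown are placeholders), and note that the cp-kafka image expects these files under /etc/kafka/secrets:

```yaml
  kafka:
    image: confluentinc/cp-kafka:latest
    volumes:
      - ./ssl:/etc/kafka/secrets
    environment:
      KAFKA_ADVERTISED_LISTENERS: SSL://kafka:9093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: SSL:SSL
      KAFKA_INTER_BROKER_LISTENER_NAME: SSL
      # Keystore/truststore filenames, resolved relative to /etc/kafka/secrets
      KAFKA_SSL_KEYSTORE_FILENAME: kafka.keystore.jks
      KAFKA_SSL_KEYSTORE_CREDENTIALS: keystore_creds
      KAFKA_SSL_KEY_CREDENTIALS: key_creds
      KAFKA_SSL_TRUSTSTORE_FILENAME: kafka.truststore.jks
      KAFKA_SSL_TRUSTSTORE_CREDENTIALS: truststore_creds
```

The *_CREDENTIALS entries name small text files, each containing the corresponding password.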

Managing Kafka Topics and Configurations

Kafka provides scripts to alter topics and change their configurations, which can also be executed through the containers. For example, to increase ‘test-topic’ to three partitions (the partition count can only ever be increased, never decreased):

docker-compose exec kafka kafka-topics --alter --topic test-topic --partitions 3 --bootstrap-server localhost:9092
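Per-topic configuration overrides can be changed with the kafka-configs tool; as a hypothetical example, this sets a seven-day retention period on ‘test-topic’:

```shell
# retention.ms=604800000 is 7 days in milliseconds
docker-compose exec kafka kafka-configs --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name test-topic \
  --alter --add-config retention.ms=604800000
```

Replacing --alter --add-config with --describe shows the topic's current overrides.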

Monitoring and Logging

Docker Compose can facilitate monitoring and logging by integrating with tools such as Prometheus or the ELK stack. You can also view a container’s logs directly with:

docker-compose logs kafka

Conclusion

In this tutorial, we’ve explored the basics of running Kafka within Docker containers managed by Docker Compose. Starting from a minimal setup, we expanded into more advanced concepts, enabling a robust development and testing environment for Kafka applications. Remember that moving to production will require more extensive configuration, especially for durability and security considerations.
