How to Connect to Kafka Running in Docker From Outside Docker

Updated: January 31, 2024 By: Guest Contributor

Introduction

In the world of streaming data and microservices, Apache Kafka is a household name, known for its high throughput and scalability. Running Kafka in Docker containers eases development and deployment, but connecting to it from outside the Docker environment can be challenging, especially for developers new to Docker or Kafka. This tutorial will walk you through the process, ensuring a seamless connection from your applications to a Kafka cluster running inside Docker.

Prerequisites

  • Docker and Docker Compose installed on the host machine.
  • Basic understanding of Kafka and Docker networking.
  • An existing Docker network or knowledge of how to create one if needed.

Understanding Kafka and Docker Networking

Before we proceed, it’s essential to understand that Docker containers run in isolated networks. This can complicate accessing a Kafka broker from outside Docker, since the default Kafka listener binds only to the container’s network interface. You therefore need to configure Kafka with a listener that is reachable from the host machine and advertised with an address the host can resolve.

Step 1: Configuring Kafka Listeners and Advertised Listeners

Listeners are the network addresses the Kafka broker binds to. Advertised listeners are the addresses the broker hands back to clients, which clients then use for all subsequent connections — so even if a client can reach the broker initially, a wrong advertised address will break the connection. To connect from outside Docker, you need to set both properly in the Kafka configuration.

Example:

KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:29092
KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

Explanation:

  • PLAINTEXT_HOST defines a second listener dedicated to external connectivity.
  • Inside the container, the PLAINTEXT_HOST listener binds to all network interfaces on port 29092.
  • That listener is advertised to clients as localhost:29092, which resolves correctly from the host once the port is published.
  • Inter-broker communication and clients inside the Docker network continue to use the default PLAINTEXT listener, advertised as kafka:9092.
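For orientation, images that translate KAFKA_*-style environment variables into broker settings (wurstmeister/kafka and the Confluent images work this way) map the variables above to the following entries in the broker’s server.properties — shown here only as a reference sketch:

```properties
listener.security.protocol.map=PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
listeners=PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:29092
advertised.listeners=PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
inter.broker.listener.name=PLAINTEXT
offsets.topic.replication.factor=1
```

Knowing this mapping helps when debugging: the broker’s startup log prints the effective listeners, so you can verify the environment variables were picked up.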

Step 2: Configuring Docker

Once Kafka is correctly configured, update your Docker configuration to use this setup. If using Docker Compose, update the 'ports' and 'networks' sections for your Kafka service.

Example:

services:
  kafka:
    image: wurstmeister/kafka
    ports:
      - "29092:29092"
    networks:
      - kafka-net
    environment:
      - ... [OTHER KAFKA ENVIRONMENT VARIABLES] ...

Explanation:

  • We publish the external listener port 29092 to the same port on the host, making it reachable at localhost:29092.
  • The container joins a network named kafka-net, which should be defined elsewhere in your Compose file.
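Putting steps 1 and 2 together, a complete Compose file might look like the sketch below. It assumes the wurstmeister/kafka image, which also requires a ZooKeeper service — the zookeeper service name and wurstmeister/zookeeper image here are illustrative choices, not the only option:

```yaml
version: '3'

services:
  zookeeper:
    image: wurstmeister/zookeeper   # assumption: wurstmeister builds need ZooKeeper
    networks:
      - kafka-net

  kafka:
    image: wurstmeister/kafka
    ports:
      - "29092:29092"               # publish only the external listener
    networks:
      - kafka-net
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:29092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

networks:
  kafka-net:
    driver: bridge
```

Note that port 9092 is deliberately not published: it is only advertised as kafka:9092, a hostname that resolves solely inside the kafka-net network.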

Step 3: Connecting To Kafka From Outside Docker

To connect an external application to this Kafka broker, point your client configurations to localhost:29092.

Example:

import java.util.Properties;

Properties properties = new Properties();
properties.put("bootstrap.servers", "localhost:29092");
properties.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
properties.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

Your application now successfully connects to Kafka running in a Docker container from outside the Docker environment.
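Before wiring up application code, you can sanity-check the connection from the host with Kafka’s command-line tools — a sketch assuming the Kafka CLI scripts are installed on the host machine (on some distributions they are named without the .sh suffix):

```shell
# From the host (outside Docker): list topics via the published port.
kafka-topics.sh --bootstrap-server localhost:29092 --list

# Produce a test message, then read it back (Ctrl+C stops the consumer).
echo "hello from the host" | kafka-console-producer.sh \
  --bootstrap-server localhost:29092 --topic test
kafka-console-consumer.sh --bootstrap-server localhost:29092 \
  --topic test --from-beginning
```

If the topic list command succeeds but producing hangs, the usual culprit is a wrong advertised listener: the client reached the broker, but the broker advertised an address the host cannot resolve.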

Additional Considerations

  • If running multiple Kafka brokers, ensure that each has a unique port binding in both the broker configuration and the Docker configuration.
  • Be aware of Docker networking modes and how they might affect connectivity.
  • For production environments, consider securing your Kafka listeners using SASL/SSL.

Conclusion

By understanding Kafka’s listener configurations and Docker networking, you can enable external applications to integrate with Kafka running inside Docker. This allows for a flexible development environment and aligns with modern application deployment practices.

With this guide, you should now have a working Kafka setup in Docker accessible from external applications. Developing microservices and data streaming applications inevitably requires a firm grasp of such connectivity nuances. Master these steps, and your development journey with Kafka in Docker will become significantly smoother.