How to Enable SASL Authentication in Kafka

Last updated: January 30, 2024

Overview

Apache Kafka is a distributed streaming platform used widely across industries for building real-time data pipelines and streaming applications. Given the importance of data security, Kafka supports various authentication mechanisms, and one of those is SASL (Simple Authentication and Security Layer). Enabling SASL authentication adds a layer of security, ensuring that only authenticated users or systems can publish or consume messages. In this tutorial, we will walk through how to enable and configure SASL authentication in Kafka with step-by-step examples.

Understanding Kafka SASL Authentication

SASL is a framework that provides authentication and optional data security over network protocols. In Kafka, SASL support is flexible, with multiple mechanisms such as GSSAPI (Kerberos), PLAIN, SCRAM, and OAUTHBEARER (OAuth 2.0 bearer tokens). For this tutorial, we will focus on setting up the SASL/PLAIN mechanism, which is straightforward and commonly used for internal authentication purposes.

Prerequisites

  • Apache Kafka installation
  • Java Runtime Environment (JRE) or Java Development Kit (JDK)
  • Basic understanding of Kafka’s architecture and operation

Step-by-Step Instructions

Step 1: Configure Kafka for SASL/PLAIN

First, you need to configure the Kafka broker to use SASL/PLAIN. Modify your Kafka broker’s configuration file (server.properties) to include the following lines:

listeners=SASL_PLAINTEXT://:9092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN

The above settings enable SASL on the listener and select PLAIN as the mechanism. Note that we are using ‘SASL_PLAINTEXT’ for the listener, which authenticates clients but does not encrypt traffic. For production systems, it is recommended to use ‘SASL_SSL’ for encrypted communication.
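For reference, a production-oriented variant of the same listener section might look like the sketch below. The keystore and truststore paths and passwords are placeholders, and creating those stores is beyond the scope of this tutorial.

listeners=SASL_SSL://:9093
security.inter.broker.protocol=SASL_SSL
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN
ssl.keystore.location=/path/to/kafka.server.keystore.jks
ssl.keystore.password=keystore-password
ssl.key.password=key-password
ssl.truststore.location=/path/to/kafka.server.truststore.jks
ssl.truststore.password=truststore-password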

Step 2: Create a JAAS Configuration File

You will need a JAAS (Java Authentication and Authorization Service) configuration file to specify the credentials used by clients and brokers. Here is a simple example of what that file might look like:

KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="admin-secret"
  user_admin="admin-secret"
  user_alice="alice-secret";
};

KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="alice"
  password="alice-secret";
};

Save this content as kafka_server_jaas.conf and reference it in your Kafka start-up by setting the KAFKA_OPTS environment variable:

export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/your/kafka_server_jaas.conf"

Each user_<name>="<password>" entry in the KafkaServer section defines a user that can authenticate with the designated password. The username and password at the top of that section are the broker’s own credentials, used here to authenticate inter-broker communication as the admin user. The KafkaClient section supplies credentials for clients that load this JAAS file instead of setting sasl.jaas.config directly (as shown later).
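As an alternative to a separate JAAS file, recent Kafka versions also accept the same login module definition directly in server.properties via a listener-prefixed property. A sketch for the setup above (same users and passwords) might look like this; note that the property name embeds the listener name and mechanism in lowercase:

listener.name.sasl_plaintext.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="admin" \
  password="admin-secret" \
  user_admin="admin-secret" \
  user_alice="alice-secret";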

Step 3: Start Kafka with SASL Authentication

With your settings updated and the JAAS file in place, you can now start the Kafka server with SASL authentication enabled. Make sure KAFKA_OPTS is exported with the correct path to the JAAS configuration file, then start Kafka normally using the start-up script:

bin/kafka-server-start.sh config/server.properties

If everything is set up correctly, your Kafka broker should start up with SASL/PLAIN authentication enabled on its listener.
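If you prefer not to export the variable globally, it can also be set inline for a single run:

KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/your/kafka_server_jaas.conf" \
  bin/kafka-server-start.sh config/server.properties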

Step 4: Configure Kafka Clients for SASL

Just as the brokers need to be configured for SASL, your clients (producers and consumers) also need to be set up. The client configuration must specify the security protocol, the SASL mechanism, and the login credentials, either inline via the sasl.jaas.config property or through a JAAS file. Here’s an example of how to configure a Kafka producer in Java:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.common.serialization.StringSerializer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("security.protocol", "SASL_PLAINTEXT");
props.put("sasl.mechanism", "PLAIN");
props.put("key.serializer", StringSerializer.class.getName());
props.put("value.serializer", StringSerializer.class.getName());

// Client JAAS login: these credentials must match an entry in the broker's JAAS configuration
props.put("sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"alice\" password=\"alice-secret\";");

// Create the producer
KafkaProducer<String, String> producer = new KafkaProducer<>(props);

It’s important to ensure that the username and password are consistent with the credentials set in the JAAS configuration used by the Kafka brokers.
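From here the producer behaves like any other Kafka producer. For example, you can send a single record to the test topic used later in this tutorial and then close the producer (this also assumes org.apache.kafka.clients.producer.ProducerRecord is imported):

producer.send(new ProducerRecord<>("test", "key", "hello from alice"));
producer.close();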

Testing the Configuration

To verify that SASL authentication is working, you can run simple producer and consumer clients and observe that they are able to exchange messages. Ensure that there are no security-related errors in the broker or client logs, which could indicate misconfiguration.

./bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test --producer.config client-sasl.properties
./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning --consumer.config client-sasl.properties
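Both commands reference a client-sasl.properties file that holds the client-side SASL settings. A minimal version, reusing the alice credentials defined earlier in the JAAS file, might look like this:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="alice" password="alice-secret";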

If either client isn’t properly authenticated, the operation will fail, and the broker will log an error with details that can help you diagnose the problem.

Advanced Configuration

Once you have basic SASL authentication working, you may want to explore more sophisticated setups. For example, you might want to:

  • Use SASL/SSL for encrypted client-broker communication.
  • Integrate with a real authentication server for user management, such as an LDAP server.
  • Use Kafka’s ACL-based authorization for fine-grained access control on topics and other resources (a brief sketch follows below).

Each of these steps involves expanding your Kafka and client configuration to support additional features and improve overall system security.
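As a taste of the last item, here is a minimal sketch of ACL-based authorization for a ZooKeeper-based cluster (the authorizer class name differs on KRaft clusters). Keeping admin as a super user ensures that inter-broker traffic is never blocked. First, add to server.properties:

authorizer.class.name=kafka.security.authorizer.AclAuthorizer
super.users=User:admin

Then grant alice read and write access to the test topic with the kafka-acls.sh tool, authenticating as admin (admin-sasl.properties is a hypothetical file analogous to client-sasl.properties but with the admin credentials):

bin/kafka-acls.sh --bootstrap-server localhost:9092 --command-config admin-sasl.properties \
  --add --allow-principal User:alice --operation Read --operation Write --topic test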

Conclusion

Enabling SASL authentication in Kafka is an essential step in securing your message streaming platform. By following the steps covered in this tutorial, you can ensure that only authenticated users and services can access your Kafka ecosystem. Remember that the SASL/PLAIN mechanism should ideally be combined with TLS/SSL for encrypted communication in a production environment.
