Securely connecting to Kafka through WSO2 Streaming Integrator

Shenavi de Mel
6 min read · Jun 30, 2021

Overview

With the increasing popularity of event streaming in many organisations, use cases built around the Kafka platform are gaining immense significance. Kafka is an open-source event streaming platform; when using it at a commercial level, it is essential to secure the communication between the clients and the Kafka platform.

In this blog I will walk you through two ways in which you can secure Kafka, and how you can publish to and consume from Kafka topics in a secure manner using the WSO2 Streaming Integrator (SI), a streaming data processing server that integrates streaming data and takes action based on it.

Let’s Get Started

Before proceeding with this blog, it is advisable to follow this tutorial in order to get a better understanding of how the Streaming Integrator works together with Kafka.

By default, Kafka is installed without any authentication. However, Kafka supports several authentication protocols to establish a more secure connection with the brokers. Below are some examples of supported authentication mechanisms; you can read more about them here.

  1. Authentication using SSL

  2. Authentication with SASL (Simple Authentication and Security Layer) using JAAS

  • PLAINTEXT — username and password authentication over an unencrypted connection.
  • PLAIN — username and password authentication over an encrypted connection.
  • GSSAPI (Kerberos) — for organisations using Kerberos.
  • LDAP

Authentication using SSL

I will first walk you through how the Streaming Integrator can communicate with a Kafka broker that is secured through SSL.

A step-by-step guide on how to set up both Kafka with SSL and the Streaming Integrator to test this sample can be found here.

Method

You can follow this document to configure and enable SSL authentication for the Kafka broker. Here you need to create a truststore and a keystore for the Kafka broker.

Once you have set this up, the Kafka_Home/config/server.properties file will have a configuration block similar to the one below.

server.properties file configuration for enabling authentication using SSL on Kafka
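Since the screenshot does not carry over here, the SSL-relevant portion of server.properties generally looks something like the following sketch (the listener ports and paths are placeholders, not the exact values from my setup):

```properties
# SSL listener alongside the default PLAINTEXT listener (ports are examples).
listeners=PLAINTEXT://localhost:9092,SSL://localhost:9093
# Broker keystore and truststore created in the previous step (paths are placeholders).
ssl.keystore.location=/path/to/kafka.server.keystore.jks
ssl.keystore.password=Admin1234
ssl.key.password=Admin1234
ssl.truststore.location=/path/to/kafka.server.truststore.jks
ssl.truststore.password=Admin1234
# Require clients to present a trusted certificate (mutual TLS).
ssl.client.auth=required
```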

Here, the WSO2 Streaming Integrator will act as the client. Therefore, we will need to generate a client truststore and a client keystore as well.

Once they are generated, we need to exchange the certificates so that the secured connection can be successfully established during the SSL handshake.

In Kafka there are two ways in which you can enable the Kafka broker to trust the connecting clients.

Option 1 — Export the Kafka broker certificate’s public key and the client certificate’s public key, and import each into the other’s truststore. For each new client connecting to the Kafka broker, you will need to update the truststore file of the Kafka server.

Option 2 — You can sign all the client certificates with the same CA certificate and add the root CA certificate to the Kafka truststore. With this option each time a new client is introduced you won’t need to update the truststore on the Kafka server.

I have used option 1 to configure the trusted connection. Therefore below are the additional steps which I performed.

  1. Imported the public certificate of the client keystore into the Kafka broker truststore.
  2. Imported the public certificate of the Kafka keystore into the client’s truststore.

Now Kafka is ready to establish secure connections with clients. Next, we are going to create a Siddhi application in the Streaming Integrator to publish messages securely to a Kafka topic.

These are the configuration parameters which Kafka expects from the clients to establish a secure connection.

security.protocol=SSL
ssl.truststore.location=/home/shenavi/Documents/mac/PRE_SALES/Demos/SSL/keys/kafka.client.truststore.jks
ssl.truststore.password=Admin1234
ssl.keystore.location=/home/shenavi/Documents/mac/PRE_SALES/Demos/SSL/keys/kafka.client.keystore.jks
ssl.keystore.password=Admin1234
ssl.key.password=Admin1234

The configuration change we need to make is as follows: when defining the Kafka source, we need to include the security information under the “optional.configuration” attribute. This is what will be included in the optional configuration.

optional.configuration="""security.protocol:SSL,ssl.truststore.location:/Users/shenavidemel/Documents/SA/Demos/SSL/keys/kafka.client.truststore.jks,ssl.truststore.password:Admin1234,ssl.keystore.location:/Users/shenavidemel/Documents/SA/Demos//SSL/keys/kafka.client.keystore.jks,ssl.keystore.password:Admin1234,ssl.key.password:Admin1234"""

Below is what the Siddhi application’s content would look like.

Siddhi application to communicate with Kafka over SSL.
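As the screenshot does not reproduce here, below is a minimal sketch of what such a Siddhi application could look like, assuming a productions topic with a JSON payload; the stream names, attributes, port, and paths are illustrative, not the exact values from my setup:

```siddhi
@App:name('KafkaSSLSample')
@App:description('Consumes from the productions topic over SSL. Hypothetical sketch.')

-- Stream attributes here are illustrative; adjust to your topic's JSON payload.
@source(type='kafka',
        topic.list='productions',
        partition.no.list='0',
        threading.option='single.thread',
        group.id='production.group',
        bootstrap.servers='localhost:9093',
        optional.configuration="""security.protocol:SSL,ssl.truststore.location:/path/to/kafka.client.truststore.jks,ssl.truststore.password:Admin1234,ssl.keystore.location:/path/to/kafka.client.keystore.jks,ssl.keystore.password:Admin1234,ssl.key.password:Admin1234""",
        @map(type='json'))
define stream ProductionStream (name string, amount double);

@sink(type='log')
define stream LogStream (name string, amount double);

from ProductionStream
select *
insert into LogStream;
```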

You can deploy the above application in the Streaming Integrator and consume messages from the productions Kafka topic in a secure manner over SSL.

Authentication using SASL PLAINTEXT

In this section, we will discuss how to authenticate with the Kafka broker using SASL PLAINTEXT. SASL PLAINTEXT is one of the simplest methods of configuring some level of authentication on the Kafka server. In other words, this can be referred to as username/password based authentication.

A step-by-step guide on how to set up both Kafka and the Streaming Integrator to test this sample can be found here.

Method

You can configure SASL PLAINTEXT for your Kafka broker by following the instructions in this document.

As part of the configuration, you need to create JAAS configs for the Kafka server and the Zookeeper server.

This is the Kafka server’s JAAS config. Here the user we have defined is “admin” and the password is “12345”. You can define as many users as you like in the format: user_<username>="<password>"

KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  serviceName="kafka"
  username="admin"
  password="12345"
  user_admin="12345";
};

Client {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="12345";
};

Below is the JAAS configuration needed for the Zookeeper server.

Server {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="admin"
  password="12345"
  user_admin="12345";
};

Once we have created our JAAS configurations, we need to configure the Kafka server to work with SASL PLAINTEXT authentication. Here is a portion from the Kafka server’s server.properties file with the necessary changes.

server.properties file configuration for enabling authentication using SASL_PLAINTEXT on Kafka
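The screenshot again does not carry over; the SASL_PLAINTEXT-relevant part of server.properties generally looks like this sketch (the listener host and port are examples, not my exact values):

```properties
# Accept SASL/PLAIN connections over an unencrypted listener (example host/port).
listeners=SASL_PLAINTEXT://localhost:9092
advertised.listeners=SASL_PLAINTEXT://localhost:9092
# Use SASL/PLAIN for inter-broker traffic as well.
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN
```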

Make sure to export the respective JAAS configurations as KAFKA_OPTS before starting the Kafka broker and the Zookeeper instance. For example, before starting the Kafka broker, make sure the JAAS config has been set as shown below.

eg: export KAFKA_OPTS="-Djava.security.auth.login.config=/Users/shenavidemel/Demos/KAFKA/kafka_2.12-2.3.0/config/kafka_server_jaas.conf"

Now Kafka is ready to establish secure connections with clients using the username and password. Next, we can create a Siddhi application in the Streaming Integrator to publish and consume messages securely with Kafka topics.

These are the configuration parameters which Kafka expects from clients:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="12345";

Therefore, the configuration change we need to make is as follows: when defining the Kafka source, we need to include the above information under the “optional.configuration” attribute. This is what will be included in the optional configuration in this case.

optional.configuration="""security.protocol:SASL_PLAINTEXT,sasl.kerberos.service.name:kafka,sasl.mechanism:PLAIN,sasl.jaas.config:org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="12345";""",

Below is what the Siddhi application’s content would look like.

Siddhi application to communicate with Kafka over SASL PLAINTEXT.
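Since the screenshot does not reproduce here, below is a minimal sketch of such a Siddhi application, assuming the sales and bulk-orders topics carry JSON messages; the stream names, attributes, and the filter condition are illustrative:

```siddhi
@App:name('KafkaSASLSample')
@App:description('Consumes from sales and publishes to bulk-orders over SASL PLAINTEXT. Hypothetical sketch.')

-- Stream attributes and the filter condition are illustrative.
@source(type='kafka',
        topic.list='sales',
        partition.no.list='0',
        threading.option='single.thread',
        group.id='sales.group',
        bootstrap.servers='localhost:9092',
        optional.configuration="""security.protocol:SASL_PLAINTEXT,sasl.mechanism:PLAIN,sasl.jaas.config:org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="12345";""",
        @map(type='json'))
define stream SalesStream (name string, amount int);

@sink(type='kafka',
      topic='bulk-orders',
      bootstrap.servers='localhost:9092',
      optional.configuration="""security.protocol:SASL_PLAINTEXT,sasl.mechanism:PLAIN,sasl.jaas.config:org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="12345";""",
      @map(type='json'))
define stream BulkOrderStream (name string, amount int);

-- Forward only large sales to the bulk-orders topic.
from SalesStream[amount > 100]
select name, amount
insert into BulkOrderStream;
```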

You can deploy the above application in the Streaming Integrator, consume messages from the sales topic, and publish messages to the bulk-orders topic in a secure manner using the username and password.

Similarly, you can enable authentication for Kafka and connect to it securely using the WSO2 Streaming Integrator by passing the security configuration in the optional.configuration section of your Siddhi applications.

References

https://ei.docs.wso2.com/en/7.2.0/streaming-integrator/examples/working-with-kafka/

