Apache NiFi is a popular open source visual ETL (extract, transform, load) tool that can consume event data from, and publish it to, many destinations, including Apache Kafka.
Once data is ingested, Apache NiFi can route, filter, enrich, and transform the payload. Apache Kafka is a distributed publish-subscribe messaging system that serves many use cases thanks to its high throughput, replication, and fault tolerance. It is a very reliable way of getting streaming event data into and out of your NiFi flows.
IBM Event Streams is a secure, fully managed Kafka-as-a-Service offering. It uses SASL_SSL to securely send and receive data, which makes connecting your NiFi Kafka processors to it less straightforward than connecting to an unsecured Kafka cluster.
This guide helps you connect Apache NiFi to a hosted Apache Kafka offering, such as IBM Event Streams. It assumes that you have already set up an Apache NiFi cluster or standalone instance and provisioned IBM Event Streams.
1. Create IBM Event Streams credentials
With an instance of IBM Event Streams created, you need to create connection credentials, either through the IBM Cloud console UI or the IBM Cloud CLI. These credentials provide the username, password, and list of Kafka brokers used to connect to IBM Event Streams. To create credentials from the IBM Cloud console, navigate to the Service Details page of your Event Streams instance. This is the page that you are redirected to upon creating the instance.
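If you prefer the CLI, a minimal sketch of creating and reading a service key is shown below. The instance and key names are placeholders; the command prints the credentials JSON, including the broker list and the username/password pair used later in this guide.

```sh
# Sketch, assuming the IBM Cloud CLI is installed and you are logged in.
# "my-event-streams" and "nifi-credentials" are placeholder names.
ibmcloud resource service-key-create nifi-credentials Manager \
  --instance-name my-event-streams

# Retrieve the credentials later (prints the brokers, user, and password):
ibmcloud resource service-key nifi-credentials
```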
2. Add certificates to the Apache NiFi trust store
To connect to Kafka over SASL_SSL, Apache NiFi requires a Truststore that includes the IBM Event Streams certificates. NiFi uses the Truststore to verify the SSL connection. While not required to communicate with IBM Event Streams, it is also good practice to create a Keystore to secure Apache NiFi and enable authentication of users to the UI; a Keystore is also required to enable communication between the nodes of an Apache NiFi cluster.
Apache NiFi includes the TLS Toolkit utility to help generate the Keystore and Truststore. The following command can be used for a standalone server with hostnames matching *.nifi.svc.cluster.local and your desired Truststore and Keystore passwords.
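A minimal sketch of the toolkit invocation, assuming tls-toolkit.sh is on your path; the passwords and output directory are placeholders to replace with your own:

```sh
# Generate keystore.jks and truststore.jks for hosts matching
# *.nifi.svc.cluster.local; substitute your own passwords.
tls-toolkit.sh standalone \
  -n '*.nifi.svc.cluster.local' \
  -S <keystore-password> \
  -P <truststore-password> \
  -o ./nifi-certs
```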
The openssl and keytool command-line utilities can be used to download the Event Streams certificate and then import it into truststore.jks. The hostname for Event Streams can be obtained from the credentials created in Step 1. The <<nifi service – namespace>> is what was specified when deploying Apache NiFi. Be sure to use the Truststore password that you passed to the tls-toolkit command when setting up your stores.
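For example, the sketch below fetches the certificate presented by a broker and imports it; the broker hostname and port (typically 9093 for Event Streams) come from the broker list in Step 1:

```sh
# Download the PEM certificate presented by an Event Streams broker.
# <broker-hostname> is a placeholder from the Step 1 credentials.
openssl s_client -connect <broker-hostname>:9093 \
  -servername <broker-hostname> </dev/null 2>/dev/null \
  | openssl x509 -outform PEM > eventstreams.pem

# Import it into the Truststore generated by the TLS Toolkit.
keytool -importcert -trustcacerts -alias eventstreams \
  -file eventstreams.pem \
  -keystore truststore.jks \
  -storepass <truststore-password> -noprompt
```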
For deployment, a custom Docker image can be created from the base NiFi Docker image with the configured Keystore and Truststore.
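One possible sketch of such an image, assuming the stores were generated locally and using the official apache/nifi base image (the tag and paths may differ for your setup):

```dockerfile
# Hypothetical Dockerfile: bake the generated stores into the NiFi conf directory.
FROM apache/nifi:1.16.0
COPY --chown=nifi:nifi keystore.jks truststore.jks /opt/nifi/nifi-current/conf/
```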
3. Configure an Apache NiFi Kafka consumer or producer
Apache NiFi should now have what it needs to connect to IBM Event Streams. To start consuming or publishing events, add a ConsumeKafkaRecord or PublishKafkaRecord NiFi processor and configure the following properties (an illustrative set of values is sketched after the list).
- Enter the comma-separated list of Kafka Brokers from Step 1.
- Enter the Topic Name(s) for the topics that you want to consume from or publish to.
- Select and configure a Record Reader and Record Writer.
- Choose SASL_SSL as the Security Protocol.
- Enter the Username and Password retrieved from Step 1.
- Choose a Group ID.
- Create a new SSL Context Service.
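For illustration, a consumer configuration might look like the sketch below. The broker list, topic, reader/writer, and group name are placeholders; Event Streams credentials conventionally use token as the username with the API key as the password, but verify against your own credentials from Step 1:

```
ConsumeKafkaRecord – example property values (placeholders)
Kafka Brokers:     broker-0-<id>.kafka.svc01.us-south.eventstreams.cloud.ibm.com:9093,...
Topic Name(s):     my-topic
Record Reader:     JsonTreeReader
Record Writer:     JsonRecordSetWriter
Security Protocol: SASL_SSL
Username:          token
Password:          <api-key from Step 1>
Group ID:          nifi-consumer-group
```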
Once complete, click the arrow next to your SSL Context Service to go to the controllers page and configure the SSL Context Service.
4. Configure the connected SSLContextService
The SSL Context Service is what connects your Truststore and Keystore to Apache NiFi. Enter the configurations to specify where the Keystore and Truststore files are located, as well as their corresponding passwords. The example below shows one possible configuration of the SSLContextService:
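A sketch of plausible values, assuming the stores were baked into the image at the paths used earlier (adjust the paths and passwords for your deployment):

```
StandardSSLContextService – example property values (placeholders)
Keystore Filename:    /opt/nifi/nifi-current/conf/keystore.jks
Keystore Password:    <keystore-password>
Keystore Type:        JKS
Truststore Filename:  /opt/nifi/nifi-current/conf/truststore.jks
Truststore Password:  <truststore-password>
Truststore Type:      JKS
TLS Protocol:         TLS
```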
Once complete, enable your SSL Context Service and start the Kafka NiFi processor; you should now be able to connect to Event Streams and publish or consume event data in Apache NiFi.
Next steps
With Apache NiFi connected to IBM Event Streams, you can now concentrate on creating ETL processes for your Event Streams data. Once that is complete, you might consider storing your results in one of the many managed databases in IBM Cloud so they can be consumed by your application. Alternatively, you might send your data to IBM Streams or a cloud function to create a machine learning pipeline or other more computationally intensive processing.
Whatever you choose to do, keep in mind that by leveraging the many services within IBM Cloud, you can spend more time creating your application and less time managing Kafka or writing ETL processes.