Apache NiFi is a popular open-source visual ETL (extract, transform, load) tool that can be used to consume event data from, and publish it to, many destinations, including Apache Kafka.

Once data is ingested, Apache NiFi can be used to route, filter, enrich, and transform the payload. Apache Kafka is a distributed publish-subscribe messaging system that serves multiple use cases thanks to its high throughput, replication, and fault tolerance. It is a very reliable way of getting streaming event data into and out of your NiFi flows.

IBM Event Streams is a secure, fully managed Kafka-as-a-Service offering. It uses SASL_SSL to securely send and receive data, which makes connecting your NiFi Kafka processors to it less straightforward than connecting to an unsecured Kafka cluster.

This guide can be used to help connect Apache NiFi to a hosted Apache Kafka offering, such as IBM Event Streams. It assumes that you have already set up an Apache NiFi cluster or instance and provisioned IBM Event Streams.

1. Create IBM Event Streams credentials

With an instance of IBM Event Streams created, you will need to create connection credentials, either in the IBM Cloud console UI or with the IBM Cloud CLI. The credentials provide the user, password, and the list of Kafka brokers that are used to connect to IBM Event Streams. To create credentials from the IBM Cloud console, navigate to the Service Details page of your Event Streams instance (the page you are redirected to upon creating the instance).
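
If you prefer the command line, credentials can also be created with the IBM Cloud CLI (a sketch, assuming the CLI is installed and you are logged in; the key name "nifi-credentials" and the instance name are placeholders):

# Create a service key (credentials) for your Event Streams instance
ibmcloud resource service-key-create nifi-credentials Manager --instance-name "<<your Event Streams instance name>>"
# Display the user, password (API key), and kafka_brokers_sasl list
ibmcloud resource service-key nifi-credentials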

2. Add certificates to the Apache NiFi trust store

To connect to Kafka over SASL_SSL, Apache NiFi requires that a Truststore be created and the IBM Event Streams certificates be included in it. The Truststore is used by NiFi to verify the SSL connection. While not required to communicate with IBM Event Streams, it is good practice to also create a Keystore to secure Apache NiFi and enable authentication of users to the UI. A Keystore is also required to enable communication between nodes of an Apache NiFi cluster.

Apache NiFi includes the TLS Toolkit utility to help with generating the Keystore and Truststore. The following command can be used for a standalone server with the hostname *.nifi.svc.cluster.local and your desired Truststore and Keystore passwords:

/opt/nifi/nifi-toolkit-current/bin/tls-toolkit.sh standalone -n '*.nifi.svc.cluster.local' -f /opt/nifi/nifi-current/conf/nifi.properties --trustStorePassword "securePasswordOne" --keyStorePassword "securePasswordTwo"
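
For each hostname, the toolkit writes a directory containing the generated keystore.jks and truststore.jks, along with an updated nifi.properties; the keytool command in the next step imports the Event Streams certificate into that truststore.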

The openssl and keytool command-line utilities can be used to download the Event Streams certificate and then import it into truststore.jks. The hostname for Event Streams can be obtained in Step 1 when creating the credentials. The <<nifi service - namespace>> is what was specified when deploying Apache NiFi. Be sure to use the Truststore password that was set in the tls-toolkit command.

openssl s_client -connect broker-2-<<hostname>>.eventstreams.cloud.ibm.com:9093 -servername broker-2-<<hostname>>.eventstreams.cloud.ibm.com </dev/null \
        | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > broker-2.cert && \
    keytool -import -noprompt -trustcacerts \
        -alias kafka-broker -file broker-2.cert \
        -keystore /opt/nifi/nifi-current/\*.<<nifi service - namespace>>.svc.cluster.local/truststore.jks -storepass "securePasswordOne"
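
To confirm that the certificate was imported correctly, you can list the truststore contents (using the same path and Truststore password as above):

keytool -list -v -alias kafka-broker \
    -keystore /opt/nifi/nifi-current/\*.<<nifi service - namespace>>.svc.cluster.local/truststore.jks -storepass "securePasswordOne"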

For deployment, a custom Docker image can be created from the base NiFi Docker image with the configured Keystore and Truststore baked in.
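
A minimal Dockerfile sketch might look like the following (an illustration, assuming the official apache/nifi base image and that the files generated by the TLS Toolkit sit next to the Dockerfile; adjust names and paths to your setup):

FROM apache/nifi:latest
# Copy the stores generated by the TLS Toolkit into the image
COPY --chown=nifi:nifi keystore.jks truststore.jks /opt/nifi/nifi-current/conf/
# Copy the nifi.properties that the TLS Toolkit updated with store locations and passwords
COPY --chown=nifi:nifi nifi.properties /opt/nifi/nifi-current/conf/nifi.properties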

3. Configure an Apache NiFi Kafka consumer or producer

Apache NiFi should now have what it needs to connect to IBM Event Streams. To start consuming or publishing events, add a ConsumeKafkaRecord or PublishKafkaRecord NiFi processor and set the following properties (a combined example follows the list):

  1. Enter the comma separated list of Kafka Brokers from Step 1.
  2. Enter your Topic Name(s) for the topics that you want to consume from or publish to.
  3. Select and configure a Record Reader and Record Writer.
  4. Choose SASL_SSL as the Security Protocol.
  5. Enter the Username and Password retrieved from Step 1.
  6. Choose a Group ID.
  7. Create a new SSL Context Service.
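
Put together, the processor properties might look roughly like this (a sketch: the broker list and topic are placeholders from your own credentials, and with IBM Event Streams the SASL username is typically the literal string "token" while the password is the API key from Step 1):

Kafka Brokers         broker-0-<<hostname>>.eventstreams.cloud.ibm.com:9093,broker-1-<<hostname>>.eventstreams.cloud.ibm.com:9093
Topic Name(s)         my-topic
Record Reader         JsonTreeReader
Record Writer         JsonRecordSetWriter
Security Protocol     SASL_SSL
SASL Mechanism        PLAIN
Username              token
Password              <API key from Step 1>
Group ID              nifi-consumer-group
SSL Context Service   StandardSSLContextService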

Once complete, click the arrow next to your SSL Context Service to go to the Controller Services page, where you can configure the service.

4. Configure the connected SSLContextService

The SSL Context Service is what connects your Truststore and Keystore to Apache NiFi. Enter the configuration values that specify where the Keystore and Truststore files are located, as well as their corresponding passwords.
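
For reference, the service properties might look like the following (a sketch that reuses the paths and passwords from Step 2; fill in the <<nifi service - namespace>> placeholder for your deployment):

Keystore Filename     /opt/nifi/nifi-current/\*.<<nifi service - namespace>>.svc.cluster.local/keystore.jks
Keystore Password     securePasswordTwo
Keystore Type         JKS
Truststore Filename   /opt/nifi/nifi-current/\*.<<nifi service - namespace>>.svc.cluster.local/truststore.jks
Truststore Password   securePasswordOne
Truststore Type       JKS
TLS Protocol          TLS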

Once complete, enable your SSL Context Service, start the Kafka NiFi processor, and you should now be able to connect to Event Streams and publish or consume event data in Apache NiFi.

Next steps

With Apache NiFi connected to IBM Event Streams, you can now concentrate on creating ETL processes for your Event Streams data. Once that is complete, you might consider storing your results in one of the many managed databases in IBM Cloud so they can be consumed by your application. Alternatively, you might send your data to IBM Streams or a cloud function to create a machine learning pipeline or other more computationally intensive processing.

Whatever you choose, this approach lets you spend more time creating your application and less time managing Kafka or writing ETL processes, while leveraging many of the services within IBM Cloud.
