How-tos

Get Started with Streaming Analytics + Message Hub


Message Hub provides a simple communication mechanism built on Apache Kafka, enabling communication between loosely coupled Bluemix services. This article shows how to communicate with Message Hub from the Streaming Analytics Bluemix service using the messaging toolkit.

Setup

Creating a topic

In Message Hub, messages are transported through feeds called topics. Producers write messages to topics, and consumers read from topics.

The Message Hub Bluemix dashboard provides topic management tools. To create a topic, click the plus sign, enter “sampleTopic”, and click Save.

Message Hub create topic

We are now ready to use this new topic from Streams.

Using Streams to produce and consume messages

Now we will use the KafkaSASLSample Streams application to produce and consume messages.

Import KafkaSASLSample into Streams Studio from the downloaded toolkit directory (samples/KafkaSASLSample). The application should build successfully. Instructions for importing Streams applications can be found in Importing SPL projects.

We still need to tell the application where to find our Message Hub service.

  1. Navigate to Message Hub’s Bluemix dashboard and click the “Service Credentials” tab. Note the kafka_brokers_sasl, user, and password fields.
  2. Change line 1 of the KafkaSASLSample/etc/consumer.properties and KafkaSASLSample/etc/producer.properties files, replacing the placeholder broker.host.1:2181,broker.host.2:2181,broker.host.3:2181 with the comma-separated list of brokers from the kafka_brokers_sasl field of your Message Hub service credentials.

    KafkaSASLSample/etc/consumer.properties

    bootstrap.servers=broker.host.1:2181,broker.host.2:2181,broker.host.3:2181
    group.id=mygroup
    client.id=myAPIKey
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    ssl.protocol=TLSv1.2
    ssl.enabled.protocols=TLSv1.2
    ssl.truststore.type=JKS
    ssl.endpoint.identification.algorithm=HTTPS
    

    KafkaSASLSample/etc/producer.properties

    bootstrap.servers=broker.host.1:2181,broker.host.2:2181,broker.host.3:2181
    acks=0
    client.id=myAPIKey
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    ssl.protocol=TLSv1.2
    ssl.enabled.protocols=TLSv1.2
    ssl.truststore.type=JKS
    ssl.endpoint.identification.algorithm=HTTPS
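The broker substitution in step 2 is just string editing, but it is easy to get the comma-separated list wrong. As a sketch (assuming your Message Hub service credentials have been saved locally as JSON — the broker names below are placeholders; Message Hub SASL brokers typically listen on port 9093), the bootstrap.servers line can be generated with standard-library Python:

```python
import json

def bootstrap_servers_line(credentials_json):
    """Build the bootstrap.servers property from Message Hub credentials.

    kafka_brokers_sasl is a JSON array of "host:port" strings; Kafka
    expects them joined into a single comma-separated list.
    """
    creds = json.loads(credentials_json)
    return "bootstrap.servers=" + ",".join(creds["kafka_brokers_sasl"])

# Example with placeholder broker names:
creds = '{"kafka_brokers_sasl": ["broker.host.1:9093", "broker.host.2:9093"]}'
print(bootstrap_servers_line(creds))
```

Paste the printed line over line 1 of both properties files.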

We also need to supply authentication information to Message Hub by editing KafkaSASLSample/etc/jaas.conf. In lines 4 and 5, replace myusername and mypassword with the user and password values from your Message Hub service credentials.

KafkaSASLSample/etc/jaas.conf

KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    serviceName="kafka"
    username="myusername"
    password="mypassword";
};
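This edit can also be scripted. A minimal sketch that substitutes placeholder credentials into the jaas.conf template shown above (the values here are placeholders, not real credentials):

```python
# Template matching the sample's jaas.conf; literal braces are doubled
# so that str.format leaves them in place.
JAAS_TEMPLATE = """KafkaClient {{
    org.apache.kafka.common.security.plain.PlainLoginModule required
    serviceName="kafka"
    username="{user}"
    password="{password}";
}};
"""

def render_jaas(user, password):
    # Substitute the Message Hub service credentials into the template.
    return JAAS_TEMPLATE.format(user=user, password=password)

print(render_jaas("myusername", "mypassword"))
```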

IMPORTANT: After saving producer.properties and consumer.properties, you must clean your workspace so the changes are picked up in the next build: select Project -> Clean…, then click OK.

The KafkaSASLSample Streams application contains logic to both send and receive messages.

kafka-ssl-sample-graph

  • The “producer” part of the Streams graph (OutputStream → KafkaSinkOp) uses a KafkaProducer operator to send messages to the topic named “sampleTopic” every 0.2 seconds.
  • The “consumer” part (KafkaStream → SinkOp) retrieves messages from Kafka using the KafkaConsumer operator and prints them to the console.
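The SPL operators handle the Kafka plumbing, but the shape of the graph — a producer emitting a message every 0.2 seconds and a consumer printing whatever arrives — can be mimicked with an in-memory queue standing in for the sampleTopic topic. This is purely illustrative, not the toolkit's implementation:

```python
import queue
import threading
import time

topic = queue.Queue()  # stands in for the "sampleTopic" Kafka topic

def producer(n_messages):
    # Like OutputStream -> KafkaSinkOp: emit one message every 0.2 seconds.
    for i in range(n_messages):
        topic.put(f"message {i}")
        time.sleep(0.2)
    topic.put(None)  # sentinel: no more messages

def consumer(received):
    # Like KafkaStream -> SinkOp: read each message and print it.
    while True:
        msg = topic.get()
        if msg is None:
            break
        print(msg)
        received.append(msg)

received = []
t = threading.Thread(target=producer, args=(5,))
t.start()
consumer(received)
t.join()
```

At one message per 0.2 seconds, the producer sustains about five messages per second, which is the tuple rate you should later see in the Streams console.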

Build KafkaSASLSample with a Distributed Build so that you can run it on your Streaming Analytics service on Bluemix.

To run KafkaSASLSample locally in Streams Studio, see Launching a Main composite.

To view the locally running Streams graph, see Viewing the instance graph for your application.

Streams and Message Hub in the Cloud

Create a Streaming Analytics service on Bluemix – See “Finding the service” section of Introduction to Bluemix Streaming Analytics.

Building KafkaSASLSample creates a .sab file (Streams application bundle) in your workspace directory: workspace/KafkaSASLSample/output/com.ibm.streamsx.messaging.sample.kafka.KafkaSASLSample/Distributed/com.ibm.streamsx.messaging.sample.kafka.KafkaSASLSample.sab. This file includes all necessary information for the Streaming Analytics Bluemix service to run the Streams application in the cloud.

Upload the .sab file using the Streaming Analytics console.

  1. Head to the Streaming Analytics service dashboard in Bluemix and click “Launch” to launch the Streams console.
  2. Click “Submit job” under the “play icon” dropdown in the top-right of the console.
  3. Browse for the com.ibm.streamsx.messaging.sample.kafka.KafkaSASLSample.sab file that you built, and click Submit.

The Streams application is working properly if the Streams console’s graph view shows that all operators are healthy (green circle) and that approximately five tuples per second are flowing on each stream (one message every 0.2 seconds).

streams-graph

You can also view the messages being printed by SinkOp in the Streams log.

  1. Navigate to the Streams console log viewer on the far left.
  2. Expand the navigation tree and highlight the PE that has the SinkOp operator.
  3. Select the “Console Log” tab.
  4. Click “Load console messages”.

Streams console logs

If you don’t see any messages being logged, ensure that only one instance of the job is running. Within a consumer group, Kafka assigns each topic partition to at most one consumer, so a second job instance in the same group can leave a consumer with no partitions and therefore no messages.
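The single-consumer behavior comes from Kafka's partition assignment. A toy round-robin assignment (not Kafka's actual assignor) shows why a second job instance on a one-partition topic sits idle:

```python
def assign_partitions(partitions, consumers):
    """Round-robin partitions over consumers; extra consumers get nothing."""
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# A topic with one partition, two job instances in the same consumer group:
print(assign_partitions([0], ["job-instance-1", "job-instance-2"]))
# job-instance-2 is assigned no partitions and receives no messages.
```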

If you had any trouble following these instructions, check out the video tutorial.

Conclusion

This article has shown how Streams can both send and receive messages using operators from the messaging toolkit. Message Hub provides powerful real-time communication between Streams and many other Bluemix services.

What’s next?

To learn more about Streaming Analytics visit:
