
Get Started with Streaming Analytics + Message Hub


Message Hub is a messaging service built on Apache Kafka that enables communication between loosely coupled IBM Cloud services. This article shows how to communicate with Message Hub from the IBM Cloud Streaming Analytics service using the messaging toolkit.

Setup

Creating a topic

In Message Hub, messages are transported through feeds called topics. Producers write messages to topics, and consumers read from topics.

The IBM Cloud Message Hub dashboard provides topic management tools. To create a topic, click the plus sign, enter “test”, and click “Create topic”.

(Screenshot: creating the “test” topic in the Message Hub dashboard)

We are now ready to use this new topic from Streams.

Using Streams to produce and consume messages

Now we will use the MessageHubFileSample Streams application to produce and consume messages.

Import MessageHubFileSample into Streams Studio from the downloaded toolkit directory (samples/MessageHubFileSample). The application should build successfully. Instructions for importing Streams applications can be found in Importing SPL projects.

We still need to tell the application where to find our Message Hub service.

  1. Navigate to Message Hub’s dashboard and click the “Service Credentials” tab.
  2. Copy the credentials JSON and paste it into MessageHubFileSample’s /etc/messagehub.json file, replacing the placeholder comment. (An illustrative sketch of this file’s shape appears right after the list below.)

The MessageHubFileSample Streams application contains logic to both send and receive messages, sketched in SPL below:

    • The “producer” part of the Streams graph (Beacon_1 → MessageHubProducer_2) uses a MessageHubProducer operator to send a message to the topic named “test” every 0.2 seconds.
    • The “consumer” part (MessageHubConsumer_3 → Custom_4) retrieves messages from Kafka using the MessageHubConsumer operator and prints them to the console.
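
For orientation only, the pasted credentials file typically ends up with a shape roughly like the sketch below. Every value here is a placeholder, and the exact field names can vary by service plan and version, so always use the JSON exactly as it appears in your own “Service Credentials” tab.

    {
      "instance_id": "<instance id>",
      "api_key": "<api key>",
      "user": "<username>",
      "password": "<password>",
      "kafka_admin_url": "<admin REST endpoint>",
      "kafka_rest_url": "<REST endpoint>",
      "kafka_brokers_sasl": [
        "<broker-1 host:port>",
        "<broker-2 host:port>"
      ]
    }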

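To make the shape of that graph concrete, here is a minimal, simplified SPL sketch of the same producer/consumer pattern. It is not the shipped sample verbatim: the operator names (MessageHubProducer, MessageHubConsumer), the topic parameter, and the message attribute are assumed from the Message Hub toolkit, so treat MessageHubFileSample itself as the reference.

    // Simplified sketch of the MessageHubFileSample pattern (assumed
    // operator and parameter names; not the shipped sample verbatim).
    use com.ibm.streamsx.messagehub::*;

    composite MessageHubSketch {
        graph
            // Producer side: emit a test message every 0.2 seconds.
            stream<rstring message> Beacon_1 = Beacon() {
                param period : 0.2;
                output Beacon_1 : message = "Message number " + (rstring) IterationCount();
            }

            // Publish each tuple to the "test" topic; credentials are read
            // from the application's etc/messagehub.json file.
            () as MessageHubProducer_2 = MessageHubProducer(Beacon_1) {
                param topic : "test";
            }

            // Consumer side: subscribe to the same topic...
            stream<rstring message> MessageHubConsumer_3 = MessageHubConsumer() {
                param topic : "test";
            }

            // ...and print each received message to the console log.
            () as Custom_4 = Custom(MessageHubConsumer_3) {
                logic onTuple MessageHubConsumer_3 : printStringLn(message);
            }
    }
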
Build MessageHubFileSample as a Distributed Build so that you can run it on your Streaming Analytics service on IBM Cloud.

To run MessageHubFileSample locally in Streams Studio, see Launching a Main composite.

To view the locally running Streams graph, see Viewing the instance graph for your application.

Streams and Message Hub in the Cloud

Create a Streaming Analytics service on IBM Cloud – see the “Finding the service” section of Introduction to IBM Cloud Streaming Analytics.

Building MessageHubFileSample creates a .sab file (Streams application bundle) in your workspace directory: workspace/MessageHubFileSample/output/com.ibm.streamsx.messagehub.sample.MessageHubFileSample/BuildConfig/com.ibm.streamsx.messagehub.sample.MessageHubFileSample.sab. This file includes all the information the Streaming Analytics service needs to run the Streams application in the cloud.

Upload the .sab file using the Streaming Analytics console.

  1. Head to the Streaming Analytics service dashboard in IBM Cloud and click “Launch” to open the Streams console.
  2. Click “Submit job” under the “play” icon dropdown in the top-right of the console.
  3. Browse for the com.ibm.streamsx.messagehub.sample.MessageHubFileSample.sab file that you built, and click Submit.

The Streams application is working properly if the Streams console’s graph view shows that all operators are healthy (green circle).

(Screenshot: Streams console graph view with all operators healthy)

You can also view the messages being printed by Custom_4 in the Streams log.

  1. Navigate to the Streams console log viewer on the far left.
  2. Expand the navigation tree and highlight the PE that has the Custom_4 operator.
  3. Select the “Console Log” tab.
  4. Click “Load console messages”.

(Screenshot: Streams console log viewer showing messages from Custom_4)

If you don’t see any messages being logged, make sure that only one instance of the job is running: Kafka assigns each partition of a topic to a single consumer within a consumer group, so with a one-partition topic a second running consumer in the same group receives no messages.

If you had any trouble following these instructions, check out the video tutorial.

Conclusion

This article has shown how Streams can both send and receive messages using operators from the messaging toolkit. Message Hub provides powerful real-time communication between Streams and many other IBM Cloud services.

What’s next?

To learn more about Streaming Analytics, visit:

InfoSphere Streams Cloud
