
Get Started with Streaming Analytics + Message Hub

Message Hub provides simple messaging built on Apache Kafka, enabling communication between loosely coupled Bluemix services. This article shows how to communicate with Message Hub from the Streaming Analytics Bluemix service using the messaging toolkit.

Setup

Creating a topic

In Message Hub, messages are transported through feeds called topics. Producers write messages to topics, and consumers read from topics.

The Message Hub Bluemix dashboard provides topic management tools. To create a topic, click the plus sign, enter “test”, and click “Create topic”.

Message Hub create topic

We are now ready to use this new topic from Streams.

Using Streams to produce and consume messages

Now we will use the MessageHubFileSample Streams application to produce and consume messages.

Import MessageHubFileSample into Streams Studio from the downloaded toolkit directory (samples/MessageHubFileSample). The application should build successfully. Instructions for importing Streams applications can be found in Importing SPL projects.

We still need to tell the application where to find our Message Hub service.

  1. Navigate to Message Hub’s Bluemix dashboard and click the “Service Credentials” tab.
     Message Hub credentials
  2. Copy the credentials JSON and paste it into MessageHubFileSample’s /etc/messagehub.json file, replacing the placeholder comment. (A sketch of what this file typically looks like is shown below.)
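
For orientation, the credentials JSON generally lists the SASL-secured Kafka brokers along with an API key, user, and password. The snippet below is only an illustrative sketch with placeholder values – copy the exact JSON from your own service’s “Service Credentials” tab rather than typing the field names by hand.

    {
      "api_key": "<your-api-key>",
      "user": "<your-user>",
      "password": "<your-password>",
      "kafka_rest_url": "<your-kafka-rest-url>",
      "kafka_brokers_sasl": [
        "<broker-1-host>:9093",
        "<broker-2-host>:9093"
      ]
    }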

The MessageHubFileSample Streams application contains logic to both send and receive messages.

Message Hub Sample streams graph

  • The “producer” part of the Streams graph (Beacon_1 → MessageHubProducer_2) uses a MessageHubProducer operator to send messages to the topic named “test” every 0.2 seconds.
  • The “consumer” part (MessageHubConsumer_3 → Custom_4) retrieves messages from Kafka using the MessageHubConsumer operator and prints them to the console (a minimal SPL sketch of this graph follows the list).
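
To make the graph concrete, here is a minimal SPL sketch of such a producer/consumer pair. It is not the actual source of MessageHubFileSample; the operator names come from the description above, while the namespace, parameter names, and the message attribute are assumptions that should be checked against the messaging toolkit’s documentation.

    // Minimal sketch of a Message Hub producer/consumer graph in SPL.
    // Namespace, parameter, and attribute names are assumptions, not the sample's source.
    use com.ibm.streamsx.messagehub::*;

    composite MessageHubSketch {
        graph
            // Produce a new message every 0.2 seconds.
            stream<rstring message> Beacon_1 = Beacon() {
                param period : 0.2;
                output Beacon_1 : message = "message " + (rstring)IterationCount();
            }

            // Publish each message to the "test" topic on Message Hub.
            () as MessageHubProducer_2 = MessageHubProducer(Beacon_1) {
                param topic : "test";
            }

            // Subscribe to the same topic.
            stream<rstring message> MessageHubConsumer_3 = MessageHubConsumer() {
                param topic : "test";
            }

            // Print each received message to the console log.
            () as Custom_4 = Custom(MessageHubConsumer_3) {
                logic onTuple MessageHubConsumer_3 : printStringLn(message);
            }
    }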

Build MessageHubFileSample using a Distributed Build so that you can run it on your Streaming Analytics service on Bluemix.

To run MessageHubFileSample locally in Streams Studio, see Launching a Main composite.

To view the locally running Streams graph, see Viewing the instance graph for your application.

Streams and Message Hub in the Cloud

Create a Streaming Analytics service on Bluemix – see the “Finding the service” section of Introduction to Bluemix Streaming Analytics.

Building MessageHubFileSample creates a .sab file (Streams application bundle) in your workspace directory: workspace/MessageHubFileSample/output/com.ibm.streamsx.messagehub.sample.MessageHubFileSample/BuildConfig/com.ibm.streamsx.messagehub.sample.MessageHubFileSample.sab. This file includes all necessary information for the Streaming Analytics Bluemix service to run the Streams application in the cloud.

Upload the .sab file using the Streaming Analytics console.

  1. Head to the Streaming Analytics service dashboard in Bluemix and click “Launch” to launch the Streams console.
  2. Click “Submit job” under the “play icon” dropdown in the top-right of the console.
     streams-submit-job
  3. Browse for the com.ibm.streamsx.messagehub.sample.MessageHubFileSample.sab file that you built, and click Submit.
     Streams submit job

The Streams application is working properly if the Streams console’s graph view shows that all operators are healthy (green circle).

streams graph

You can also view the messages being printed by Custom_4 in the Streams log.

  1. Navigate to the Streams console log viewer on the far left.
  2. Expand the navigation tree and highlight the PE that has the Custom_4 operator.
  3. Select the “Console Log” tab.
  4. Click “Load console messages”.

Streams console logs

If you don’t see any messages being logged, ensure that only one instance of the job is running. Kafka assigns each topic partition to at most one consumer within a consumer group, so if the “test” topic has a single partition, a second copy of the job’s consumer in the same group will receive no messages.
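
If you do want several consumers to each receive every message, they need to be in different consumer groups. Assuming the MessageHubConsumer operator exposes a group identifier parameter like the underlying Kafka consumer does (the name groupId below is an assumption – check the messaging toolkit documentation for the parameter it actually provides), a second job could use its own group:

    // Hypothetical second consumer in its own consumer group, so it also
    // receives every message on the topic. The groupId parameter name is
    // an assumption; verify it against the toolkit documentation.
    stream<rstring message> MessageHubConsumer_3 = MessageHubConsumer() {
        param
            topic   : "test";
            groupId : "second-consumer-group";
    }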

If you had any trouble following these instructions, check out the video tutorial.

Conclusion

This article has shown how Streams can both send and receive messages using operators from the messaging toolkit. Message Hub provides powerful real-time communication between Streams and many other Bluemix services.

What’s next?

To learn more about Streaming Analytics visit:
