Getting Started with Streaming Analytics and RabbitMQ


Often, real-time analytic applications need to ingest data from, or send results to, a messaging system. We’re happy to announce that Streaming Analytics now supports integration with RabbitMQ. RabbitMQ provides robust messaging that implements the Advanced Message Queuing Protocol (AMQP), enabling communication between cloud-based services. This tutorial explains how to flow messages between the Streaming Analytics service and the CloudAMQP service in Bluemix.


Before you begin, you will need:

  • A Streaming Analytics service on Bluemix; see the “Finding the service” section of Introduction to Bluemix Streaming Analytics.
  • A CloudAMQP service on Bluemix. Create it the same way you created the Streaming Analytics service, searching the catalog for CloudAMQP instead, and choose the plan that is right for your needs.
  • A Streams development environment. If you do not have one, you can download the IBM InfoSphere Streams Quick Start Edition (Download Trial), which provides a full Streams environment to develop in.

We are now ready to install the Streams Messaging Toolkit.

Installing the Streams Messaging Toolkit

To install the toolkit:

  • In your Streams development environment, download the Streams Messaging Toolkit from GitHub using the “Download ZIP” button and extract it.
  • Open a terminal, change to the toolkit’s base directory (the one containing build.xml), and run ant to build the toolkit.
  • Start “Streams Studio”, the Streams IDE. If you are using the Streams Quick Start VM, there is a shortcut on the desktop.
  • In the “Streams Explorer” tab, right-click “Toolkit Locations” and select “Add Toolkit Location…”
  • Select “Directory…”, navigate to the directory where you extracted the Streams Messaging Toolkit, select it, and press OK. The location you selected will appear in the list of toolkit locations.

We can now use Streams and RabbitMQ together!

Using Streams to Produce and Consume Messages

We will now use the RabbitMQSample Streams application to produce and consume messages. Import “RabbitMQSample” into Streams Studio from the extracted toolkit directory (samples/RabbitMQSample). The application should build successfully. Instructions for importing Streams applications can be found in Importing SPL projects.

We still need to add the connection information for our Streams sample to communicate with our CloudAMQP service.

  • In Bluemix, navigate to your CloudAMQP service and click the “Open CloudAMQP Dashboard” button, which launches the dashboard in a new window. Copy the URL listed on the “Details” tab; you will need it in a moment. You can also view your RabbitMQ metrics through this dashboard; guidance on navigating it can be found on the CloudAMQP documentation page.
  • Open RabbitMQSample.spl so you can add the connection URI: in the Project Explorer, expand RabbitMQSample -> Resources to locate RabbitMQSample.spl, then right-click it and select Open With -> SPL Editor.
  • Update the RabbitMQSink and RabbitMQSource operators, setting the URI parameter to the URI you gathered from the CloudAMQP dashboard.
    The RabbitMQSample Streams application contains logic to both send and receive messages with RabbitMQ.
  • The “Source” part of the Streams graph (RabbitMQStream → SinkOp) retrieves messages from RabbitMQ using the RabbitMQSource operator and prints them to the console.
  • The “Sink” part of the Streams graph (OutputStream → RabbitMQSinkOp) uses the RabbitMQSink Operator to send messages to the RabbitMQ exchange called “myPhilosophicalExchange”.
  • To run RabbitMQSample locally in Streams Studio, see Launching a Main composite.
  • To view the locally running Streams graph, see Viewing the instance graph for your application.
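Putting the pieces above together, the sample’s wiring looks roughly like the following SPL sketch. This is an illustrative assumption based on the tutorial’s description, not the sample’s actual code: the stream and operator names (OutputStream, RabbitMQSinkOp, RabbitMQStream, SinkOp) and the exchange name come from this tutorial, but the message generation, tuple schema, and exact operator parameters may differ, so consult samples/RabbitMQSample for the real source.

```
// Illustrative sketch only -- see samples/RabbitMQSample for the actual code.
use com.ibm.streamsx.messaging.rabbitmq::*;

composite RabbitMQSample {
    graph
        // "Sink" side: generate messages and publish them to the exchange.
        stream<rstring message> OutputStream = Beacon() {
            param period : 1.0;
            output OutputStream : message = "Hello from Streams";
        }

        () as RabbitMQSinkOp = RabbitMQSink(OutputStream) {
            param
                URI          : "amqp://user:password@hostname/vhost"; // URI from the CloudAMQP "Details" tab
                exchangeName : "myPhilosophicalExchange";
        }

        // "Source" side: consume messages from the same exchange and print them.
        stream<rstring message> RabbitMQStream = RabbitMQSource() {
            param
                URI          : "amqp://user:password@hostname/vhost";
                exchangeName : "myPhilosophicalExchange";
        }

        () as SinkOp = Custom(RabbitMQStream) {
            logic onTuple RabbitMQStream : printStringLn(message);
        }
}
```

The URI values are placeholders; replace them with the URI you copied from the CloudAMQP dashboard.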

Streams and RabbitMQ in the Cloud

Building RabbitMQSample creates a Streams Application Bundle (.sab) file in your workspace directory.

This file includes all the information the Streaming Analytics Bluemix service needs to run the Streams application in the cloud. Submit the .sab file using the Streaming Analytics console.

    • Go to the Streaming Analytics service dashboard in Bluemix and click “Launch” to open the Streams console for your Streams instance.
    • Click “Submit Job” under the “play icon” dropdown in the top right of the console.
    • Browse for the .sab file that you built, then click Submit and wait for the submission to complete.

The Streams application is working properly if the Streams console’s graph view shows all operators as healthy (green) and tuples flowing on each stream.

You can also view the messages being printed by the SinkOp in the Streams log.

  • Navigate to the Streams console log viewer on the far left (page icon).
  • Expand the navigation tree and highlight the PE that has the SinkOp operator.
  • Select the “Console Log” tab.
  • Click “Load console messages”.


If you don’t see any messages being logged, make sure that the instance and job are both running in a healthy state. When you are finished running this job, remember to cancel it and stop your instance.

Next Steps

Now that you have run this sample and completed integration between Streaming Analytics and CloudAMQP, you can build upon this sample to meet the specific requirements of your own Bluemix application.
