Tutorial: Create a streams flow from a Data Historian example flow
Learning objective
In this tutorial, you learn how to create and run an example streams flow. We supply the Data Historian example streams flow and the sample data. You don’t need to configure anything.
This tutorial gives a high-level, bird’s-eye view of a streams flow and takes approximately 10 minutes to complete. Other tutorials provide an in-depth examination of the canvas, the Metrics page, and the operators.
Overview
The sample data is taken from five weather stations. The data includes weather station ID, time zone, date in Coordinated Universal Time (UTC) format, latitude, longitude, temperature, barometric pressure, humidity, indoor temperature, and rainfall today.
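To make the record schema concrete, a single event from one weather station might look like the following. This is a hypothetical record sketched in Python; the field names and values are assumptions based on the attributes listed above, not taken from the actual sample file.

```python
# Hypothetical weather station event; field names and values are illustrative only.
sample_event = {
    "station_id": "ws-003",                    # weather station ID
    "time_zone": "America/Chicago",            # station time zone
    "timestamp_utc": "2024-06-15T14:23:00Z",   # date in UTC format
    "latitude": 41.88,
    "longitude": -87.63,
    "temperature": 24.1,                       # outdoor temperature
    "baro_pressure": 1013.2,                   # barometric pressure
    "humidity": 55.0,
    "indoor_temperature": 21.5,
    "rainfall_today": 0.3,
}
```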
The Data Historian example streams flow has two Aggregation operators:
- The first Aggregation operator partitions the incoming data by weather station ID, so each weather station has its own partition. Within each partition, the data is grouped by weather station; as a result, every partition contains one group.
Every 60 seconds, the data “tumbles out” and a designated function is applied to the data in each group. For example, the Average function is applied to the rainfall data, while the Min function is applied to the barometric pressure data.
- The second Aggregation operator ingests the output of the first Aggregation operator. It partitions and groups the data just like the first Aggregation operator, but the data “tumbles out” every 180 seconds.
Output data from the second Aggregation operator is sent to a Debug operator for further analysis.
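If you want a feel for what the two operators do conceptually, the following plain-Python sketch simulates a tumbling window: events are partitioned by station ID, and when a window closes, an Average is applied to rainfall and a Min to barometric pressure. It uses the hypothetical field names from the sample record above and is only an illustration of the idea, not how the Aggregation operator is actually implemented.

```python
import time
from collections import defaultdict
from statistics import mean

# Sketch of a tumbling-window aggregation (illustration only, not the real operator).
def tumbling_aggregate(events, window_seconds):
    """Partition events by station_id, then emit one aggregate per station
    for each tumbling window of `window_seconds` (based on the event time)."""
    windows = defaultdict(lambda: defaultdict(list))  # window_start -> station_id -> events
    for event in events:
        window_start = int(event["ts"]) // window_seconds * window_seconds
        windows[window_start][event["station_id"]].append(event)

    results = []
    for window_start, stations in sorted(windows.items()):
        for station_id, group in stations.items():
            results.append({
                "window_start": window_start,
                "station_id": station_id,
                # A different function per attribute, as in the example flow:
                "avg_rainfall": mean(e["rainfall_today"] for e in group),
                "min_baro_pressure": min(e["baro_pressure"] for e in group),
            })
    return results

# A few synthetic events from two stations, spread over two minutes.
now = time.time()
events = [
    {"ts": now + i, "station_id": s, "rainfall_today": 0.1 * i, "baro_pressure": 1010 + i}
    for i in range(0, 120, 15)
    for s in ("ws-001", "ws-002")
]

# First Aggregation operator: 60-second tumbling window.
for row in tumbling_aggregate(events, window_seconds=60):
    print(row)
```

The second Aggregation operator applies the same idea to the first operator’s output, but with a 180-second window.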
Preview
Now it’s your turn - try out the following tutorial steps in your own environment.
Prerequisites
You must have a Streams service up and running.
Create the example streams flow
Create a streams flow by using the Data Historian example streams flow and its sample data.
- In the New Streams Flow page, click the From example tab.
- Perform the following steps:
- The example streams flow automatically completes the Name and Description fields, so leave them blank.
- In the Streams service list, the Streams service that is associated with the project is already selected.
- In the Select Example area, click the Data Historian Example box. Note that the Name and Description fields are now filled.
- In the File path field, append the string /%TIME. The variable %TIME appends the time to the file name to make it unique. Add the file extension .parquet because partitioning uses parquet files.
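As a rough illustration of why the time suffix matters, the snippet below shows how a %TIME-style placeholder could resolve to a unique file name on each write. The base path and the exact time format are assumptions for illustration; the streams flow substitutes %TIME itself.

```python
from datetime import datetime, timezone

# Illustration only: the exact format that %TIME resolves to is an assumption here.
template = "weather-history/%TIME.parquet"
resolved = template.replace("%TIME", datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S"))
print(resolved)  # e.g. weather-history/20240615142305.parquet
```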
Run the example streams flow
Your new Data Historian example streams flow is automatically shown in the Metrics page. The Status indicator shows that it is in a Stopped state.
Click the Run icon to start the streams flow. The Status indicator reflects the changing stages as the streams flow is deployed.
Notice that until the status is “Running”, the streams flow is static and uses arrows to connect operators.
When the status is “Running”, you can see the data as it flows between operators. Put your mouse pointer over a data flow to get real-time metrics. Click the data flow to see the events flowing to the next operator.
Summary
You just created a streams flow from the Data Historian example flow and its sample data. You started the streams flow in the Metrics page and watched weather station data flow between operators.
Learn more
Learn more about the Data Historian example streams flow, including its scope, its operators, and its output, in the topic Data Historian example streams flow.
For more information about specific features and operators, check out our other tutorials for streams flow.