It’s no surprise: most companies working with streaming data today say they are planning changes to drive greater value. Advancements in machine learning (ML) and very-high-speed data persistence for real-time analytics are reshaping strategies and architectures. In addition, 88 percent of surveyed companies say they need to perform analytics in near-real time on stored streamed data.

For that reason, it’s important for businesses to investigate the recent technological improvements across streaming analytics, data persistence speeds and object storage to get more from streamed data with less complexity and cost. I provide a head start below.

Faster and more intelligent data analysis

Storage and analysis of high-volume, high-velocity streaming IoT data have seen considerable progress. One recently released event store can store and analyze as many as 250 billion events per day with just three nodes; achieving the same speed with prior, popular event stores required nearly 100 nodes. Moreover, some event stores now include data and analytics acceleration technologies. Combined, this means insights can be delivered faster with fewer resources, reducing cost while improving performance.

More than 70 percent of companies surveyed also indicated that they use, or plan to use, machine learning with streamed and stored data. Therefore, the most useful event stores will have built-in, fully integrated support for machine learning and Jupyter Notebooks. This means they will include technology and environments that make it easier to develop, train, and manage ML models, and to deploy AI-powered applications. Popular languages for building AI and ML apps, including Python and Scala, should also be supported.
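As a rough illustration of that workflow, the Python sketch below "trains" a trivial stand-in model on stored event data and then scores newly arriving events. The device names, event fields, and anomaly threshold are all hypothetical; a real deployment would use the event store's integrated ML and notebook tooling rather than this hand-rolled baseline.

```python
from statistics import mean, stdev

# Illustrative only: historical readings stand in for stored event data
# queried from an event store. Values and field names are hypothetical.
historical_readings = [20.1, 19.8, 20.5, 21.0, 19.9, 20.3, 20.7, 20.0]

# "Train": derive model parameters from the stored events.
mu, sigma = mean(historical_readings), stdev(historical_readings)

def score(event):
    """Score an incoming event against the learned baseline."""
    z = abs(event["value"] - mu) / sigma
    return {"device": event["device"], "anomaly": z > 3.0}

# "Deploy": apply the model to newly arriving stream events.
stream = [{"device": "pump-7", "value": 20.4},
          {"device": "pump-7", "value": 35.2}]
results = [score(e) for e in stream]
print(results)  # the second reading falls far outside the baseline
```

The point is the shape of the loop, not the model: stored events supply the knowledge base, and the resulting model scores the live stream as it arrives.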

By fully supporting machine learning and reducing integration issues, such an event store increases efficiency for data scientists and developers. They can readily use the massive volumes of rapidly arriving stream data to build and train machine learning models with the knowledge base they need, ultimately producing more accurate insights.

Using low-cost object storage

The type of storage an event store uses also matters. Low-cost object storage is growing in favor as a home for large amounts of streaming data. It’s a good choice for fast analytics on streamed, stored data, as it can be faster and less expensive than other popular big data environments such as Hadoop. The best event stores also write the data in an open format like Apache Parquet, which is commonly used for object storage and accessible by many applications. Unlike other options for persisting streamed data, an open format such as Parquet helps avoid vendor lock-in and makes it easier to place the object storage in a public cloud, private cloud or on-premises as needed. Sharing a common SQL engine with other data management solutions also helps you access data throughout your hybrid data management architecture.

But to achieve near real-time analytics speed on streamed data persisted in object storage, you can’t just dump the data there. Your event store must optimize the data for real-time analytics as it quickly writes it out to object storage. Memory-optimized object storage for streamed events can yield exceptionally fast analytics on persisted streamed data, making it possible to combine incoming stream data with stored data for deeper insights. This is critically important for use cases, such as intelligent alert monitoring, that require smart, fast decisions incorporating historical data. And even more powerful streaming analytics can be achieved by feeding the results of real-time queries back into the analysis.

An alert monitoring use case for IoT

IoT is one area that has benefited considerably from the advancements in rapid analytics on both streaming and stored data. As mentioned above, intelligent alert monitoring is a popular use of this newer capability. IoT devices help monitor critical systems by transmitting large volumes of event data at high speed for the purpose of real-time alerts. These alerts help avoid major disruptions, or even disasters with far greater repercussions.

However, just analyzing the incoming stream may not be enough. For deeper insights and the ability to act on initial analytics results, real-time analytics must be performed on recently stored or historical event data as well. This provides a more complete picture that helps more accurately predict when alerts should be triggered, lowering the chance of disruption and disaster while saving time that would otherwise go to false alarms.
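A small sketch can show why combining historical and incoming data reduces false alarms. Here, per-device operating limits are assumed to come from analytics over stored event data (they are hard-coded for illustration), and an alert fires only on a sustained breach rather than a single spike. The device name, values, and window size are all hypothetical.

```python
from collections import defaultdict, deque

# Hypothetical: limits derived from analytics over stored historical
# events; hard-coded here for illustration.
HISTORICAL_LIMITS = {"turbine-3": (55.0, 80.0)}  # (low, high)
CONSECUTIVE_REQUIRED = 3  # demand a sustained breach, not one spike

recent = defaultdict(lambda: deque(maxlen=CONSECUTIVE_REQUIRED))

def should_alert(event):
    """Alert only when several consecutive readings breach the range
    learned from historical data -- cutting down on false alarms."""
    low, high = HISTORICAL_LIMITS[event["device"]]
    window = recent[event["device"]]
    window.append(not low <= event["value"] <= high)
    return len(window) == CONSECUTIVE_REQUIRED and all(window)

# One isolated spike (82.0), then a genuine sustained trend.
readings = [57.0, 82.0, 58.0, 83.0, 84.0, 85.0]
alerts = [should_alert({"device": "turbine-3", "value": v})
          for v in readings]
print(alerts)  # only the sustained breach at the end raises an alert
```

The isolated spike never triggers an alert, while the sustained deviation does: exactly the "more complete picture" that historical context provides.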

Bringing the pieces together

Clients have demonstrated a strong preference for enhanced open source solutions that reduce the complexity, integration, and maintenance issues that arise with pure build-your-own environments. This is especially important as applications move beyond early experimentation to business-critical uses that reduce operational costs and risks and help differentiate customer service. As companies build or enhance their streaming analytics architectures to take better advantage of high-volume, high-velocity data, they will need a modern “fast data” environment.

Businesses must combine streaming tools with a high-speed event store and low-cost object storage. Their fast data environment must also include built-in tools to create and score machine learning models, both for data scientists focused on analyzing streamed data and for developers building event-driven applications. Also vital are the ability to write data to an open format optimized for rapid analytics, and a common SQL engine that supports complex queries and lets applications access data without code rewrites. Going forward, I predict companies embracing a fast data approach will find new and enhanced use cases for high-speed, streamed data persistence with an event store that offers speed, accessibility, integrated AI and machine learning support—and value for IoT and other types of event-driven applications.

To learn more about how companies are rethinking streamed data, read Forrester’s recent survey conducted on behalf of IBM. You can get started at no cost with a solution which delivers on the promise of fast data: IBM Db2 Event Store. And if you’re interested in how developers can deliver AI applications faster, watch our webcast.
