
The truth about real-time analytics in the era of big data


Every day we see the impact of real-time analytics and personalization on modern business. Mobile has been the ultimate catalyst, driving new customer experiences that are tailored to the individual, as data fuels a race to provide more value than competitors do.

The emergence of the open data economy is enabling access to entirely new business opportunities, but there are misconceptions around what it means to truly implement a real-time solution. I wanted to use this post to take a brief look at what it really takes to compete in this landscape where milliseconds matter, and expose some of the myths around the use of the term “real-time.”

The Strategy

Let’s say you are looking at making your operational business processes smarter. Your team has targeted two main battlefields: The first is identifying customer needs on the fly, to optimize services and present relevant cross-sell/up-sell offers at the moment of engagement. The second is upgrading your payment systems with more advanced analytics that detect and reject fraudulent, suspicious, or simply improper payment requests while they are in flight, without interrupting service unnecessarily.

After researching the required capabilities, you decide to integrate predictive analytics into your core operational systems, with a vision of using these models to score transactions as they occur. What many teams don’t realize when selecting analytics applications is that the system on which those applications are deployed can impose hard limitations.
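To make the idea of in-flight scoring concrete, here is a minimal sketch. Everything in it (the `Transaction` fields, `FraudModel`, the heuristic scores, the 0.7 threshold) is hypothetical, standing in for a real predictive model deployed next to the transactional data:

```python
# Hypothetical sketch of scoring a payment while it is in flight.
# FraudModel is a toy stand-in for a model trained offline and
# deployed alongside the operational system; all names and
# thresholds here are illustrative, not a real API.

from dataclasses import dataclass


@dataclass
class Transaction:
    amount: float
    country: str
    account_age_days: int


class FraudModel:
    """Toy stand-in for a deployed predictive model."""

    def score(self, txn: Transaction) -> float:
        # A real model would compute this from learned features;
        # we fake a risk score with two simple heuristics.
        risk = 0.0
        if txn.amount > 10_000:
            risk += 0.5
        if txn.account_age_days < 30:
            risk += 0.25
        return min(risk, 1.0)


def process_payment(txn: Transaction, model: FraudModel, threshold: float = 0.7):
    """Score the transaction before committing it, not after the fact."""
    risk = model.score(txn)
    if risk >= threshold:
        return ("rejected", risk)  # stop fraud while the payment is in flight
    return ("approved", risk)


print(process_payment(Transaction(15_000, "US", 10), FraudModel()))
# → ('rejected', 0.75)
```

The point is the shape of the flow, not the model: the score is computed inside the transaction path, so the decision lands before the payment completes.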

The Illusion of Real-Time

It’s fashionable to hang the real-time tag on just about everything these days. But if a solution requires data to be copied and moved so the analytics can be invoked on another platform, the result will be suboptimal, because the analysis is being performed on stale data. You can have the best strategy and the most advanced tools in place, but if you’re working with data that isn’t live, you’re not getting the best result. And if you can’t integrate analytics at the transaction level while the transaction is in flight, it’s like trying to stop fraud after the transaction has already happened, or suggesting a product after the customer has already left your site. These limitations can stifle innovation and leave competitors an open lane to provide better-quality service on the back of better customer insight.

So Why Does This Happen?

It all comes down to the ability to do in-transaction analytics processing. Many projects with a vision of real-time analytics simply can’t process the data fast enough, because infrastructure wasn’t taken into account when the solution was architected. If your analytics aren’t running in the same location as your data, you have to move the data and perform the analysis as a remote call. The time lost moving and processing data off-platform has a very real cost, especially on projects that need truly real-time responsiveness at their core.
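The arithmetic behind that cost is simple. The sketch below uses made-up timings (the 4 ms base transaction, 1 ms scoring, 8 ms network round trip, and 2 ms serialization figures are all hypothetical) just to show how the off-platform tax accumulates per transaction:

```python
# Illustrative latency budget: in-place scoring vs. an off-platform
# remote analytics call. All timings below are hypothetical, chosen
# only to demonstrate the arithmetic.

def total_latency_ms(base_transaction_ms: float,
                     scoring_ms: float,
                     network_round_trip_ms: float = 0.0,
                     serialization_ms: float = 0.0) -> float:
    """End-to-end time for one transaction including its analytics step."""
    return (base_transaction_ms + scoring_ms
            + network_round_trip_ms + serialization_ms)


# Scoring next to the data: only the model itself adds time.
in_place = total_latency_ms(base_transaction_ms=4.0, scoring_ms=1.0)

# Scoring on another platform: every call also pays the copy-and-move tax.
off_platform = total_latency_ms(base_transaction_ms=4.0, scoring_ms=1.0,
                                network_round_trip_ms=8.0,
                                serialization_ms=2.0)

print(f"in place: {in_place} ms, off platform: {off_platform} ms")
# in place: 5.0 ms, off platform: 15.0 ms
```

With these example numbers the remote call triples the latency, and that penalty is paid on every single transaction.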

And lack of speed is not the only cost of moving your data somewhere else to process it; your security is also threatened when a system requires you to take data off-platform for analysis. Disaster recovery, business continuity, and data integrity all come under pressure, as you are essentially losing control of the foundations of your project.

These pitfalls stem from an outdated understanding of the options for matching functional capabilities to the underlying IT infrastructure. We need frank discussions about the limitations of executing analytics outside a company’s core systems, and about what it really takes to deliver a competitive real-time solution.

The Right Approach: Move Analytics to the Data, Not the Other Way Around

When it comes down to it, every millisecond matters when injecting intelligence into operational systems. Look for ways to leverage your technology to perform transaction analysis in flight. When designing a project, start by locating the transactional data at its heart, understand the current SLAs for the operations associated with that data, and then ensure the added processing will not violate those SLAs. That is the clearest way to tell whether your analytics will run in true real time or sort-of real time, on fresh and accurate data or on aged data.
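The SLA check described above can be sketched as a single guard condition. The p99 timings, added scoring cost, and SLA threshold here are invented for illustration:

```python
# Sketch of the SLA check described above: given the current latency of an
# operation and the added cost of inline scoring, confirm the combined path
# still fits the SLA. All numbers are hypothetical.

def fits_sla(current_p99_ms: float, added_scoring_ms: float,
             sla_ms: float) -> bool:
    """True if adding in-flight analytics keeps the operation within SLA."""
    return current_p99_ms + added_scoring_ms <= sla_ms


# Comfortable headroom: 12 ms + 3 ms of scoring under a 20 ms SLA.
print(fits_sla(current_p99_ms=12.0, added_scoring_ms=3.0, sla_ms=20.0))   # True

# No headroom: the same scoring step would breach a tighter budget.
print(fits_sla(current_p99_ms=18.0, added_scoring_ms=5.0, sla_ms=20.0))   # False
```

Running this check per operation, before committing to a design, is what separates analytics that genuinely run in the transaction path from analytics that merely claim to.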

Infrastructure can make or break any analytics project—look to integrate your analytics processing directly on the same infrastructure as your core operational systems in order to save those milliseconds!

It’s all about knowing your options so you can understand the value you’ll be able to provide the business today and down the line. For more information, I recommend the “Decision Management Solutions” paper, and I’d be glad to discuss it further in the comments.


Worldwide Portfolio Marketing Manager, IBM z Systems Big Data and Analytics

