November 4, 2014 | Written by: Paul DiMarzio
My generation grew up in a world where the institutions with which we interacted held all the cards. We were told what we wanted, how we could get it and how much it would cost. And if we weren’t happy, well, there really wasn’t any great mechanism for expressing that.
From a business perspective, offers would be created based on fairly coarse-grained demographics, and then “mad men” and sales teams would be unleashed to generate interest and close deals. It was a very controlled, segmented, batch-oriented world, and the information technology (IT) systems put in place to support the business mirrored this worldview.
The Internet started to shift the base of power from business to consumer, and mobile devices completed this shift. The way in which we interact with the world has permanently changed, and everything, from how we transact with merchants, to how we entertain ourselves, to how we manage our finances, is pretty much under our control now.
Leading-edge enterprises understand this and know that they have to firmly integrate context-aware analytics with their transactional systems in order to drive growth by serving the “demographic of one” more effectively. The focus is now on the customer experience, and the new objective is understanding customer wants and needs and then crafting offers that match them.
Unfortunately, knowing what to do and actually doing it are two very different things. Study after study shows that most organizations still struggle to cater to the individual. C-suite executives talk about serving that demographic of one, but most still blast out undifferentiated campaigns to everyone. They want to use data to drive growth, yet only a small percentage are truly satisfied with their efforts so far. Most organizations know where they want or need to head but are having a hard time getting there. I believe this is because their IT systems have not kept pace with the seismic shift that has occurred in the business world.
The problem revolves around how most organizations view data. Effectively serving the individual typically requires a holistic joining of three views of data:
- The real-time view that serves the transactions, what is happening right now
- The historical view that serves an understanding of what went on in the past
- The predictive view that serves a projection of what is likely to happen in the future based on patterns from the past coupled with knowledge about the present
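The interplay of these three views can be sketched in code. The following is a hypothetical, simplified illustration in Python (not an actual System z product or API; all names and thresholds are invented): the historical view informs a toy predictive score, which gates the real-time transaction at the moment it happens rather than after the fact.

```python
# Hypothetical sketch: joining the three views of data at the moment a
# card transaction is authorized. Names and thresholds are illustrative;
# a real system would call a trained model, not this toy heuristic.

from dataclasses import dataclass, field
from statistics import mean


@dataclass
class CustomerHistory:
    """Historical view: what this customer has done in the past."""
    past_amounts: list = field(default_factory=list)

    def average_spend(self) -> float:
        return mean(self.past_amounts) if self.past_amounts else 0.0


def predictive_score(amount: float, history: CustomerHistory) -> float:
    """Predictive view: a toy anomaly score based on how far this
    transaction deviates from the customer's past behavior."""
    avg = history.average_spend()
    if avg == 0.0:
        return 0.5  # no history yet: return a neutral score
    deviation = abs(amount - avg) / avg
    return min(1.0, deviation / 10.0)


def authorize(amount: float, history: CustomerHistory,
              threshold: float = 0.8) -> tuple[bool, float]:
    """Real-time view: score the transaction that is happening right
    now, in-line with the authorization decision itself."""
    score = predictive_score(amount, history)
    approved = score < threshold
    if approved:
        # Approved transactions feed back into the historical view.
        history.past_amounts.append(amount)
    return approved, score


history = CustomerHistory(past_amounts=[40.0, 55.0, 48.0])
approved, score = authorize(60.0, history)   # in line with past spend
print(approved)                              # True
```

The point of the sketch is where the scoring call sits: inside `authorize`, at the moment of engagement, rather than in a separate batch system that reports on the transaction hours later.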
The issue is that the topology you find on most IT floors does not hold to this holistic view. You find a highly fragmented infrastructure. IBM System z mainframes are thought to hold 70 to 80 percent or more of the world’s transactional data—the real-time view—but that data is then copied off-platform, usually multiple times, to create those historical and predictive views. It’s this fragmentation that prevents most clients from moving forward with their growth and optimization objectives.
The image above is a photo of a whiteboard representing a portion of a data lifecycle review that we did for a client. They wanted to incorporate real-time scoring (shown here by the red circle) into a card processing system. With the distributed approach they had taken for their systems, it was simply impossible.
Latency, complexity, lack of data controls—these are the things that cause the misalignment between business objectives and IT reality. Not to mention the extreme cost involved in pushing all that data around. This sort of topology served the old world well, but to push forward into the fully connected world requires a different way of thinking.
In order to really serve that demographic of one, business leaders require real-time, accurate insights in the context of operational business processes—at the moment of engagement with the customer, not after the fact. This requires a different way of thinking about transactions, analytics and data: they must be fully integrated both on the IT floor as well as within the context of the business process. This is the only way to eliminate the latency, complexity—and cost—of pushing data around to align with an outdated view of the world.
In addition to the fact that System z holds 70 to 80 percent of the world’s corporate data, 55 percent of enterprise apps touch System z data and 91 percent of new client-facing apps will require System z to complete transactions. Given that the mainframe is the data and transaction hub for the global economy today, it must be front and center in the design of true real-time analytics systems because that’s the only way you can possibly execute analytics where the transactions are taking place.
The System z team has been driving our technology roadmap to deliver products that enable insights on every transaction that we process, enabling our clients to truly begin to integrate analytics into the flow of business. Something very new and very important is happening in the System z world!
We’re tightening up the create-transform-report-analyze lifecycle of data to squeeze out the latency, complexity and cost by collapsing the necessary views of data into one holistic view. We’re also driving the right tools into the platform to blur the lines between transactions and analytics. When the data is on System z, the analytics should be fused directly into the operational systems on System z that act on that data.
If you still hold to the old adage that "you can't do analytics on the mainframe," I encourage you to take a look at what you've been missing. You may never think of operations and analytics as distinct again!