How IBM uses the mainframe to bring analytics back to the future – Part 2

In part 1, I recounted my use of a time machine (you really don’t have one?) to research how IT shops architected their business analytics at several specific points in history. I wanted to illustrate how events of the past prompted IBM to initiate Blue Insights—a project that consolidated all IBM internal analytics processing onto System z Linux.

At the end of part 1, I had just begun recounting my present-day discussion with Larry Yarter, the IBMer chiefly responsible for Blue Insights. I wanted to know whether this System z-based private cloud model was working, and I asked Larry for some tangible proof points. Here’s what he told me:

  • Immediate payback. Initial savings to the company, simply from unplugging all those old Brio servers and replacing them with System z, were calculated at roughly $25 million over five years. These were hard savings: floor space, power and cooling, networking gear, provider contracts and so on. Larry pointed out that another $35–$50 million in “soft” savings came from avoiding the project-by-project analytics infrastructure costs that would otherwise have been required to accommodate new transformational investments.
  • Cost avoidance. When Blue Insights first went live, Larry and his team were able to onboard a new project of 5,000 users in a couple of weeks for a cost of around $25,000. Today, a new project onboards in days for a cost of around $13,000. Compare this to a typical time of six to eight months and a cost of up to $250,000 to get the same project running on dedicated servers: we’re talking about providing the same capabilities at pennies on the dollar, and in a fraction of the time.
  • Extreme virtualization. Today Blue Insights supports approximately 500 projects, comprising 200,000 named users, and drawing on 300–400 data sources. All on—get this—just two production instances of Cognos installed on System z Linux. And the main reason there are two instances, not one, is to keep internal and business partner users separated. It only takes around two dozen staff to support and operate the entire environment.
  • Extreme insights. The analytics performed on these systems have generated hundreds of millions of dollars’ worth of insights for IBM. Over 3.3 million reports were run in 4Q2012, with a single-day peak of 89,386 reports on February 28 of last year.
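To put the “pennies on the dollar” claim above in perspective, here is a quick back-of-the-envelope calculation using the onboarding figures Larry quoted. The day counts are my own rough assumptions (a work week for “days,” the midpoint of “six to eight months” for the dedicated-server case), not numbers from the project:

```python
# Back-of-the-envelope comparison of the onboarding figures quoted above.
# Dollar figures come from the post; the day counts are rough assumptions.
shared_cloud_cost = 13_000   # Blue Insights onboarding cost today (USD)
dedicated_cost = 250_000     # typical cost on dedicated servers (USD)

shared_cloud_days = 5        # "days" -- assumed roughly one work week
dedicated_days = 7 * 30      # assumed midpoint of "six to eight months"

cost_ratio = shared_cloud_cost / dedicated_cost
print(f"Cost per dedicated-server dollar: ${cost_ratio:.2f}")  # about $0.05
print(f"Time speed-up: ~{dedicated_days / shared_cloud_days:.0f}x faster")
```

At roughly five cents on the dollar and an order-of-magnitude-plus reduction in elapsed time, “pennies on the dollar, in a fraction of the time” holds up arithmetically.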

I have far too many notes to fit into this brief blog post. I highly recommend that you check out the two-part video series describing the project, “IBM System z Blue Insights Buyers Journey”: Challenges (Part 1 of 2) and Solution (Part 2 of 2). You might also want to read this case study.

I asked Larry what the “next big thing” would be at Blue Insights. Larry described some really cool projects, such as integration of additional data and analytics services for “big” social data, but the one that really caught my attention was the planned introduction of analytics acceleration appliances.

Today, the majority of the hundreds of data sources analyzed with Blue Insights reside on DB2 for z/OS. So Larry and his team are in the process of installing IBM DB2 Analytics Accelerators to significantly speed up those workloads. I’ve seen firsthand how Swiss Re and Aetna have benefited from these appliances, and I was really excited to learn that IBM would be getting the same benefit! If you’re not familiar with the Analytics Accelerator, my colleague Niek de Greef has published an excellent blog post on the topic.

Blue Insights supplies IBM’s analysts with the same flexibility and support for creative analysis as before, but at a fraction of the cost and with more IT responsiveness. This is done by creating a clear distinction between well-defined services (the infrastructure, software and service level provided by Blue Insights) and business solutions (the data and creative analysis provided by users).

And although there are some clear and tangible cost savings produced by running this private, service-based cloud on System z, that’s not actually the main reason why Blue Insights is based on the mainframe. As Larry says, “we chose System z because it provides the best platform for enterprise-scale analytics as a share-all private cloud.”

Back to the Future

Back in the 1990s, analytics was not integral to business operations. It was acceptable—and in fact architecturally recommended—to copy all the data out to distributed systems and perform the analytics on a best-effort basis. These days this is no longer a viable model: analytics is becoming integral to the day-to-day performance of an enterprise, and organizations are starting to realize that to be truly effective and differentiated they must demand the same service levels from their analytics systems that they do from their operational systems (I described this shift in my MythBusters Episode I post).

Traditional, expensive data warehousing philosophies that copy data out of mainframes to distributed systems in formats specifically suited to analysts’ needs are being challenged. The advent of analytics appliances and in-memory techniques allows analytics to be performed closer to the source data, in its original format, in real time.

The mainframe is really the only platform equipped to deliver access to both real-time and historical information, on a single architecture, using a centralized suite of tools, while simultaneously managing both transactional and analytical processing of that data.

It’s 1976 all over again. But better!

How is your data center architected for analytics? Are you still following a 1990s distributed model? Are you having difficulty managing legions of stand-alone servers, as IBM was in 2007? If so, you may want to consider taking a trip back to the future and bringing that processing back to where the source data is—back to the mainframe.
