Published: 1 March 2024
Contributors: Stephanie Susnjara, Ian Smalley
Mainframes are data servers designed to process up to 1 trillion web transactions daily with the highest levels of security and reliability.
At their core, mainframes are high-performance computers with large amounts of memory and processors that carry out billions of simple calculations and transactions in real time. A mainframe computer is critical to commercial databases, transaction servers and applications that require high resiliency, security and agility.
Since the advent of the internet and the rise of cloud computing, some may think of the mainframe as a tech dinosaur. On the contrary, the mainframe has evolved to keep pace with other technologies and plays a vital role in IT infrastructure.
According to a recent IBM report, 45 of the top 50 banks, 4 of the top 5 airlines, 7 of the top 10 global retailers and 67 of the Fortune 100 companies use the mainframe as their core platform. Moreover, a study from the IBM Institute for Business Value (IBV) showed that mainframes handle almost 70% of the world’s production IT workloads, and 70% of executives surveyed believe mainframe-based applications are central to their business strategy.
The term mainframe initially referred to the large cabinet or ‘main frame’ that held the central processing unit (CPU) of early computer systems. The mainframe served as a central data repository or ‘hub’ linking workstations or terminals in an organization’s data processing center. A centralized computing environment has given way to a more distributed computing environment as mainframes have become smaller and gained more processing power to become more flexible and multipurpose. Today’s mainframes process and store massive amounts of data and are called enterprise servers (or data servers).
Early mainframe systems filled room-sized metal frames that could occupy 2,000 to 10,000 square feet. These enormous machines required vast amounts of electrical power, air conditioning and rows of input/output (I/O) devices. Today’s mainframes are much smaller than the early “Big Iron” machines and are about the size of a large refrigerator. The latest models (e.g., the IBM z16 single-frame system with a standard 19” rack) are built to integrate easily with other IT infrastructure and systems in a modern data center, whether that means an on-premises data center at a company’s physical location or a cloud data center.
Designed beginning in 1937 and completed in 1944, the Harvard Mark I, or IBM Automatic Sequence Controlled Calculator, is widely regarded as the first mainframe computer. The US Navy Bureau of Ships used the machine during the final years of World War II to solve mathematical problems quickly.
In 1951, the UNIVAC I, designed by the Eckert-Mauchly Computer Corporation (EMCC), became the first commercial mainframe. Soon after, in 1953, IBM introduced its first mainframe designed for commercial business use, the IBM 701 Electronic Data Processing Machine. The company’s first electronic computer, the 701 was roughly 25 to 50 times faster than its predecessors, offering greater computing power and memory capacity in a smaller footprint.
Other US manufacturers of large-scale commercial computers during the 1950s included Burroughs, Datamatic, GE, RCA and Philco.
The first modern mainframe, the IBM System/360, hit the market in 1964. Within two years, the System/360 dominated the mainframe computer market as the industry standard. Prior to this machine, software had to be custom-written for each new machine, and there were no commercial software companies. The System/360 separated software from hardware, and for the first time, software written for one machine could run on any other machine in the line.
While many associate virtualization with cloud computing, commercial virtualization technology started on the mainframe as a way to logically divide system resources among a large group of users. Before virtualization, mainframe IT professionals relied on keypunches, batch jobs and a single operating system to carry out IT operations. In 1964, IBM began developing CP/CMS, which paired the first hypervisor, the Control Program (CP), with a lightweight single-user operating system (CMS). CP created virtual machines (VMs) that virtualized the underlying hardware, increasing efficiency and reducing costs.
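As an illustration only (none of this is CP/CMS code), the core idea of a hypervisor, carving one machine's physical resources into isolated slices for each virtual machine, can be sketched in a few lines. The class name and memory sizes below are invented for the example:

```python
# Toy sketch of resource partitioning, the job a hypervisor performs:
# one pool of physical memory is divided into isolated per-VM slices,
# much as CP logically divided a mainframe among many users.

class ToyHypervisor:
    def __init__(self, total_memory_mb):
        self.total = total_memory_mb
        self.allocated = {}  # vm_name -> memory slice in MB

    def free_memory(self):
        return self.total - sum(self.allocated.values())

    def create_vm(self, name, memory_mb):
        # refuse to overcommit: a VM only gets memory that is actually free
        if memory_mb > self.free_memory():
            raise MemoryError(f"not enough memory for {name}")
        self.allocated[name] = memory_mb
        return name

hv = ToyHypervisor(total_memory_mb=4096)
hv.create_vm("vm1", 1024)
hv.create_vm("vm2", 2048)
print(hv.free_memory())  # 1024
```

Real hypervisors add scheduling, device emulation and hardware-assisted isolation, but the accounting idea, a fixed pool shared safely among tenants, is the same.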
Introduced in 1970, the IBM System/370 marked IBM’s first move from magnetic ferrite-core memory to silicon memory chips for storing data and instructions, since the chips operated faster and required much less space. Six months after the System/370 launched, the phrase “Silicon Valley” first appeared in print in an issue of Electronic News.
Other significant manufacturers in the mainframe market during the 1970s and 1980s included Fujitsu, Hewlett-Packard, Hitachi, Honeywell, RCA, Siemens and Sperry Univac. During this time, the mainframe industry continued to advance with smaller machines, improved I/O performance, larger memories and multiple processors, allowing functionality and capacity to grow.
In the 1990s, as the use of the personal computer and other technologies accelerated, some analysts predicted the end of the mainframe. In 1991, InfoWorld analyst Stewart Alsop famously said, “I predict that the last mainframe will be unplugged on March 15, 1996.”
Yet the mainframe survives as core IT infrastructure across industries. In April 2022, IBM unveiled the latest generation of the IBM Z family, the z16, featuring the IBM Telum™ processor with industry-first on-chip integrated accelerators that predict and automate with AI at unprecedented speed and scale, and with extremely low latency.
Early mainframes like the S/360 had a single processor, the central processing unit (CPU), while today’s mainframes have a central processor complex (CPC) consisting of specialty processors designed for specific purposes.
The modern mainframe contains network, crypto, storage and compression cards with their own processors and memory. It also houses system assist processors (SAPs) that speed up data transfer between the operating system and I/O devices, as well as processors for running Linux™, Java™ and other workloads. This setup allows the mainframe to deliver peak utilization continuously while handling high throughput volumes.
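By way of loose analogy (this is ordinary Python, not mainframe code), the benefit of assist processors can be sketched as a dedicated worker pool that absorbs I/O requests so the main code path never stalls waiting on them. The function names and data below are invented for the example:

```python
# Analogy to system assist processors (SAPs): hand I/O to a dedicated
# pool of workers so the "main processor" keeps computing in parallel.
from concurrent.futures import ThreadPoolExecutor

def slow_io(request):
    # stand-in for a disk or network read
    return f"data-for-{request}"

io_pool = ThreadPoolExecutor(max_workers=4)  # the "assist processors"

def handle(requests):
    # submit every I/O request to the assist pool...
    futures = [io_pool.submit(slow_io, r) for r in requests]
    # ...and keep doing useful compute work while the I/O is in flight
    checksum = sum(range(1_000))
    # collect the I/O results once the compute work is done
    return checksum, [f.result() for f in futures]

checksum, results = handle(["a", "b"])
print(results)  # ['data-for-a', 'data-for-b']
```

The mainframe does this in silicon rather than software, but the design principle is the same: specialized resources keep the general-purpose processors saturated with real work.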
This large number of processors supports businesses across industries (e.g., government agencies, utility companies, financial institutions, healthcare organizations) that rely on large-scale transaction processing to handle massive data workloads, high-volume financial transactions and more. Today’s mainframe solutions are also designed to support cloud computing, data management, big data and analytics, artificial intelligence (AI) and quantum computing, with extensions and integration layers that connect to core systems.
The longstanding value associated with mainframes centers on reliability, availability and serviceability (RAS).
The mainframe’s hardware has extensive self-checking and recovery in place.
The system can recover from a failed component without impacting the rest of the system. Today’s mainframes provide continuous high availability and rapid disaster recovery to protect against downtime.
The mainframe can determine why a failure occurred, allowing hardware and software elements to be replaced with as little impact on the operational system as possible.
Modern mainframe computers also deliver the following unique benefits.
Run standard operating systems like Linux, specialized operating systems, and software that utilize unique hardware capabilities.
Support massive simultaneous transactions, data processing and throughput (I/O) with built-in capacity on demand and built-in shared memory for direct application communication.
Deliver the highest levels of security with built-in cryptographic cards and innovative software that leverages artificial intelligence and machine learning (ML) solutions to help detect cyberattacks or fraud. For instance, modern mainframes can execute up to 1 trillion secure web transactions daily and manage privacy by policy.
Offer resiliency through multiple layers of redundancy for every component (power supplies, cooling, backup batteries, CPUs, I/O components, cryptography modules) and testing for extreme weather conditions.
A mainframe acts as a server for storing and processing data at high speeds and can carry out millions of instructions concurrently. In contrast, supercomputers are much faster, capable of executing billions of floating-point operations per second. Supercomputers perform massively calculation-intensive work for weather forecasting, climate research, molecular modeling, physical simulations and more.
Today’s organizations embrace cloud and distributed architectures that support digital innovation to create a competitive advantage. Cloud-based environments are not a replacement for mainframes.
Instead, the two systems have merged to form a holistic digital transformation strategy. To that end, modernizing mainframe-based applications has become an essential part of today’s enterprise hybrid cloud approach, which combines and unifies on-premises, public cloud, private cloud and edge settings to create a single, flexible IT infrastructure.
By integrating and extending mainframe capabilities into a hybrid cloud environment, businesses can choose the best environment for their workloads (whether in the cloud or on premises) to maximize each platform’s innovations, technical advancements, security and resiliency. For instance, an airline company can create an application for customers to manage their travel information, such as cloud-based reservation information. The service can also access data held on mainframes, such as changes in flight arrival and departure times. This service doesn’t replace existing mainframe capabilities. Instead, it allows the customer to draw from the best of both worlds: data residing in the cloud and on premises.
Application modernization—the process of transforming monolithic legacy applications into cloud applications built on microservices architecture—starts with assessing current legacy applications, data and infrastructure and applying the right modernization strategy to achieve the desired result. While it’s possible to lift and shift applications, restructuring the application to take advantage of cloud-native technologies (e.g., containers and Kubernetes) can often deliver more business value.
Today, mainframe application modernization works hand in hand with DevOps, the set of processes and practices for automating the work of software development and IT operations teams. In a recent IDC survey, 82% of organizations reported taking steps to prioritize using the same application development tools across mainframe and cloud-native environments. Development teams are leveraging DevOps and DevSecOps practices and delivering applications through automated and integrated pipelines for faster, more agile delivery of software releases and updates across targeted hybrid cloud environments.
Sustainability has become a business imperative. With data centers accounting for approximately 1% of global energy consumption, large organizations are looking for ways to reduce IT energy usage as part of their ESG initiatives. Modern mainframes use less energy and have a reduced real estate footprint, helping to improve data center efficiency.
In a report from Allied Market Research, the global mainframe market was valued at USD 2.9 billion in 2022 and is projected to reach USD 5.6 billion by 2032, growing at a CAGR of 7.3%.
Here are just a few examples of industries that depend on mainframes.
Banks must process large volumes of transactions—from credit card transactions to ATM withdrawals to online account updates. Mainframes provide the data processing power to deliver these services at scale. Next-generation transactions and technologies like blockchain rely on mainframes for the speed, scale and security levels they provide.
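The durability banks depend on comes from atomic transactions: a transfer either commits completely or rolls back completely, so accounts never end up half-updated. A minimal sketch of that guarantee, using Python's built-in sqlite3 as a stand-in for a mainframe transaction server (the account names and amounts are invented):

```python
# Sketch of the atomicity guarantee behind banking workloads:
# a transfer either fully happens or fully doesn't.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])
conn.commit()

def transfer(conn, src, dst, amount):
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            (balance,) = conn.execute(
                "SELECT balance FROM accounts WHERE name = ?", (src,)).fetchone()
            if balance < 0:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
        return True
    except ValueError:
        return False

transfer(conn, "alice", "bob", 30)   # succeeds and commits
transfer(conn, "alice", "bob", 500)  # fails: the debit is rolled back
print(dict(conn.execute("SELECT * FROM accounts")))  # {'alice': 70, 'bob': 80}
```

A mainframe applies the same all-or-nothing discipline to millions of concurrent transactions with hardware-level redundancy; the sqlite3 context manager here simply makes the commit/rollback boundary visible.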
The most prominent insurance companies worldwide use mainframes to securely handle massive amounts of sensitive data like patients’ personally identifiable information (PII), medical records and billing information.
Many critical government services, from law enforcement to national security, rely on mainframe systems for the best mix of security, performance and resilience. These agencies need resiliency against system failures and security breaches, and the modern mainframe platform uses AI to gain value from data more quickly and increase cybersecurity efficiency.
Online retailers depend on mainframe systems for enormous processing power that supports transactions across mobile and other devices at scale.
IBM z16™ is the latest iteration of IBM Z® mainframes with on-chip AI inferencing and industry-first quantum-safe technologies.
IBM z/OS® is an operating system (OS) for IBM Z® mainframes, suitable for continuous, high-volume operation with high security and stability.
IBM LinuxONE is an enterprise-grade Linux® server that brings together IBM’s experience in building enterprise systems with the openness of the Linux operating system.
System software designed for hybrid cloud – with the security, resiliency, AI, and application modernization you need.
Discover technology services that help you plan, deploy, optimize and refresh your hybrid cloud and data center infrastructure.
This guide provides a high-level overview of IBM’s strategy to help you modernize applications faster, at lower cost and risk, using IBM® zSystems and public cloud solutions.
Achieving a modernized business is easier than ever before with the IBM Z and Cloud Modernization Center.
Supercomputing is a form of high-performance computing that uses a powerful computer to reduce overall time to solution.
Artificial intelligence, or AI, is technology that enables computers and machines to simulate human intelligence and problem-solving capabilities.
Digital transformation takes a customer-driven, digital-first approach to all aspects of a business, including its business models, customer experiences, processes and operations.
Virtualization enables the hardware resources of a single computer—processors, memory, storage and more—to be divided into multiple virtual computers, called virtual machines (VMs).