Cadence is a world leader in computational software, helping companies design the microchips that drive today’s most important emerging technologies: new types of AI, autonomous vehicle systems, and advanced technology to aid people with disabilities, to name just a few examples.
When IBM developed the world’s first microchip nodes at the two-nanometer (nm) scale, it was the latest milestone in the ongoing push to pack more circuitry into tinier and tinier pieces of silicon. The 2nm node holds the promise of revolutionary energy efficiency and huge leaps forward in technology.
But at such a small scale, designing differentiated circuitry onto chips—accomplished by electronic design automation (EDA)—requires incredibly complex processing and unprecedented degrees of precision and computation.
IBM recently deepened our partnership with Cadence to leverage Cadence’s core EDA and systems portfolio, including digital, analog and verification software as well as 3D IC packaging and system analysis solutions. As a result, we can better deliver innovative solutions to clients in the areas of chiplets and packaging, logic technology, and design and enablement on cloud.
“Cadence and IBM have been collaborating on chip design for a number of years, and our close partnership on silicon development and the recent work with Cadence on IBM Cloud reflect the deepening relationship,” said Vandana Mukherjee, Senior Research Manager, Hybrid Cloud and IBM Semiconductors.
IBM is deploying Cadence AI-enabled digital implementation technology to design our leading silicon targeting storage and encryption. The Cadence 3D IC package design tools are a critical element of the system design, enabling analysis of both electrical and thermal issues. Leveraging Cadence’s front-end-to-back-end solutions, the IBM team is developing methodologies to analyze and verify the overall system function and performance for intelligent system designs.
Enabling EDA workloads in the cloud
Cadence is also one of the early adopters of IBM solutions, having deployed IBM Cloud® for over three years to support its internal software development and EDA workloads.
“Compute is like oxygen for us,” says Tarak Ray, Cadence CIO. “We have about 10,000 engineers and 5 major in-house data centers where millions of jobs are implemented every month.”
The pressure for more computing power comes from everywhere at once. The market demands more chips every year. The growing incorporation of AI into end products and increasing demand for customization require new sophistication in circuit design. And almost every chip design is in a tight race to market. Cadence builds AI and machine learning (ML) into its EDA processes to help engineers work faster, but those routines need more CPU power, too.
So how does Cadence meet this demand?
Scaling up the company’s data centers isn’t the ideal solution. In addition to space limitations, Ray notes that there are other challenges to consider. “We have to buy the servers, and there is a lead time to install. It’s a month minimum for the servers. For network gear, it could take longer. And our engineers are asking for the compute now,” he says.
The obvious answer is the cloud. But in EDA, it’s not so simple. An EDA cloud needs a new level of agility, allowing massive workloads—involving millions of computations and data volumes in the terabytes or petabytes—to shift seamlessly between on-premises and cloud environments. It also needs to be very flexible, allowing for deep differentiation from project to project, because the demand for custom chips means that different projects need different types of servers and platforms. And it must be secure. As Cadence Senior Vice President and General Manager of the System & Verification Group Paul Cunningham says: “The data being processed by our tools contains some of the most valuable trade secrets of our customers. Security is essential.”
The solution must also deliver excellent computing power per cost. As recently as four years ago, Ray explains, EDA providers had very low utilization of core processing resources due to complications with server configurations and distributing multiple workloads globally. At the CPU volumes required by EDA, unutilized resources add up to significant cost inefficiency. Cadence aimed to boost utilization considerably.
Cadence strategically designed its solution with a blend of on-premises and multicloud-based compute resources, including cloud high-performance computing (HPC) from IBM. “IBM Cloud truly understands hybrid networking and the challenges of the EDA industry,” says Ray. “It provides a choice of bare metal servers and virtual servers; it has storage, data movement and synchronization capabilities, networking, choice of firewalls, and robust virtual private cloud (VPC) options … it’s the whole solution. It gives us additional capacity with minimum to no disruption. If we need to burst a huge compute capacity, we can get it.”
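As a rough illustration of what bursting extra capacity can look like in practice, the sketch below uses the IBM Cloud VPC Python SDK (ibm-vpc) to provision one additional virtual server instance in an existing VPC. It is a minimal example under stated assumptions, not a description of Cadence’s actual provisioning workflow: the API key, region, instance profile and resource IDs are placeholders, and in a real HPC environment a workload scheduler would typically trigger this kind of provisioning automatically.

```python
# Minimal sketch: bursting one additional virtual server into an existing
# IBM Cloud VPC with the IBM Cloud VPC Python SDK (pip install ibm-vpc).
# All IDs, the region and the instance profile are placeholders, not values
# from the Cadence deployment described in this story.
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_vpc import VpcV1

authenticator = IAMAuthenticator("YOUR_IBM_CLOUD_API_KEY")  # placeholder
vpc_service = VpcV1(authenticator=authenticator)
vpc_service.set_service_url("https://us-south.iaas.cloud.ibm.com/v1")  # placeholder region

instance_prototype = {
    "name": "eda-burst-worker-001",
    "zone": {"name": "us-south-1"},
    "profile": {"name": "bx2-16x64"},       # placeholder VPC instance profile
    "vpc": {"id": "VPC_ID"},                 # existing VPC (placeholder)
    "image": {"id": "IMAGE_ID"},             # OS image (placeholder)
    "primary_network_interface": {
        "subnet": {"id": "SUBNET_ID"}        # existing subnet (placeholder)
    },
    "keys": [{"id": "SSH_KEY_ID"}],          # SSH key for access (placeholder)
}

# Request the new instance; the returned document includes its ID and status.
response = vpc_service.create_instance(instance_prototype)
print("Provisioned burst instance:", response.get_result()["id"])
```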
The IBM computing resources help power three workstreams in Cadence’s computing environment: design of chips for Cadence’s own solutions, systems verification services for external customers’ designs, and development of Cadence EDA tools.
Cadence and IBM built the solution around these primary components:
To meet Cadence’s security requirements, Cadence and IBM integrated the solution components with IBM Cloud Activity Tracker and IBM Cloud Security and Compliance Center for detection, auditing and reporting around security-related events and compliance.
Ray adds: “IBM has a very good technical team. They know what they are doing, and they are extremely focused on making the customer successful. They understood our vision, and they worked closely with us to design this solution to meet our strategic goals.”
Cunningham explains the impact the solution can have on time to market. “You’re running a program to deliver a new silicon product, and you’re in the crunch time where you need to deliver a certain quality of results, meet verification goals or make sure you’ve done enough testing. Time is of the essence, and you want to burst to the cloud as quickly as possible. Several years ago, there was no way to do this. You had to deal with hurdles either around security or lift and shift or other things. Now, with Cadence OnCloud, we can really do it.”
IBM Cloud HPC is designed to deliver increased storage performance, greater compute power and higher levels of security. With these capabilities, we’re helping Cadence drive overall efficiency and improve the performance of its computational software workloads.
“IBM Cloud and Cadence have co-created an EDA-as-a-service experience to meet our clients’ demands for a frictionless experience with flexible capacity,” said Christopher Rusert, Worldwide Leader for High Performance Computing at IBM. “Through this partnership, we offer clients a flexible path to cloud for EDA application modernization.”
Not only is more compute power more accessible than ever before, but it’s more cost-effective, too. Using the IBM Cloud HPC platform, Cadence surpassed its high utilization goal. Now, Cadence will bring these improvements to the market, using the IBM Cloud HPC solution to back its customer-facing EDA SaaS platform, Cadence OnCloud.
Enter Mahesh Turaga, Cadence’s Vice President of Cloud Business Development. In his role, Turaga has a keen understanding of the market for cloud.
“We are on the cusp of a massive cloud adoption,” says Turaga, adding that Cadence has seen some large companies commit to moving their entire EDA workloads to the cloud over the last few years. Cadence has already seen significant adoption of its Cadence OnCloud portfolio by small and midsize customers that have successfully done hundreds of tapeouts—the compute-intensive process of producing the final photomask of an integrated circuit—in the cloud. He explains that the shift now will be driven in no small part by the emergence of generative AI (gen AI) and large language model (LLM) technologies: “Now, with gen AI and LLM, you are looking at next-generation productivity improvements across the board. The possibilities are endless. I think it’s going to make all of us incredibly productive. It will also drive the need for the infinite elasticity available in the cloud.”
But what about cost? Though the bill for infinite elasticity may be higher than what a company pays for on-premises resources, “It’s comparing apples to oranges,” says Turaga. “Look at it from a business transformation standpoint.” He mentions a Cadence customer that reduced its time to market for its chips by two months by shifting EDA workloads to the cloud. “That’s the business value you need to consider. If you can accelerate your engineers’ productivity by a certain percentage, what does that do to your bottom line? For a typical 5nm project, most of the cost is engineering effort. Even if you make your engineers 10% more productive, you’ll have a huge advantage.”
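To make the arithmetic behind that point concrete, here is a small back-of-the-envelope calculation. Every figure in it is a hypothetical assumption chosen for illustration, not a number from Cadence, IBM or any customer project.

```python
# Illustrative back-of-the-envelope calculation with hypothetical numbers;
# none of these figures come from Cadence or IBM.
engineers = 200                            # assumed team size for a 5nm project
cost_per_engineer_per_year = 250_000       # assumed fully loaded annual cost (USD)
project_years = 2                          # assumed project duration

engineering_cost = engineers * cost_per_engineer_per_year * project_years
productivity_gain = 0.10                   # the 10% improvement cited in the quote

# Effort (and therefore cost) avoided if the same work is completed 10% faster.
effort_saved = engineering_cost * (1 - 1 / (1 + productivity_gain))

print(f"Total engineering cost: ${engineering_cost:,.0f}")
print(f"Approximate effort saved at 10% higher productivity: ${effort_saved:,.0f}")
```

Even with these modest assumptions, a 10% productivity gain translates into millions of dollars of engineering effort, before counting the revenue impact of reaching the market two months sooner.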
Cadence Design Systems, Inc. is a pivotal leader in electronic systems design, building on more than 30 years of computational software expertise. The company applies its underlying Intelligent System Design strategy to deliver software, hardware and IP that turn design concepts into reality. Cadence customers are innovative companies delivering electronic products—from chips to boards to complete systems—for the most dynamic market applications, including hyperscale computing, 5G communications, automotive, mobile, aerospace, consumer, industrial and healthcare. For nine years in a row, Fortune magazine has named Cadence one of the 100 Best Companies to Work For.
© Copyright IBM Corporation 2024. IBM, the IBM logo, Aspera, IBM Cloud, IBM Spectrum, and LSF are trademarks or registered trademarks of IBM Corp., in the U.S. and/or other countries. This document is current as of the initial date of publication and may be changed by IBM at any time. Not all offerings are available in every country in which IBM operates.
Red Hat and OpenShift are registered trademarks of Red Hat, Inc. or its subsidiaries in the United States and other countries.
Client examples are presented as illustrations of how those clients have used IBM products and the results they may have achieved. Actual performance, cost, savings or other results in other operating environments may vary.