John Duigenan is the new global chief technology officer for Financial Services at IBM. Before being named to the post last November, Duigenan spent years as an IBM distinguished engineer, partnering with some of the largest banks in the world. With the financial services industry now rushing to leverage cloud services in an extraordinary time, Duigenan sat down with Industrious to discuss how banks can overcome today’s most pressing challenges. (Read the first half of this interview, about how technical debt weighs on balance sheets and talent, here.) 

What’s it like helping legacy banks finally, safely, comfortably make the jump to cloud? Risk and regulation are huge here, right?

Duigenan

Our teams that partnered with Bank of America focused on their needs and pain points. We heard exactly what their challenges were and were able to say, “If we can do this, will that work for you? If we can build the cloud services that have compliance controls that you can customize and monitor, would that be useful for you?” Clearly, the answer was yes.

Regulated industries, whether it’s financial services, insurance, pharmaceuticals—they have specific regulatory needs. That’s not recognized by generic cloud providers, who just keep selling and selling cloud like it’s a magic service.

So how do banks ensure they’re meeting regulatory and security standards on their clouds?

Clouds can all be engineered to be safe, but it takes a massive amount of work to do so. The responsibility for doing that is most often placed with the clients. A cloud provider will say, “You can use my nifty encryption services, but as a cloud service consumer, it’s your responsibility to get it right. And if you don’t, it’s your fault.” Companies have discovered this the hard way.

IBM takes a different position. We know you’re a regulated firm. We are the enterprise hybrid cloud provider. So we have created regulatory configurations to ensure you can never deploy an insecure service where data, for example, could be accessed in an unencrypted form. We create a policy. We ensure that policy is enforced through technical and operational controls. We ensure that it is monitored. If something changes and a configuration falls out of compliance, we automatically fix it and bring it back into compliance.

All of this is captured in audits and reports that can be used to demonstrate to a regulator or an auditor that the cloud configuration is compliant.
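To make that define-enforce-monitor-remediate loop concrete, here is a minimal, hypothetical sketch in Python. The resource model, the policy check, and the remediation step are illustrative assumptions, not IBM’s actual controls; a real regulated-cloud control plane is far more involved.

```python
# Hypothetical sketch: a policy ("data at rest must be encrypted with
# customer-managed keys") is checked against every resource, drift is
# auto-remediated, and every decision is recorded for the audit trail.

from dataclasses import dataclass

@dataclass
class StorageBucket:
    name: str
    encryption_enabled: bool
    customer_managed_keys: bool

def is_compliant(bucket: StorageBucket) -> bool:
    """Policy check: encryption on, keys managed by the customer."""
    return bucket.encryption_enabled and bucket.customer_managed_keys

def remediate(bucket: StorageBucket) -> None:
    """Force the resource back into the compliant configuration."""
    bucket.encryption_enabled = True
    bucket.customer_managed_keys = True

def monitor(buckets: list[StorageBucket]) -> list[str]:
    """Scan the fleet, fix any drift, and return audit-trail entries."""
    audit_log = []
    for bucket in buckets:
        if not is_compliant(bucket):
            remediate(bucket)
            audit_log.append(f"{bucket.name}: drifted out of policy, remediated")
        else:
            audit_log.append(f"{bucket.name}: compliant")
    return audit_log

if __name__ == "__main__":
    fleet = [
        StorageBucket("payments-data", True, True),
        StorageBucket("loan-docs", True, False),  # drifted: no customer-managed keys
    ]
    for entry in monitor(fleet):
        print(entry)
```

The audit log produced at the end is the piece a bank would hand to a regulator: evidence not only that the policy exists, but that drift was detected and corrected.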

You mentioned secure environments. What are the biggest security issues facing the industry?

The first is privacy. Information has been an absolute free-for-all until now, and consumers have had incredibly little choice; they have been completely neglected. Social media companies know more about you than you know about yourself. So privacy becomes a huge issue. Banks already have controls around the information they keep, but not nearly enough. They’ll have to offer privacy choices to their clients. They’ll have to ensure that those choices about privacy are reflected across their entire ecosystem.

The second issue is resiliency. There will be regulation demanding specific levels of resilience, with ways for banks to deploy their applications that minimize risk. That means a demonstrable change in operational policies and practices for how you run a bank and how you run your applications. You can just tell that regulators are sharpening their knives, not just on existing regulation but on new regulation as well. The regulatory and cybersecurity sensitivity around cloud is very high now because regulators all over the world are acutely focused on these issues.

Banks are obviously doing these things because they matter to consumers, but privacy and security are important to regulators, too. What other regulatory pressures are banks facing?

One emerging theme is concentration risk. In this case, it’s the risk inherent in running applications on just one cloud versus multiple ones. One of the things that banks will have to demonstrate is an exit strategy from a cloud provider. That’s why we think programs like Red Hat OpenShift on our cloud, which allows for easily portable containerization of programs and data, are so important. Or a bank could extend one cloud provider’s services into other cloud providers through IBM Cloud Satellite. With either of these approaches, which make it easy to move workloads from one cloud to another, you’ve immediately minimized concentration risk.

We empower our clients to go faster by removing complexity and building in all the standardization they need, regardless of the cloud that they’re running on.
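The portability idea behind that exit strategy can be sketched simply: a workload described once as a container manifest can be applied, unchanged, to clusters running on different clouds. The context names and manifest path below are assumptions for illustration; any OpenShift or Kubernetes clusters reachable from your kubeconfig would work the same way.

```python
# Hypothetical sketch: apply the identical manifest to two clusters,
# each hosted on a different cloud, using standard kubectl contexts.

import subprocess

MANIFEST = "payments-service.yaml"               # one declarative definition of the workload
CLUSTERS = ["ibm-cloud-prod", "other-cloud-dr"]  # hypothetical kubeconfig context names

def deploy_everywhere(manifest: str, contexts: list[str]) -> None:
    """Apply the same manifest to each cluster, regardless of which cloud hosts it."""
    for ctx in contexts:
        subprocess.run(
            ["kubectl", "--context", ctx, "apply", "-f", manifest],
            check=True,
        )

if __name__ == "__main__":
    deploy_everywhere(MANIFEST, CLUSTERS)
```

Because the workload definition is the same everywhere, moving off any single provider is a matter of pointing at a different cluster, which is exactly the exit strategy regulators want banks to be able to demonstrate.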
