To most, platform enablement means a robust, open platform that can make a difference. It means applications can be deployed and accessed more quickly across the entire enterprise. It means getting work done more intelligently. Platform enablement supports the cloud. It acts as the data synthesizer, integrating with existing applications and systems, ingesting and connecting huge data sets securely, and supporting engagement over multiple protocols. It’s all that and then some.

At the Think 2019 conference, James Lang, General Manager of Platform Enablement at Toyota Financial Services, shared his perspective on the digital economy. Toyota Financial Services is a $100 billion finance company that offers Lexus and Toyota auto loans to millions of consumers and dealers. James is responsible for making sure that data and integration are in place to support all of the company’s critical business needs.

“As a financial services company, we’re a fintech company, and so data is our currency going into a digital economy. And so, having our mainframe data and our integrations to AWS work to the best advantage of our business process enables us to be more competitive; it enables us to get quicker to market products and services that ultimately benefit the consumer,” James said.

In your time at Toyota Financial Services, what have you seen as some of the greatest changes?

We’re moving towards being a mobility services company. No longer is it possible to have the $8 million and 18-month projects. We need to deliver value every month. We’ve moved towards an agile factory-based solutioning system where we’re working with the business hand in hand every day to solve their problems.

Is there a way that technology helps you to overcome challenges?

Technology helps us in several different ways. One is in managing our data: moving away from what we affectionately call our spaghetti of data integration into singular common repositories, which we’ve actually built in partnership with IBM.

Our Enterprise Digital Transformation Platform (EDTP) allows for accelerated development and deployment of a wide range of agile services and digital transformation solutions that scale. The platform is based on a modern 4-tier evolutionary event-driven architectural style, including AWS Virtual Private Cloud (VPC), containers, microservices, events, streaming, and sync and async processing. 
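The sync-and-async processing style described above can be sketched in a few lines. This is a minimal, self-contained illustration of asynchronous event consumption, not the EDTP itself; the event names and payloads are hypothetical, and in the real platform the consumer would be a containerized microservice reading from a managed event stream rather than an in-process queue.

```python
import asyncio
import json

async def handle_events(queue: asyncio.Queue, results: list) -> None:
    """Asynchronously consume events from an in-process stream.

    Stands in for a containerized microservice subscribed to an
    event stream; event types here are illustrative only.
    """
    while True:
        event = await queue.get()
        if event is None:  # sentinel: the stream is closed
            queue.task_done()
            break
        results.append({"type": event["type"], "status": "processed"})
        queue.task_done()

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    consumer = asyncio.create_task(handle_events(queue, results))
    # Producer: emit two hypothetical business events, then close the stream.
    for event in ({"type": "loan.application.received"},
                  {"type": "payment.posted"}):
        await queue.put(event)
    await queue.put(None)
    await consumer
    return results

if __name__ == "__main__":
    print(json.dumps(asyncio.run(main()), indent=2))
```

The producer and consumer are decoupled by the queue, which is the essential property of the event-driven style: either side can scale or fail independently, and the same pattern supports both synchronous request/reply and asynchronous fire-and-forget flows.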

The mainframe delivers mission-critical applications and data with strong performance, reliability and security, so we need to unlock the mainframe data and resources by bringing them into the EDTP event-driven API ecosystem. The platform exposes mainframe resources and data (CICS, IMS and DB2) as RESTful APIs using IBM z/OS Connect Enterprise Edition, which are then aggregated into high-level business functions that deliver greater value than the participating services.
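The aggregation pattern described above can be sketched as follows. The two lower-level functions stand in for RESTful endpoints that z/OS Connect might expose over CICS, IMS, or Db2; the endpoint names, fields, and values are hypothetical, and a real implementation would make authenticated HTTPS calls rather than local function calls.

```python
from typing import Dict

def get_account(account_id: str) -> Dict:
    """Stand-in for GET /accounts/{id}, e.g. backed by a CICS transaction."""
    return {"accountId": account_id, "balance": 1250.00}

def get_payment_history(account_id: str) -> Dict:
    """Stand-in for GET /accounts/{id}/payments, e.g. backed by Db2."""
    return {"accountId": account_id, "payments": [200.00, 200.00]}

def account_summary(account_id: str) -> Dict:
    """Aggregate fine-grained mainframe APIs into one business-level function.

    The composed result delivers more value than either participating
    service alone, which is the point of the aggregation layer.
    """
    account = get_account(account_id)
    history = get_payment_history(account_id)
    return {
        "accountId": account_id,
        "balance": account["balance"],
        "totalPaid": sum(history["payments"]),
    }

if __name__ == "__main__":
    print(account_summary("A-1001"))
```

Callers of `account_summary` never see the underlying mainframe subsystems; they consume one coherent business function, which is what lets the fine-grained services evolve independently behind the API layer.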

What are some of the characteristics of a strong partner?

Partnerships are absolutely required. Bringing technology at scale in agile is something that is not easy. And so, having partners that are aware and knowledgeable of doing that has been tremendously helpful for us to be able to move forward quickly. We create a culture of inclusiveness, right from the beginning. And we strive to make sure that we include everybody in all of our planning, in all of the execution activities and, frankly, what I find most important is when you’re successful and you go live, including everybody in the celebration as well. Working with IBM has improved our customer experience because we have truly solved customer experience problems together and have understood what the future of the customer experience will be.

Learn more about IBM Consulting services for AWS Cloud and mainframe application modernization with IBM and AWS.