February 2, 2021 | Written by: Anup Kalia, Jin Xiao, and Maja Vukovic
Categorized: AI | Hybrid Cloud
The ‘legacy’ applications that many companies still rely on, written for earlier operating systems or hardware platforms, can be huge: think millions of lines of archaic code, written and evolved by teams that may have long since moved on.
More than 80 percent of these old-but-necessary apps still run on physical machines built long ago, often with scarce documentation. Yet they remain critical to the business: they are still in use, or must be kept for records compliance.
To help the developers who update these applications, our team has created Mono2Micro (monolith-to-microservice), an AI assistant that modernizes legacy applications to help move them to the cloud as microservices. Our tool simplifies and speeds up the often error-prone “application refactoring” process: partitioning legacy monolith applications while preserving their original semantics.
Mono2Micro in action
The AI relies on temporo-spatial machine learning, program analysis and distributed system design to help enterprise developers assess what can be refactored from their legacy applications. For that, Mono2Micro looks at different factors, such as how the code’s implementation reflects an intended business domain function and how different components of the application interact with each other under specific business use cases. It also analyzes what observable code and object state dependencies the developers need to handle.
In other words, Mono2Micro grounds the developers’ domain-driven view of services in what is actually happening at the code level, with data-driven decision support on what may be ‘refactor-able’ given their business, time and resource constraints.
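To make the idea concrete, here is a minimal sketch of how runtime traces grouped by business use case can surface which classes participate in which business functions. The trace data, class names and use-case names are hypothetical, invented for illustration; they are not Mono2Micro’s actual data model or output.

```python
from collections import defaultdict

# Hypothetical runtime traces: (business use case, classes touched in order).
# All names here are illustrative, not from a real application.
traces = [
    ("CreateOrder", ["OrderController", "OrderService", "InventoryDAO"]),
    ("CreateOrder", ["OrderController", "OrderService", "PaymentGateway"]),
    ("ViewCatalog", ["CatalogController", "CatalogService", "InventoryDAO"]),
]

def class_use_case_matrix(traces):
    """Map each class to the set of business use cases it appears under."""
    matrix = defaultdict(set)
    for use_case, classes in traces:
        for cls in classes:
            matrix[cls].add(use_case)
    return dict(matrix)

usage = class_use_case_matrix(traces)
# A class that appears under only one use case is a natural candidate for
# that use case's partition; a class shared across use cases (InventoryDAO
# here) flags a cross-cutting dependency that refactoring must address.
```

In this toy example, `usage["InventoryDAO"]` contains both use cases, which is exactly the kind of code-level evidence that can contradict a team’s purely domain-driven picture of where a component belongs.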
And it works. Take one of our earliest Fortune 20 clients. The company had a large application that had been updated and maintained for about a decade, and it had been looking into refactoring it. After working with Mono2Micro for a few weeks, the client discovered previously unknown dependencies and anti-patterns in the application that were making the services it wanted to refactor difficult to carve out. The company had not imagined such dependencies existed in its application, as they were never intended.
But Mono2Micro showed runtime and static evidence in the code pointing to changes that had been made to the application against its design intent. Our AI helped the client focus on the more business-critical parts of the application and identify legacy components and parallel processes that should be retired entirely. It also helped the company learn how its application was functioning, how it might factor certain portions out as microservices, and how to act on these recommendations using Mono2Micro’s code generation.
The power of AI to modernize applications
And Mono2Micro can scale, too. When run across applications with more than ten thousand classes, we’ve helped clients rapidly arrive at a comprehensible outcome.
The tool is able to examine a large monolith application and recommend candidate microservices by breaking down the monolith code into service partitions, without the need for training. The AI arrives at its recommendations by looking at both the temporal and the static aspects of an application. Dynamically, Mono2Micro ‘understands’ how segments of code relate to each other, and in what sequence, under different business domain functions. And statically, the AI considers all of the programming dependencies that would make refactoring difficult.
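The combination of the two views can be sketched as follows: group classes whose runtime use-case footprints are (near-)identical into candidate partitions, then check static call edges against those partitions. This is a simplified stand-in using greedy Jaccard-similarity clustering, with invented class names and call edges; Mono2Micro’s actual partitioning algorithm is more sophisticated.

```python
# Illustrative inputs (not Mono2Micro's actual data model): the business
# use cases each class participates in at runtime, plus static call edges.
runtime = {
    "OrderController":   {"CreateOrder"},
    "OrderService":      {"CreateOrder"},
    "CatalogController": {"ViewCatalog"},
    "CatalogService":    {"ViewCatalog"},
    "InventoryDAO":      {"CreateOrder", "ViewCatalog"},
}
static_calls = {("OrderService", "InventoryDAO"),
                ("CatalogService", "InventoryDAO")}

def jaccard(a, b):
    """Similarity of two use-case sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def partition(runtime, threshold=0.99):
    """Greedy clustering: classes with (near-)identical use-case sets
    land in the same candidate microservice partition."""
    clusters = []
    for cls, cases in runtime.items():
        for cluster in clusters:
            if jaccard(runtime[cluster[0]], cases) >= threshold:
                cluster.append(cls)
                break
        else:
            clusters.append([cls])
    return clusters

clusters = partition(runtime)
# Static call edges that cross cluster boundaries mark the inter-service
# calls a refactoring would have to turn into remote APIs.
cross = [e for e in static_calls
         if not any(set(e) <= set(c) for c in clusters)]
```

Here the order classes and the catalog classes fall into separate partitions, while the shared `InventoryDAO` ends up alone, and both static calls to it surface as cross-partition edges, mimicking in miniature how temporal grouping and static dependency analysis together expose what refactoring would actually cost.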
This process closely mimics what a team of architects and developers would do manually today, but with far greater speed and consistency. As such, Mono2Micro’s recommendations are understandable to the application architects and developers, who can make informed decisions and changes to their monolith application to refactor some or all of its services.
Because of the power of AI, Mono2Micro can consider a wide range of observed application data beyond the limitations of human capacity. This includes temporal relations among application classes expressed in tens or even hundreds of GB of runtime traces, correlating such relations to their business domain functions and data interdependencies. The outcome is a recommendation that developers can fully explore and explain through multi-faceted bits of evidence and correlations, over a galaxy of classes and relations, and at the speed of digital processing.
We’ll continue to improve Mono2Micro together with academic and client partners. Later enhancements may include streamlined testing and verification, and additional application transformation techniques for enhanced microservices.
To learn more, visit Mono2Micro.
Saurabh Sinha, Research Staff Member, IBM Research, also contributed to this article.
We are also grateful to the IBM Research team, including Ruchir Puri, Nicholas Fuller, Liana Lin and John Rofrano, for their input on Mono2Micro. And we thank Debasish Banerjee and the IBM Cloud & Cognitive Software development team in partnering to productize IBM Mono2Micro.