
Mono2Micro AI speeds up app ‘refactoring’ before cloud move


The average ‘legacy’ application, written for an earlier operating system or hardware platform but still in use at many companies, can be huge: think millions of lines of archaic code, written and evolved by teams who may have long since moved on.

More than 80 percent of these old-but-necessary apps still run on physical machines built long ago, often with scarce documentation. Yet they remain critical to the business: they are still in use, or must be kept for records compliance.

To help the developers who update these applications, our team has created Mono2Micro (monolith-to-microservice), an AI assistant that modernizes legacy applications to help move them to the cloud as microservices. Our tool simplifies and speeds up the often error-prone “application refactoring” process: partitioning a legacy, monolithic application while preserving its original semantics.

Mono2Micro in action

The AI relies on temporo-spatial machine learning, program analysis and distributed system design to help enterprise developers assess what can be refactored from their legacy applications. To do so, Mono2Micro looks at several factors, such as how the code’s implementation reflects an intended business domain function and how different components of the application interact with each other under specific business use cases. It also analyzes which observable code and object-state dependencies the developers need to handle.
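Mono2Micro’s own analysis engines aren’t shown in this post, but the static side of this idea can be illustrated with a small sketch. Assuming Python source purely for simplicity (Mono2Micro targets Java monoliths), the hypothetical `class_dependencies` helper below uses the standard `ast` module to record which classes mention which others by name, the kind of code-level dependency signal the paragraph describes:

```python
import ast
from collections import defaultdict

# A toy "monolith" with three classes; Checkout depends on Order and Catalog.
SOURCE = """
class Catalog:
    def price(self, item):
        return {"book": 10}[item]

class Order:
    def __init__(self):
        self.items = ["book"]
    def total(self, catalog):
        return sum(catalog.price(i) for i in self.items)

class Checkout:
    def run(self):
        return Order().total(Catalog())
"""

def class_dependencies(source):
    """Map each class to the other classes its body references by name."""
    tree = ast.parse(source)
    classes = {n.name for n in ast.walk(tree) if isinstance(n, ast.ClassDef)}
    deps = defaultdict(set)
    for cls in ast.walk(tree):
        if not isinstance(cls, ast.ClassDef):
            continue
        for node in ast.walk(cls):
            # A Name node matching a known class is a static dependency.
            if isinstance(node, ast.Name) and node.id in classes and node.id != cls.name:
                deps[cls.name].add(node.id)
    return dict(deps)
```

A real refactoring tool resolves far more than name references (inheritance, shared database tables, object state passed across calls), but even this simple graph reveals which classes cannot be separated without introducing a remote call.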

In other words, Mono2Micro grounds the developers’ domain-driven view of services in what is actually happening at the code level, with data-driven decision support on what may be ‘refactor-able’ given their business, time and resource constraints.

And it works. Take one of our earliest Fortune 20 clients. The company had a large application that had been updated and maintained for about a decade, and had been looking into refactoring it. Working with Mono2Micro for a few weeks, the client discovered previously unknown dependencies and anti-patterns that were making the services they wanted to refactor difficult to extract. The company had not imagined such dependencies existed in its application, because they were never intended.

But Mono2Micro showed runtime and static evidence in the code pointing to changes that had been made to the application against its design intent. Our AI helped the client focus on the more business-critical parts of the application and identify legacy components and parallel processes that should be retired entirely. It also helped the company learn how its application was functioning, how it might factor certain portions out as microservices, and how to act on these recommendations using Mono2Micro’s code generation.

The power of AI to modernize applications

And Mono2Micro scales, too. When run across applications with more than ten thousand classes, we’ve helped clients rapidly arrive at a comprehensible outcome.

The tool can examine a large monolithic application and recommend candidate microservices by breaking the monolith’s code down into service partitions, without the need for training. The AI arrives at its recommendations by looking at both the temporal and the static aspects of an application. Dynamically, Mono2Micro ‘understands’ how segments of code are related to each other, and in what sequence, under different business domain functions. And statically, the AI considers all of the programming dependencies that would make refactoring difficult.
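As a toy illustration of the temporal side, here is a minimal sketch (not Mono2Micro’s actual algorithm, with invented class and use-case names) that groups classes into candidate partitions when they repeatedly execute together across business use cases:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical runtime traces: for each business use case, the classes
# observed executing together. Real traces would be vastly larger.
TRACES = {
    "place_order":    ["Cart", "Order", "Payment", "Inventory"],
    "browse_catalog": ["Catalog", "Search", "Inventory"],
    "refund":         ["Order", "Payment", "Ledger"],
}

def cooccurrence(traces):
    """Count, for each pair of classes, how many use cases they share."""
    counts = defaultdict(int)
    for classes in traces.values():
        for a, b in combinations(sorted(set(classes)), 2):
            counts[(a, b)] += 1
    return counts

def partition(traces, threshold=2):
    """Union-find clustering: classes co-occurring in at least `threshold`
    use cases land in the same candidate microservice partition."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)
    for classes in traces.values():
        for c in classes:
            find(c)  # register every class, even singletons
    for (a, b), n in cooccurrence(traces).items():
        if n >= threshold:
            union(a, b)
    groups = defaultdict(set)
    for c in parent:
        groups[find(c)].add(c)
    return sorted(map(sorted, groups.values()))
```

With these traces, `Order` and `Payment` appear together in two use cases and end up in one partition, while classes seen in only one use case remain singletons. Real tools weigh far richer signals (call order, data dependencies, static coupling), but the core idea of turning co-execution evidence into partition candidates is the same.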

This process closely mimics what a team of architects and developers would do manually today, but with far greater speed and consistency. As a result, Mono2Micro’s recommendations are understandable to application architects and developers, who can make informed decisions and changes to refactor some or all of their monolithic application’s services.

Because of the power of AI, Mono2Micro can consider a range of observed application data far beyond human capacity: temporal relations among application classes expressed in tens or even hundreds of gigabytes of runtime traces, correlated to their business domain functions and data interdependencies. The outcome is a recommendation that developers can fully explore and explain through multi-faceted evidence and correlations, over a galaxy of classes and relations, at the speed of digital processing.

We’ll continue to improve Mono2Micro together with academic and client partners. Later enhancements may include streamlined testing and verification, and additional application transformation techniques for enhanced microservices.

To learn more, visit Mono2Micro.

Saurabh Sinha, Research Staff Member, IBM Research, also contributed to this article.

We are also grateful to the IBM Research team, including Ruchir Puri, Nicholas Fuller, Liana Lin and John Rofrano, for their input on Mono2Micro. And we thank Debasish Banerjee and the IBM Cloud & Cognitive Software development team for partnering with us to productize IBM Mono2Micro.

 


Research Staff Member - Hybrid Cloud & AI, IBM Research

Jin Xiao
Research Staff Member, change automation, live event analytics, ChatOps, app modernization and cloud native, IBM Research

Maja Vukovic
Distinguished Research Staff Member, IBM Research
