Tackling technical risk in custom code application migration

It’s not exactly newsworthy that most companies don’t like migrations. They’re viewed as disruptive, risky and costly, and IT teams usually have a laundry list of worries about what might go wrong.

Technical risk is probably the most common migration worry, and the potential technical issues that can arise vary depending on the type of migration. In this post, I’ll discuss migrating custom code applications, and in future posts I’ll cover technical concerns around migrating Independent Software Vendor (ISV) applications and databases.

Custom code applications, young and old

For the purposes of this blog post, I want to start by distinguishing between older custom code applications, often referred to as “legacy,” and young custom code (under 10 years old).

These older environments still play a key role in organizations around the world today. They are usually large, complex applications written 20-plus years ago that continue to do the job they were designed to do. Organizations can still find ways to support these established technologies, but when they decide to make a change, migration usually isn’t the answer. Instead, the answer should be modernization, reengineering, redesign or replacement with an off-the-shelf ISV solution.

Migrations should always be a like-to-like, compatible event. No application changes, no redesign, no architecture changes. If you want, or need, to make those kinds of changes, you should do it before or after the migration, not during it.

When organizations embark on a project to replace one of their older applications, they don’t want to do a like-to-like migration and, in many instances, they can’t for technical reasons. This is an opportunity to modernize, update and replace decades-old technology with new platforms and solutions.

In IBM Systems Lab Services Migration Factory, we see many situations like this every year. Older custom code migrations can be done, but they present so many challenges that other alternatives are usually developed during the assessment process.

Young custom code: Developed for portability

The good news about young custom code migrations is that the technical worries and risks are much lower than the ones we face with older custom code. Why is that?

  • Hardware-agnostic applications are often written in interpreted or virtual machine-based languages like Java, PHP, Python, Ruby, Node.js and Perl, and should run “as is” on the new platform. For example, compiled Java byte code is platform-independent as long as the application sticks to the Java specification (no API or system calls outside of it). There are multiple Java virtual machines (JVMs) for x86 platforms, and two JVMs are available for Linux on IBM Power Systems: one comes with the IBM JDK, the other with OpenJDK. The two have different core JVM technologies, but the good news is that both pass the Java Compatibility Kit (JCK).
  • Compiled languages like C/C++, COBOL and Fortran will usually only need to be recompiled on the new target and tested. The stumbling blocks for these applications are also well known and include things like dependent libraries, the inclusion of platform-specific compiler flags, assembler code, system calls to the source platform hardware and unique device drivers.
  • Shells like C Shell (csh), Korn Shell (ksh) and Perl Shell (psh) are available on virtually every platform, and scripts written for them are very portable. The major differences between platforms are documented, and as long as users have not hard-coded platform-specific information into a shell script, scripts are easily migrated. The good news here is that we have tools that can analyze these scripts and point out where issues are likely to arise.
  • For many years, one of the biggest technical worries was migrating an application between architectures with different endianness (byte order). Today, the vast majority of applications don’t have this issue. In fact, it’s estimated that less than 5 percent of Linux applications written in C/C++, from any platform, will require source code changes, and again, we have tools to analyze your source code and show you where changes will be required.

These young custom code applications as well as current-day applications are developed with portability in mind because it’s not in anyone’s best interest to develop solutions that are limited to a specific technology or platform. Using open source compilers, databases and other open source components to develop these applications goes a long way in dispelling the technical worries about migrating them to a new platform.

Just to give you a few examples of clients who moved to IBM Power Systems from x86: Livemon, a systems monitoring client in France, took its application, which was highly optimized for x86, recompiled it on Linux on IBM Power Systems, and got a 2X performance increase right away, with no source code changes required. And recently, a major North American retailer moved a MongoDB database from an x86 platform to a Linux on IBM Power Systems solution and got a 12X performance increase. The move took one business day.

If you’re looking for help with a migration to IBM Power Systems, don’t hesitate to contact IBM Systems Lab Services.
