Tackling technical risk in database migrations

Technical risk is one of the most common migration worries organizations face, and the potential technical risks vary depending on the type of migration.

So far, this series has covered custom code application migrations as well as migrating ISV applications (see part 1 and part 2). Now I want to discuss technical risk for database migrations, with input from my IBM Systems Lab Services Migration Factory colleagues Rick Murphy and Mark Short.

Migrating databases

To get started, there are two guiding principles that reduce risk:

  1. Never touch the source database: We never modify the source database during a migration. During the go-live weekend, we shut down the source applications and start the cut-over to the new target. If a problem arises, we simply stop the migration and restart the production database on the source system. This gives the client a solid feeling that a back-out plan will work if needed.
  2. Isolate the migration from the client’s day-to-day operations: We use copies of the production instances, called staging servers, for most of our test migrations. This isolates the migration from the client’s day-to-day operations—so no outages are required on the client’s production environment and there is no chance of the migration process bringing down the client’s business.

Most database migrations are relatively low-risk projects, but clients will always worry about losing or corrupting valuable data during the migration process. We will talk a bit later about how we guard against that happening, but first we want to make some general comments about database migrations.

We’ll be talking about cross-platform migrations in this blog post. That’s where the hardware vendor and operating system change. It is also possible to change the database management system during cross-platform migrations. For example:

  Source Database | Source Platform | Target Database | Target Platform | Comments
  Oracle          | SPARC/Solaris   | Oracle          | Power/AIX       | Source and target platforms change; the database remains the same
  Oracle          | x86/Linux       | DB2             | Power/AIX       | Source and target platforms change, as does the database


Database upgrades can also be part of a cross-platform migration. That is a different exercise from the in-place upgrade you would do when the source and target platforms stay the same, even though some clients call that process a migration too (for example, moving from Oracle 10g to Oracle 12c).

We use automated scripts to assist with the collection of key database metrics, along with a variety of IBM and vendor-supplied tools to assist with the migration itself. The XenoBridge database migration tool is an IBM asset and our preferred tool for most migrations, except those where the database is too large to be migrated within the allotted outage window. In those situations, we have several replication tools that can be used.
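The assessment scripts themselves are internal, but a minimal sketch of the kind of per-table metric collection they perform (illustrated against SQLite so it runs anywhere; the function and table names here are hypothetical, not the actual scripts) might look like:

```python
import sqlite3

def collect_metrics(conn):
    """Collect a basic sizing metric (row count) for each user table:
    a hypothetical stand-in for pre-migration assessment scripts."""
    cur = conn.cursor()
    cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    tables = [row[0] for row in cur.fetchall()]
    metrics = {}
    for table in tables:
        # Table names come from the catalog, not user input
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        metrics[table] = cur.fetchone()[0]
    return metrics

# Example: a toy "production instance" with two tables
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL);
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO orders (total) VALUES (10.0), (20.0), (30.0);
    INSERT INTO customers (name) VALUES ('a'), ('b');
""")
print(collect_metrics(conn))  # {'orders': 3, 'customers': 2}
```

In a real engagement the equivalent queries would run against the production database's own catalog views, and would gather versions, sizes, and feature usage as well as row counts.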

XenoBridge has many built-in features to ensure data integrity during the migration and provides a validation report when the migration is complete, showing that all of the data from the source database has been successfully migrated to the new target database.
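XenoBridge's validation report is proprietary, but the underlying idea of comparing source against target can be sketched with row counts plus an order-independent content hash per table. Everything below, including the function names, is a hypothetical illustration using SQLite, not the tool's actual checks:

```python
import hashlib
import sqlite3

def table_fingerprint(conn, table):
    """Row count plus an order-independent content hash for one table.
    The table name comes from the catalog, not user input."""
    count, digest = 0, 0
    for row in conn.execute(f"SELECT * FROM {table}"):
        count += 1
        # XOR of per-row hashes: insensitive to row order
        digest ^= int.from_bytes(
            hashlib.sha256(repr(row).encode()).digest()[:8], "big")
    return count, digest

def validation_report(source, target, tables):
    """Per-table pass/fail comparing source and target fingerprints."""
    return {t: table_fingerprint(source, t) == table_fingerprint(target, t)
            for t in tables}

# Toy source and target: 'orders' matches, target is missing an 'audit' row
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.executescript("""
        CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL);
        CREATE TABLE audit (id INTEGER PRIMARY KEY, msg TEXT);
        INSERT INTO orders (id, total) VALUES (1, 10.0), (2, 20.0);
    """)
source.execute("INSERT INTO audit (id, msg) VALUES (1, 'migrated')")
print(validation_report(source, target, ["orders", "audit"]))
# {'orders': True, 'audit': False}
```

A failed table in a report like this is exactly the kind of signal that would stop a mock migration for investigation long before the production cutover.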

XenoBridge can also move from an older database version to a newer one as part of the migration, for example migrating directly from Oracle 11g on the source to Oracle 12c on the target. This ability to migrate and upgrade in one step is one of XenoBridge's most attractive features. Because upgrades can affect application code, some up-front work is required to determine whether any code changes are needed.

Key considerations

There are a number of factors we weigh when planning a database migration; here are a few of the key ones:

  • Source database vendor and version
  • Target database vendor and version
  • Will upgrades be required?
  • Size of the production instances (MB, GB, TB)
  • Available downtime window
  • Location of the source and target systems and connection speed between them
  • Number of databases to be migrated
  • Availability of a test plan
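Several of these metrics interact. In particular, whether a straight copy fits the downtime window is, to a first approximation, database size divided by effective link throughput. A back-of-the-envelope estimate (the size, link speed, and efficiency factor here are illustrative assumptions, not measured values):

```python
def transfer_hours(size_gb, link_gbps, efficiency=0.6):
    """Rough wall-clock time to move size_gb over a link_gbps connection,
    derated by an assumed protocol/contention efficiency factor."""
    effective_gb_per_sec = (link_gbps / 8) * efficiency  # gigabits -> gigabytes
    return size_gb / effective_gb_per_sec / 3600

# A 4 TB database over a 1 Gbps link at 60% efficiency:
print(f"{transfer_hours(4000, 1):.1f} hours")  # 14.8 hours
```

If an estimate like this does not fit the agreed outage window, that is the signal to reach for the replication tools mentioned above rather than a straight copy during cutover.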

Testing with mock migrations is still key

Database migrations use the same “mock/test” migration process described in my previous blog for ISV migrations. All databases will go through multiple mock migrations, with ample test time in between to thoroughly test the database prior to going into production on the new target systems. It’s unlikely, after the mock migrations, that something will cause the production migration to have problems, but if we suspect a problem, we simply stop the migration process without harming the original source database.

The key points related to database migrations

  • All database migrations, regardless of size, utilize the same basic process.
  • We use automated scripts and tools that reduce risk and cost.
  • Mock migrations will ensure a database has been thoroughly tested before cutover.
  • The migration can be stopped and started over if a problem is suspected without fear of corrupting the original source.

As always, working with consultants who have years of experience doing these migrations is the best way to reduce risk. Reach out to IBM Systems Lab Services Migration Factory if we can help with your next project.

What’s next?

In upcoming blog posts, I’ll talk about some of the other common worries we see—worries about how much a migration will cost and how long it will take, as well as how it will affect the company’s operations and employee skills. Please stay tuned!

Thanks to Rick Murphy and Mark Short for their contributions to this article.

Senior Technical Solutions Manager, IBM System Lab Services Migration Factory
