Reinventing DevOps: Continuous delivery and the database


From time to time, we invite industry thought leaders to share their opinions and insights on current technology trends to the IBM Systems IT Infrastructure blog. The opinions in these blogs are their own, and do not necessarily reflect the views of IBM.

Modern software development teams have adopted a continuous delivery approach based upon DevOps and agile development techniques. The small, frequent code changes that result from such an approach can deliver significant benefits: reduced lead time for changes, a lower failure rate, and a shorter mean time to recovery when errors occur. It is no wonder that DevOps and continuous delivery are popular. Today’s development organizations migrate changes into production more frequently than ever before.

So far, this all sounds like a good thing, right? And it can be. However, when you add database change into the mix, things get more challenging. The 2017 State of Database DevOps report found that database development lags behind application development in terms of continuous delivery.

For many DevOps shops, the “Dev” tends to overshadow the “Ops” portion (where Ops includes database operations and the DBA). Control of the application delivery process is driven by development, sometimes without the traditional control and oversight of the DBA group. Without the DBA’s expertise, the delivery and integration process can fall apart, because database change cannot be treated exactly like application change.

Application delivery usually involves copying code and executables from one environment to another. By contrast, each environment contains an entire database configuration, and only the changes get migrated. Each environment might differ, however slightly, and differences can drift in over time due to things like optimization changes, emergency fixes and so on. Furthermore, consider Db2 packages: their access paths are determined at bind time in production and are not (and cannot be) copied from the development environment. Without an experienced DBA to assist, guide and, indeed, administer the process, problems will arise.
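The drift problem can be made concrete. Db2 keeps its object definitions in catalog tables; as a minimal, vendor-neutral sketch of how differences between two environments might be detected, here is a Python comparison using SQLite's `sqlite_master` catalog. The table and index names are invented for illustration, and a real drift check would of course query the catalog of the DBMS actually in use.

```python
import sqlite3

def schema_snapshot(conn):
    """Return {object_name: DDL text} from SQLite's built-in catalog."""
    rows = conn.execute(
        "SELECT name, sql FROM sqlite_master WHERE sql IS NOT NULL"
    ).fetchall()
    return dict(rows)

def schema_drift(dev, prod):
    """Report objects whose definitions differ (or exist in only one place)."""
    dev_s, prod_s = schema_snapshot(dev), schema_snapshot(prod)
    return {
        name: (dev_s.get(name), prod_s.get(name))
        for name in set(dev_s) | set(prod_s)
        if dev_s.get(name) != prod_s.get(name)
    }

# Two "environments" that have quietly drifted apart:
dev = sqlite3.connect(":memory:")
prod = sqlite3.connect(":memory:")
dev.execute("CREATE TABLE customer (id INTEGER, name TEXT, email TEXT)")
prod.execute("CREATE TABLE customer (id INTEGER, name TEXT)")  # emergency change never back-ported
prod.execute("CREATE INDEX ix_cust ON customer (id)")          # production-only tuning index

for name in sorted(schema_drift(dev, prod)):
    print("drift detected:", name)
```

Running this reports drift for both the `customer` table and the production-only `ix_cust` index, exactly the kind of quiet divergence that makes "just copy it over" a non-option for databases.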

Sometimes developers are given free rein to make database changes in the test or development environment. This can encompass simple things like adding a table, column or index, or more complex changes that require an object to be dropped and re-created. Every DBA knows that dropping one database object can have a cascading effect, dropping other objects and revoking privileges along the way. But most developers are not aware of the wide impact of such changes.
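The exact cascade varies by DBMS (in Db2, dropping a table also drops dependent indexes, views and triggers and revokes privileges). As a small illustration of why this surprises developers, here is a sketch using Python's built-in SQLite, with invented object names: the index silently disappears with its table, while the dependent view is left behind in a broken state that only shows up at query time.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE INDEX ix_orders_amount ON orders (amount);
    CREATE VIEW big_orders AS SELECT * FROM orders WHERE amount > 100;
""")

conn.execute("DROP TABLE orders")

# The index vanished along with the table...
names = [r[0] for r in conn.execute("SELECT name FROM sqlite_master")]
print("index still exists?", "ix_orders_amount" in names)

# ...while the dependent view survives in the catalog but is broken:
try:
    conn.execute("SELECT * FROM big_orders")
except sqlite3.OperationalError as err:
    print("view broken:", err)
```

One `DROP` statement thus changed three objects, and nothing warned the developer at the time the drop was issued.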

In a test environment, large changes that drop and re-create structures may be tolerable because end users are not impacted, only other developers. But trying to do the same thing in production can cause a significant outage and perhaps even data loss if performed improperly.

When application changes rely on changes to database structures, the two should be tightly coupled. If the program code expects a new column in order to work properly, then you do not want the new code to be moved to production without the new column, and vice versa. This coupling should cover not only making the changes, but also backing them out when necessary.
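One common way to achieve this coupling is to ship the schema change and its backout as a paired migration that travels with the code release. The sketch below is a hypothetical, minimal version of that idea in Python over SQLite (column and table names are invented); note that even backing out a simple `ADD COLUMN` can require the rename/copy/drop dance described earlier, which is precisely where DBA expertise earns its keep.

```python
import sqlite3

# Hypothetical release: the new code reads customer.email, so the
# schema change ("up") and its backout ("down") ship together with it.
MIGRATION = {
    "up": [
        "ALTER TABLE customer ADD COLUMN email TEXT",
    ],
    # Portable backout: re-create the table without the column.
    "down": [
        "ALTER TABLE customer RENAME TO customer_old",
        "CREATE TABLE customer (id INTEGER, name TEXT)",
        "INSERT INTO customer SELECT id, name FROM customer_old",
        "DROP TABLE customer_old",
    ],
}

def apply(conn, direction):
    """Run the migration steps for a deploy ('up') or a backout ('down')."""
    for stmt in MIGRATION[direction]:
        conn.execute(stmt)
    conn.commit()

def columns(conn, table):
    """Column names in order, from SQLite's table_info pragma."""
    return [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT)")

apply(conn, "up")      # deploy: the column arrives with the new code
print(columns(conn, "customer"))

apply(conn, "down")    # backout: the schema reverts with the code
print(columns(conn, "customer"))
```

Because the "up" and "down" steps live in the same artifact as the release, promoting or rolling back the application automatically carries the matching database change with it.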

More focus is needed on coupling software/database change both at a technology level and at a personnel level. DBA-level expertise is needed on development teams to help guide and implement changes appropriately. Automated software solutions that combine and coordinate application code and database changes are also needed.

Learn more about how to digitally reinvent enterprise IT in this DevOps from APIs to z Systems for dummies book.

Get more pointers from Craig and other DevOps thought leaders in this 10 New Steps to DevOps Success guide.


President & Principal Consultant, Mullins Consulting, Inc.

