Can DevOps be cognitive? Meet Application Delivery Intelligence


A core value of DevOps is for teams to continuously improve how they develop and deliver software. This is done by engaging the entire team in retrospectives at the end of each sprint so they can improve how they work.

At IBM, we have a history of applying analytics, and now cognitive computing, to improve the way we do things. Recently, one of my development teams wondered whether analytics and cognitive computing could also improve the DevOps process itself. Out of that question came the idea for a new product, Application Delivery Intelligence (ADI).

Throughout the DevOps lifecycle we produce a lot of data, including code, designs, test cases, test execution records, and operational information about applications. Application Delivery Intelligence is about analyzing this information to guide teams in working more effectively. Our intent is to incrementally deliver more capabilities, allowing teams to work smarter. While our focus is centered on DevOps, much of this intelligence will also apply to companies using other approaches to development and delivery.

One of the first focus areas for the Application Delivery Intelligence team has been what we call Test Optimization. As companies adopt a DevOps style of development, they need to shift testing left, running tests earlier and more frequently in the delivery cycle.

A challenge that teams are facing with this shift is the cost of doing more frequent testing. Common questions are:

  • How can you control the labor and MIPS cost of frequent regression testing by focusing on the most relevant tests?
  • How do you know whether you have quality exposure in the test effort?
  • Are you trending in the right direction?

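To make the "most relevant tests" idea concrete, here is a minimal sketch of change-based test selection. This is purely illustrative and not ADI's actual algorithm; the coverage data and names below are hypothetical. The idea is to keep a map of which source files each test exercises, then run only the tests whose covered files intersect the current change set.

```python
# Illustrative sketch of change-based test selection (not ADI's actual
# algorithm). Given a coverage map recording which source files each
# test exercises, select only the tests affected by a set of changes.

def select_tests(coverage_map, changed_files):
    """Return the tests whose covered files intersect the change set."""
    changed = set(changed_files)
    return sorted(
        test for test, files in coverage_map.items()
        if changed & set(files)
    )

# Hypothetical coverage data: test name -> source files it exercises.
coverage_map = {
    "test_billing": ["billing.py", "db.py"],
    "test_reports": ["reports.py"],
    "test_login":   ["auth.py", "db.py"],
}

# A change to db.py selects only the two tests that touch it,
# instead of the full regression suite.
print(select_tests(coverage_map, ["db.py"]))
```

Even a simple mapping like this shows how test relevance can cut regression cost; a production system would refine it with finer-grained coverage, test history, and failure-prediction analytics.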
Learn more about improving code quality and how ADI helps make your test optimization smarter.

We think the sky is the limit for ADI. What problems can we solve by applying analytics and the power of Watson to optimize how we develop software on a mainframe?

Tell us what big problems YOU think we should focus on solving: post a comment below.
