March 30, 2021 By Julianna Delua 4 min read

As an increasing number of organizations drive AI-powered digital transformation, several key trends in operationalizing AI are emerging. Growth leaders are separating themselves from growth laggards by using AI and machine learning (ML) in modern application development. Below are some statistics provided by 451 Research:

  • Leaders invest in models for digital transformation: More than half of digital transformation leaders have adopted ML, compared with less than 25 percent of laggards. Furthermore, 62 percent of enterprises are developing their own models.
  • Prevalence of DevOps increases the demand for automation: 94 percent of enterprise companies have now adopted DevOps. Models are becoming integral to the development of enterprise apps—requiring continuous, synchronized and automated development and deployment lifecycles.
  • Data science and DevOps/app teams collaborate more: In 33 percent of enterprises, the data science/data analytics team is the primary DevOps stakeholder.

An increasing number of application developers are becoming interested in data science and AI, and many have already learned the fundamentals of data science. Business executives are keen on embedding prediction into business, optimizing operations and using automation to augment human capital while enabling their employees to do more with less. However, deploying models into operational systems is a well-known barrier to success. One investment area of interest for tackling operations is to align the cadences of getting a model in production (ModelOps) and an app in production (DevOps).

Intelligent automation can play a pivotal role in aligning model and app cadences. Since we established that AutoAI helps beginners and expert data scientists streamline model development, I’d like to discuss how AutoAI increases yields for model and app investments and orchestrates ModelOps with DevOps.

Automated AI lifecycle helps your models produce better outcomes with efficiency and repeatability

AI development has a full lifecycle that starts at ideation and ends with the monitoring of models in production. Lifecycle stages include data exploration and preparation, model development and deployment, and optimization and monitoring with a feedback loop. Data scientists, business analysts, data engineers and subject matter experts are key players in this lifecycle. What’s new is that DevOps teams are playing a larger role. In particular, growth leaders are now feeding the models produced from this lifecycle to DevOps to drive greater results at scale.

AutoAI was designed to reduce the more tedious, repetitive, and time-consuming aspects of data science and automate them so that data scientists can concentrate on the parts of the lifecycle where they can make the most innovative contributions. AutoAI also helps those who are just starting out with data science to build models quickly and easily. Beginners can also examine how the models are built and the pipelines are generated. Together, businesses can demonstrate better outcomes with fine-tuned prediction, optimization and automation.

Continuously optimized models are better suited for collaborating with DevOps

In the application lifecycle, an app is born from an idea. After that, the development and design teams work with stakeholders to characterize a day in the life of an end user and determine how to help them solve problems and achieve better results. Once this vision is in place, the app moves into analysis, design and prototyping as the development team explores how it should work. After that come coding and unit testing, user and system testing, publishing and deployment. There will be periodic updates, adjustments for changes in the business and opportunities to address user feedback. AI and ML models can take dynamic interactions into account and present targeted offers that are tailored to each user.

Automation is already making an impact on the application lifecycle through continuous integration, low-code and no-code app development and more. Seasoned application developers can focus on designing innovative solutions without the tedium of hand-coding or struggling to integrate an app with operations, and beginners can design and prototype quickly—without a lot of coding experience. What’s needed is to find a way to integrate AI models into these automated, continuous-integration streams without disrupting them.

Synchronizing ModelOps and DevOps opens up new opportunities

Undoubtedly there is a strong business case for investing to align models and apps. Data scientists use ModelOps. Developers use DevOps. How can the two be synchronized?

ModelOps is where data science meets production IT, and where business value is created. Establishing ModelOps can make the injection of models into apps a better-tuned, repeatable and successful process. Models have traditionally been deployed in a one-off fashion, and data scientists and data engineers often lack the skills to operationalize models. Application integration, model monitoring and tuning, and workflow automation can become afterthoughts. This is why it makes sense to bring model and app development together on a data and AI platform where collective assets and intelligence can be harnessed.

Automation brings data, models and apps together while unleashing data and app talent

Powered by AutoAI, IBM Cloud Pak for Data is ideal for implementing and integrating ModelOps and DevOps. It enables models to be pushed from a data science team to the DevOps team in a regular deployment and update cycle, aligned with continuous integration and deployment to suit business needs. Powered by Watson Studio, Watson Machine Learning and Watson OpenScale, and open by design, Cloud Pak for Data integrates with cloud-native apps and allows you to build and scale AI while promoting explainability.

AutoAI facilitates collaboration between the data science team and DevOps and app developers, and it reduces the complexity of deploying and optimizing models in production. If you’re in DevOps and app development, you can take the REST API endpoint from Watson Machine Learning and deploy the model while getting increased visibility into usage statistics, model status, and KPIs. Developers can set up the API connection to send more information for scoring and prediction into apps.
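As an illustration, here is a minimal sketch of how an app developer might call such a scoring endpoint. The payload shape follows the Watson Machine Learning v4 predictions API (`input_data` with `fields` and `values`); the endpoint URL, bearer token and feature names are placeholders you would replace with your own deployment's details.

```python
import json

def build_scoring_request(fields, rows):
    """Build the JSON payload for a Watson Machine Learning
    v4 scoring (predictions) call."""
    return {"input_data": [{"fields": fields, "values": rows}]}

# Hypothetical deployment details -- substitute your own.
ENDPOINT = "https://<cluster>/ml/v4/deployments/<deployment_id>/predictions"
TOKEN = "<bearer-token>"

# One scoring row with hypothetical feature names.
payload = build_scoring_request(
    ["age", "income", "tenure_months"],
    [[42, 58000, 17]],
)

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}

# The actual call (requires the `requests` package and live credentials):
# import requests
# resp = requests.post(ENDPOINT, headers=headers, json=payload,
#                      params={"version": "2021-03-30"})
# predictions = resp.json()["predictions"]

print(json.dumps(payload))
```

Because the endpoint is just REST, the same pattern works from any language the DevOps team already uses; the model stays behind a stable URL while the data science team redeploys improved versions behind it.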

More ways to learn about AI and put AutoAI to work

This is just one example of how your enterprise can employ AutoAI to accelerate growth with data science and AI. Discover additional ways in our “10 ways to use AutoAI” eBook. You can also get many tips by viewing on-demand webinars in our 3-part Winning with AI series.

Be sure to see our webinar Automated AI lifecycle management and ModelOps for your cloud native applications. It focuses on syncing ModelOps and DevOps and features presenters Matt Aslett of 451 Research and Ruchir Puri, chief scientist of IBM Research.

You can learn more about ModelOps for operationalizing AI and explainable AI for managing and monitoring models. Or, watch the on-demand data science and AI webinar series to dig deeper. Don’t miss our newsletter with two complimentary Gartner research reports and IBM’s point of view on ModelOps.
