March 30, 2021 By Julianna Delua 4 min read

As an increasing number of organizations drive AI-powered digital transformation, several key trends in operationalizing AI are emerging. Growth leaders are separating themselves from growth laggards by using AI and machine learning (ML) in modern application development. Below are some statistics provided by 451 Research:

  • Leaders invest in models for digital transformation: More than half of digital transformation leaders have adopted ML, compared with less than 25 percent of laggards. Furthermore, 62 percent of enterprises are developing their own models.
  • Prevalence of DevOps increases the demand for automation: 94 percent of enterprise companies have now adopted DevOps. Models are becoming integral to the development of enterprise apps—requiring continuous, synchronized and automated development and deployment lifecycles.
  • Data science and DevOps/app teams collaborate more: In 33 percent of enterprises, the data science/data analytics team is the primary DevOps stakeholder.

An increasing number of application developers are becoming interested in data science and AI, and many have already learned the fundamentals of data science. Business executives are keen on embedding prediction into the business, optimizing operations and using automation to augment human capital while enabling their employees to do more with less. However, deploying models into operational systems is a well-known barrier to success. One area of investment for tackling this barrier is aligning the cadence of getting a model into production (ModelOps) with the cadence of getting an app into production (DevOps).

Intelligent automation can play a pivotal role in aligning model and app cadences. Having established that AutoAI helps both beginners and expert data scientists streamline model development, I’d like to discuss how AutoAI increases the yield on model and app investments and orchestrates ModelOps with DevOps.

Automated AI lifecycle helps your models produce better outcomes with efficiency and repeatability

AI development has a full lifecycle that starts at ideation and ends with the monitoring of models in production. Lifecycle stages include data exploration and preparation, model development and deployment, and optimization and monitoring with a feedback loop. Data scientists, business analysts, data engineers and subject matter experts are key players in this lifecycle. What’s new is that DevOps teams are playing a larger role. In particular, growth leaders are now feeding the models produced from this lifecycle to DevOps to drive greater results at scale.
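To make the hand-off between these stages concrete, here is a minimal sketch of the lifecycle expressed as an orchestration loop with a retraining feedback signal. Every stage function in it is a hypothetical stub for illustration only; it is not the API of any specific product.

```python
# A minimal sketch of the AI lifecycle as an orchestration loop.
# All stage functions are hypothetical stubs, not a specific product API.

def prepare_data():
    # Data exploration and preparation
    return [{"feature": 1.0, "label": 0}, {"feature": 2.0, "label": 1}]

def train_model(dataset):
    # Model development (stubbed as a trivial threshold "model")
    threshold = sum(row["feature"] for row in dataset) / len(dataset)
    return lambda x: int(x > threshold)

def deploy_model(model):
    # Deployment: hand a scoring callable to the app/DevOps side
    return model

def monitor_model(endpoint, live_data):
    # Optimization and monitoring: score live traffic and compute a toy
    # drift signal against the training distribution the model assumed
    positive_rate = sum(endpoint(x) for x in live_data) / len(live_data)
    drift = abs(sum(live_data) / len(live_data) - 1.5)
    return {"drift": drift, "positive_rate": positive_rate}

def lifecycle_iteration(live_data, drift_threshold=1.0):
    dataset = prepare_data()
    endpoint = deploy_model(train_model(dataset))
    metrics = monitor_model(endpoint, live_data)
    # Feedback loop: flag the model for retraining when quality degrades
    return metrics["drift"] > drift_threshold

if __name__ == "__main__":
    print("retrain needed:", lifecycle_iteration(live_data=[4.0, 5.0]))
```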

AutoAI was designed to automate the more tedious, repetitive and time-consuming aspects of data science so that data scientists can concentrate on the parts of the lifecycle where they can make the most innovative contributions. AutoAI also helps those who are just starting out with data science to build models quickly and easily, and beginners can examine how the models are built and how the pipelines are generated. With experts and beginners both contributing, businesses can deliver better outcomes through fine-tuned prediction, optimization and automation.
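AutoAI’s own interfaces are product-specific, but the kind of tedium it removes can be illustrated with a generic stand-in: an automated search over candidate preprocessing steps, algorithms and hyperparameters. The sketch below uses scikit-learn purely as that stand-in; it is not AutoAI’s API, and the candidate estimators and parameter ranges are arbitrary choices for illustration.

```python
# Generic illustration of automated pipeline search (a stand-in for the
# kind of work AutoAI automates; this is not the AutoAI API itself).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# One pipeline skeleton; the search swaps estimators and hyperparameters
# automatically instead of a person hand-tuning each combination.
pipeline = Pipeline([("scale", StandardScaler()),
                     ("model", LogisticRegression(max_iter=5000))])
search_space = [
    {"model": [LogisticRegression(max_iter=5000)],
     "model__C": [0.1, 1.0, 10.0]},
    {"model": [RandomForestClassifier(random_state=0)],
     "model__n_estimators": [100, 300]},
]

search = GridSearchCV(pipeline, search_space, cv=5, scoring="roc_auc")
search.fit(X, y)
print("best pipeline:", search.best_params_)
print("cross-validated AUC:", round(search.best_score_, 3))
```

A tool like AutoAI performs this kind of search at much larger scale, including data preparation and feature engineering, and surfaces the generated pipelines so that both beginners and experts can inspect them.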

Continuously optimized models are better suited for collaborating with DevOps

In the application lifecycle, an app is born from an idea. After that, the development and design teams work with stakeholders to characterize a day in the life of an end user and determine how to help them solve problems and achieve better results. Once this vision is in place, the app moves into analysis, design and prototyping as the development team explores how it should work. After that come coding and unit testing, user and system testing, publishing and deployment. There will be periodic updates, adjustments for changes in the business and opportunities to address user feedback. AI and ML models can take these dynamic interactions into account and present targeted offers that are tailored to each user.

Automation is already making an impact on the application lifecycle through continuous integration, low-code and no-code app development and more. Seasoned application developers can focus on designing innovative solutions without the tedium of hand-coding or struggling to integrate an app with operations, and beginners can design and prototype quickly—without a lot of coding experience. What’s needed is to find a way to integrate AI models into these automated, continuous-integration streams without disrupting them.
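One practical pattern, sketched below under stated assumptions, is a quality gate that the continuous-integration pipeline runs before a newly trained model is promoted: the candidate is scored on a holdout set and the deployment stage proceeds only if it at least matches the production baseline. The artifact paths, metric and threshold here are illustrative assumptions, not a prescribed setup.

```python
# Hypothetical CI quality gate: promote a candidate model only if it matches
# or beats the production baseline on a holdout set. Paths, metric and
# threshold are illustrative assumptions, not a prescribed setup.
import json
import sys

import joblib
from sklearn.metrics import roc_auc_score

HOLDOUT_FEATURES = "holdout_X.joblib"      # assumed artifacts produced by an
HOLDOUT_LABELS = "holdout_y.joblib"        # earlier training stage of the pipeline
CANDIDATE_MODEL = "candidate_model.joblib"
BASELINE_METRICS = "production_metrics.json"
MIN_IMPROVEMENT = 0.0                      # require at least parity with production

def main() -> int:
    X = joblib.load(HOLDOUT_FEATURES)
    y = joblib.load(HOLDOUT_LABELS)
    candidate = joblib.load(CANDIDATE_MODEL)

    candidate_auc = roc_auc_score(y, candidate.predict_proba(X)[:, 1])
    with open(BASELINE_METRICS) as f:
        baseline_auc = json.load(f)["auc"]

    print(f"candidate AUC={candidate_auc:.3f}, production AUC={baseline_auc:.3f}")
    if candidate_auc >= baseline_auc + MIN_IMPROVEMENT:
        return 0   # CI continues to the deployment stage
    return 1       # non-zero exit fails the job; the production model is kept

if __name__ == "__main__":
    sys.exit(main())
```

Because the gate is just another step with a pass/fail exit code, it slots into an existing continuous-integration stream without changing how the app itself is built and released.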

Synchronizing ModelOps and DevOps opens up new opportunities

Undoubtedly, there is a strong business case for investing in aligning models and apps. Data scientists use ModelOps. Developers use DevOps. How can the two be synchronized?

ModelOps is where data science meets production IT, and where business value is created. Establishing ModelOps can make the injection of models into apps a better-tuned, repeatable and successful process. Models have traditionally been deployed in a one-off fashion, and data scientists and data engineers often lack the skills to operationalize models. Application integration, model monitoring and tuning, and workflow automation can be afterthoughts. This is why it makes sense to bring model and app development together on a data and AI platform where collective assets and intelligence can be harnessed.

Automation brings data, models and apps together while unleashing data and app talent

Powered by AutoAI, IBM Cloud Pak for Data is ideal for implementing and integrating ModelOps and DevOps. It enables models to be pushed from a data science team to the DevOps team in a regular deployment and update cycle, aligned with continuous integration and deployment to suit business needs. Powered by Watson Studio, Watson Machine Learning and Watson OpenScale, and open by design, Cloud Pak for Data integrates with cloud-native apps and allows you to build and scale AI while keeping it explainable.

AutoAI facilitates collaboration between the data science team and DevOps and app developers, and it reduces the complexity of deploying and optimizing models in production. If you’re in DevOps and app development, you can take the REST API endpoint from Watson Machine Learning and deploy the model while getting increased visibility into usage statistics, model status, and KPIs. Developers can set up the API connection to send more information for scoring and prediction into apps.
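As a concrete illustration, the sketch below sends a scoring request to a Watson Machine Learning online deployment over REST. The API key, deployment ID, region and field names are placeholders to replace with values from your own deployment; the token exchange and predictions endpoint follow the publicly documented v4 API, so check the current documentation for the version date appropriate to your instance.

```python
# Minimal sketch of scoring against a Watson Machine Learning online
# deployment. The API key, deployment ID, region and field names are
# placeholders for your own deployment.
import requests

API_KEY = "<your IBM Cloud API key>"
DEPLOYMENT_ID = "<your deployment id>"
REGION = "us-south"        # adjust to your service region
VERSION = "2021-03-30"     # API version date; see the current WML docs

# Exchange the API key for an IAM bearer token.
token_response = requests.post(
    "https://iam.cloud.ibm.com/identity/token",
    data={"grant_type": "urn:ibm:params:oauth:grant-type:apikey",
          "apikey": API_KEY},
)
token = token_response.json()["access_token"]

# Send feature values to the deployment's predictions endpoint.
scoring_url = (f"https://{REGION}.ml.cloud.ibm.com/ml/v4/deployments/"
               f"{DEPLOYMENT_ID}/predictions?version={VERSION}")
payload = {"input_data": [{
    "fields": ["feature_1", "feature_2"],   # your model's input schema
    "values": [[0.42, 3.14]],
}]}
response = requests.post(scoring_url, json=payload,
                         headers={"Authorization": f"Bearer {token}"})
print(response.json())   # predictions (and probabilities) in the v4 response format
```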

More ways to learn about AI and put AutoAI to work

This is just one example of how your enterprise can employ AutoAI to accelerate growth with data science and AI. Discover additional ways in our “10 ways to use AutoAI” eBook. You can also get many tips by viewing on-demand webinars in our 3-part Winning with AI series.

Be sure to see our webinar Automated AI lifecycle management and ModelOps for your cloud native applications. It focuses on syncing ModelOps and DevOps and features presenters Matt Aslett of 451 Research and Ruchir Puri, chief scientist of IBM Research.

You can learn more about ModelOps for operationalizing AI and explainable AI for managing and monitoring models. Or, watch the on-demand data science and AI webinar series to dig deeper. Don’t miss our newsletter with two complimentary Gartner research reports and IBM’s point of view on ModelOps.
