
Automate and Operationalize Your AI with AI OpenScale


Exciting announcements for AI OpenScale

Today, we are taking a major step forward in helping enterprises automate and operationalize AI across its lifecycle with the launch of the Standard Plan for AI OpenScale on IBM Cloud, along with other updates to the offering. In addition, AI OpenScale is now also available for deployment in the cloud or on premises with IBM Cloud Private for Data.

Artificial Intelligence (AI) is a key component of digital transformation across enterprises of all sizes. By 2019, 40% of digital transformation initiatives will use AI services, and by 2021, 75% of enterprise applications will use AI. For enterprises that have successfully adopted AI and integrated it into their business processes, the return on investment is transformational. However, not all enterprises are successful in getting a desirable return on investment from AI because IT and business leaders struggle with operationalizing AI in applications. This is largely due to a lack of business confidence in AI and an inability to support the new processes and skills required to continuously maintain AI performance.

Benefits of AI OpenScale

IBM AI OpenScale is an open environment that enables organizations to automate and operationalize their AI:

  1. Open by design: AI OpenScale allows monitoring and management of ML and DL models in production, built and deployed on any model-hosting engine, whether in the cloud or on premises. It also supports popular open source ML and DL frameworks. Together with Watson Studio and Watson Machine Learning, IBM’s premium data science offerings, AI OpenScale enables end-to-end machine learning, and it can be deployed on IBM Cloud or IBM Cloud Private for Data.
  2. Drive fairer outcomes in production models: AI OpenScale detects bias in both build-time and runtime data to highlight fairness issues. It provides a plain-text explanation of the data ranges affected by bias in the model, helping data scientists and business users understand the impact on business outcomes, and it shows a graphical view of the runtime data distribution. When bias is detected during inference, AI OpenScale automatically de-biases the outcomes using a companion model that runs alongside your deployed model, previewing the expected fairer outcomes without replacing the original model. It also compares fairness and accuracy metrics before and after de-biasing, which helps data scientists and line-of-business owners make an informed decision about deploying the de-biased model to production (a configuration sketch follows this list).
        (Screenshots: bias detection at runtime; de-biased outcomes at runtime)
  3. Explain transactions in production: AI OpenScale helps enterprises bring transparency and auditability to AI-infused applications by generating explanations for individual scored transactions, including the attributes used to make the prediction and the weight of each attribute. This supports enterprises in highly regulated industries, such as finance and healthcare, that face considerable risk and governance requirements. In addition to generating explanations post hoc, it also describes the conditions under which the prediction would have changed, which can help companies upsell to end customers in a customer care scenario or give an auditor a clearer picture (explanations can also be requested programmatically, as in the sketch after this list).
  4. Automate the creation of AI: Neural Network Synthesis (NeuNetS), available in this update as a beta, synthesizes neural networks by designing a custom architecture for a given data set. In the beta, NeuNetS supports image and text classification models. NeuNetS reduces the time and lowers the skill barrier required to design and train custom neural networks, putting neural networks within reach of non-technical subject matter experts while also making data scientists more productive. Even experienced data scientists typically spend a long time designing, writing, and tuning neural networks, including several training runs on expensive GPUs. With NeuNetS, users access the service through Watson Studio, upload their datasets, and get a fully trained network within hours instead of weeks, at a fraction of the training cost. Once trained, the neural network model can be deployed to Watson Machine Learning in a single click.
    You can read more about NeuNetS here.
    (Screenshots: upload datasets through an intuitive UI; NeuNetS displays useful metrics once training is complete)
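
As a rough illustration of how the fairness monitoring and transaction explanations described in items 2 and 3 can be driven programmatically, here is a minimal sketch that assumes the ibm-ai-openscale Python client. The class and method names used here (APIClient, data_mart.subscriptions.get, fairness_monitoring.enable, explainability.run, the Feature helper) and all credentials, identifiers, feature names, and thresholds are illustrative assumptions and may not match the shipping SDK exactly.

    # Hypothetical sketch: enable fairness monitoring and request an explanation
    # for a scored transaction. Names and signatures are illustrative assumptions,
    # not a verbatim copy of the AI OpenScale client API.
    from ibm_ai_openscale import APIClient                    # assumed client entry point
    from ibm_ai_openscale.supporting_classes import Feature   # assumed fairness helper

    aios_credentials = {
        "instance_guid": "<AI_OPENSCALE_GUID>",               # placeholder credentials
        "apikey": "<IBM_CLOUD_API_KEY>",
        "url": "https://api.aiopenscale.cloud.ibm.com",
    }

    client = APIClient(aios_credentials)

    # Look up the subscription that tracks an already-deployed model.
    subscription = client.data_mart.subscriptions.get("<SUBSCRIPTION_UID>")

    # Item 2: watch a protected attribute and flag outcomes that drift past a
    # fairness threshold; AI OpenScale then scores a companion, de-biased model.
    subscription.fairness_monitoring.enable(
        features=[Feature("Sex", majority=["male"], minority=["female"], threshold=0.95)],
        favourable_classes=["No Risk"],
        unfavourable_classes=["Risk"],
        min_records=200,
    )

    # Item 3: generate a post-hoc explanation for one scored transaction, listing
    # the attributes that drove the prediction and their weights.
    subscription.explainability.enable()
    explanation = subscription.explainability.run(transaction_id="<TRANSACTION_ID>")
    print(explanation)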

Key features

AI OpenScale provides an intuitive UI and rich programmatic interface to configure, monitor, and manage ML or DL models in production.
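
To give a sense of that programmatic interface, the sketch below puts an existing Watson Machine Learning deployment under management, again assuming the ibm-ai-openscale Python client. The names used here (APIClient, data_mart.bindings.add, data_mart.subscriptions.add, the WatsonMachineLearningInstance and WatsonMachineLearningAsset helpers) and the credential shapes are illustrative assumptions rather than a verbatim API reference; other model-hosting engines follow the same bind-then-subscribe pattern.

    # Hypothetical sketch: register a deployed model with AI OpenScale so its
    # scoring payloads are captured and monitored. Names and signatures are
    # illustrative assumptions, not the exact client API.
    from ibm_ai_openscale import APIClient
    from ibm_ai_openscale.engines import (
        WatsonMachineLearningInstance,   # assumed binding helper
        WatsonMachineLearningAsset,      # assumed asset wrapper
    )

    aios_credentials = {"instance_guid": "<AI_OPENSCALE_GUID>",
                        "apikey": "<IBM_CLOUD_API_KEY>",
                        "url": "https://api.aiopenscale.cloud.ibm.com"}
    wml_credentials = {"instance_id": "<WML_INSTANCE_GUID>",
                       "apikey": "<IBM_CLOUD_API_KEY>",
                       "url": "https://us-south.ml.cloud.ibm.com"}

    client = APIClient(aios_credentials)

    # Bind the model-hosting engine so AI OpenScale can see its deployments.
    binding_uid = client.data_mart.bindings.add(
        "WML production instance", WatsonMachineLearningInstance(wml_credentials))

    # Subscribe one deployed model; its requests and responses are then logged
    # to the data mart for accuracy and fairness monitoring.
    subscription = client.data_mart.subscriptions.add(
        WatsonMachineLearningAsset("<MODEL_UID>"))
    print(subscription.get_details())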

The AI Ops Console provides a summary view of all the deployments being managed and helps an AI Ops engineer decide which deployments need attention. It shows a quick view of accuracy and fairness metrics; accuracy is calculated from model performance metrics defined by users. These metrics are also accessible through an open data mart, so they can be combined with external application metrics for custom reporting on business KPIs.
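
Because the data mart is an open database that you provision yourself, those combined reports can be built with plain SQL. The sketch below assumes a PostgreSQL data mart and uses hypothetical table and column names (fairness_metrics, business_kpis, and their fields), which will differ in a real instance; it simply joins daily model fairness scores with an external application KPI.

    # Hypothetical sketch: combine AI OpenScale metrics from the open data mart
    # with an external business KPI for custom reporting. Table and column names
    # are illustrative assumptions.
    import psycopg2

    conn = psycopg2.connect(host="<DATAMART_HOST>", dbname="<DATAMART_DB>",
                            user="<DB_USER>", password="<DB_PASSWORD>")

    QUERY = """
    SELECT f.deployment_id,
           date_trunc('day', f.measured_at) AS day,
           avg(f.fairness_score)            AS avg_fairness,
           avg(k.conversion_rate)           AS avg_conversion
    FROM   fairness_metrics f
    JOIN   business_kpis    k
           ON  k.deployment_id = f.deployment_id
           AND date_trunc('day', k.measured_at) = date_trunc('day', f.measured_at)
    GROUP  BY f.deployment_id, day
    ORDER  BY day;
    """

    with conn, conn.cursor() as cur:
        cur.execute(QUERY)
        for deployment_id, day, avg_fairness, avg_conversion in cur.fetchall():
            print(f"{deployment_id} {day:%Y-%m-%d}: "
                  f"fairness={avg_fairness:.2f}, conversion={avg_conversion:.2%}")

    conn.close()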

Clicking into one of the deployments in the AI Ops Console shows more detail about the model: how many transactions are being scored per minute, their accuracy levels, and the different attributes being tracked to measure fairness.

With AI OpenScale, we are excited about the opportunity to help enterprises scale adoption of AI in mission-critical applications, and we look forward to getting your thoughts and feedback.

 

Offering Manager – AI OpenScale
