AI governance use case

To drive responsible, transparent, and explainable AI workflows, your enterprise needs an integrated system for tracking, monitoring, and retraining AI models. Cloud Pak for Data provides the processes and technologies to enable your enterprise to monitor, maintain, automate, and govern machine learning and AI models in production.

Watch this video to see the data fabric use case for implementing an AI governance solution in Cloud Pak for Data.

This video provides a visual method as an alternative to following the written steps in this documentation.

Challenges

Establishing AI governance solutions for enterprises involves tackling these challenges:

Ensuring model governance and compliance
Organizations need to track and document the detailed history of models to ensure compliance and to provide visibility to all stakeholders.
Managing risk and ensuring responsible AI
Organizations need to monitor models in production to ensure that the models are valid and accurate, and that they are not introducing bias or drifting away from the intended goals.
Operationalizing the model lifecycle
Organizations need to implement repeatable processes to efficiently retrain and deploy models to production environments.

You can solve these challenges by implementing an AI governance lifecycle with data fabric on Cloud Pak for Data.


Example: Golden Bank's challenges

Follow the story of Golden Bank as it implements an AI governance process to ensure that its new online application process is compliant and explainable. Business analysts at Golden Bank need to review model information to ensure compliance, certify model progress from development to production, and generate reports to share or archive.

Process

To implement AI governance for your enterprise, your organization can follow this process:

  1. Track models
  2. Monitor deployed models
  3. Automate the ML lifecycle

The Watson Studio, Watson Machine Learning, Watson OpenScale, Watson Knowledge Catalog, Watson Pipelines, AI Factsheets, and IBM OpenPages services in Cloud Pak for Data provide all of the tools and processes that your organization needs to implement an AI governance solution.

Image showing the flow of the AI governance use case

1. Track models

Your team can track your machine-learning models from request to production and evaluate whether the models comply with your organization's regulations and requirements.

What you can use: Factsheets
What you can do:
In the model inventory in a catalog in Watson Knowledge Catalog, create a use case for a new model.
View lifecycle status for all of the registered assets and drill down to detailed factsheets for models or deployments that are registered to the model use case.
View general model details, training information and metrics, and input and output schema.
View general deployment details, evaluation details, quality metrics, fairness details, and drift details.
Best to use when:
You need to request a new model from your data science team.
You want to make sure that your model is compliant and performing as expected.
You want to determine whether you need to update a model based on tracking data.
You want to run reports on a model to share or preserve details.

What you can use: IBM OpenPages
What you can do:
Identify, manage, monitor, and report on risk and regulatory compliance.
Best to use when:
You want an integrated approach to gathering and reporting model facts.

Example: Golden Bank's model tracking

Business analysts at Golden Bank request a "Mortgage Approval Model". They can then track the model through all stages of the AI lifecycle as data scientists build and train the model and ModelOps engineers deploy and evaluate it. Factsheets document details about the model history and capture metrics that show its performance.
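
Model tracking is configured from the model inventory user interface, but teams that train models in notebooks can also capture training facts programmatically with the AI Factsheets Python client. The following sketch is illustrative only and assumes the ibm-aigov-facts-client package; the API key, project ID, experiment name, and the scikit-learn training step are placeholder assumptions rather than Golden Bank's actual setup.

```python
# Minimal sketch: capture training facts for a model such as the
# "Mortgage Approval Model" with the AI Factsheets Python client.
# The credentials, project ID, and experiment name are placeholders.
from ibm_aigov_facts_client import AIGovFactsClient

facts_client = AIGovFactsClient(
    api_key="<IBM_CLOUD_API_KEY>",        # placeholder credential
    experiment_name="mortgage-approval",  # hypothetical experiment name
    container_type="project",             # capture facts in a project
    container_id="<PROJECT_ID>",          # placeholder project ID
)

# With the client initialized, supported frameworks such as scikit-learn
# are auto-logged (the default in recent client versions): training
# parameters and metrics are captured as facts that appear on the model's
# factsheet after the model is saved and registered to its model use case.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=8, random_state=42)
model = LogisticRegression(max_iter=200).fit(X, y)  # facts captured on fit
```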


2. Monitor deployed models

After models are deployed, it is important to govern and monitor them to make sure that they are explainable and transparent. Data scientists must be able to explain how the models arrive at certain predictions so that they can determine whether the predictions have any implicit or explicit bias. In addition, it's a best practice to watch for model performance and data consistency issues during the lifecycle of the model.

What you can use: Watson OpenScale
What you can do:
Monitor model fairness issues across multiple features.
Monitor model performance and data consistency over time.
Explain how the model arrived at certain predictions with weighted factors.
Maintain and report on model governance and lifecycle across your organization.
Best to use when:
You have features that are protected or that might contribute to prediction fairness.
You want to trace model performance and data consistency over time.
You want to know why the model gives certain predictions.

Example: Golden Bank's model monitoring

Data scientists at Golden Bank use Watson OpenScale to monitor the deployed "Mortgage Approval Model" to ensure that it is accurate and treating all Golden Bank mortgage applicants fairly. They run a notebook to set up monitors for the model and then tweak the configuration by using the Watson OpenScale user interface. Using metrics from the Watson OpenScale quality monitor and fairness monitor, the data scientists determine how well the model predicts outcomes and if it produces any biased outcomes. They also get insights for how the model comes to decisions so that the decisions can be explained to the mortgage applicants.
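
A monitor-setup notebook like the one the Golden Bank data scientists run typically uses the Watson OpenScale Python SDK. The sketch below is a minimal illustration, not the bank's actual configuration: the API key, data mart ID, subscription ID, feature names, classes, and thresholds are all placeholder assumptions.

```python
# Minimal sketch: configure fairness and quality monitors for a deployed
# model subscription with the Watson OpenScale Python SDK.
# All IDs, feature names, classes, and thresholds below are placeholders.
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from ibm_watson_openscale import APIClient
from ibm_watson_openscale.base_classes.watson_open_scale_v2 import Target
from ibm_watson_openscale.supporting_classes.enums import TargetTypes

wos_client = APIClient(authenticator=IAMAuthenticator(apikey="<IBM_CLOUD_API_KEY>"))

# The target is the OpenScale subscription for the deployed model.
target = Target(
    target_type=TargetTypes.SUBSCRIPTION,
    target_id="<SUBSCRIPTION_ID>",
)

# Fairness monitor: watch a protected attribute for biased outcomes.
fairness_monitor = wos_client.monitor_instances.create(
    data_mart_id="<DATA_MART_ID>",
    monitor_definition_id=wos_client.monitor_definitions.MONITORS.FAIRNESS.ID,
    target=target,
    parameters={
        "features": [
            {"feature": "Gender", "majority": ["Male"], "minority": ["Female"], "threshold": 0.95}
        ],
        "favourable_class": ["Approved"],
        "unfavourable_class": ["Denied"],
        "min_records": 100,
    },
    background_mode=False,
).result

# Quality monitor: evaluate accuracy against labeled feedback data.
quality_monitor = wos_client.monitor_instances.create(
    data_mart_id="<DATA_MART_ID>",
    monitor_definition_id=wos_client.monitor_definitions.MONITORS.QUALITY.ID,
    target=target,
    parameters={"min_feedback_data_size": 100},
    background_mode=False,
).result
```

After the monitors run, the same metrics are visible in the Watson OpenScale user interface, where the configuration can be tuned further.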


3. Automate the ML lifecycle

Your team can automate and simplify the MLOps and AI lifecycle with Watson Pipelines.

What you can use: Watson Pipelines
What you can do:
Use pipelines to create repeatable and scheduled flows that automate machine learning pipelines, from data ingestion to model training, testing, and deployment.
Best to use when:
You want to automate some or all of the steps in an MLOps flow.

Example: Golden Bank's automated ML lifecycle

The data scientists at Golden Bank use pipelines to automate their complete AI governance lifecycle and simplify the model retraining process.
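
Watson Pipelines flows are assembled on a graphical canvas, so the retraining logic itself usually lives in a notebook or script that a pipeline node runs. The sketch below shows what such a retrain-and-redeploy step might look like with the Watson Machine Learning Python client; it is not the Pipelines API itself, and the credentials, space ID, software specification, framework version, and synthetic training data are placeholder assumptions.

```python
# Minimal sketch of a retrain-and-redeploy step that a pipeline node might
# run as a notebook job, using the Watson Machine Learning Python client.
# Credentials, space ID, runtime names, and asset names are placeholders;
# on-premises Cloud Pak for Data clusters need additional credential fields.
from ibm_watson_machine_learning import APIClient
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

wml_client = APIClient({"url": "https://<CPD_OR_CLOUD_HOST>", "apikey": "<API_KEY>"})
wml_client.set.default_space("<DEPLOYMENT_SPACE_ID>")

# Retrain on the latest data (synthetic data stands in for the real feed).
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
model = LogisticRegression(max_iter=500).fit(X, y)

# Store the new model version in the deployment space ...
model_details = wml_client.repository.store_model(
    model=model,
    meta_props={
        wml_client.repository.ModelMetaNames.NAME: "Mortgage Approval Model",
        wml_client.repository.ModelMetaNames.TYPE: "scikit-learn_1.1",  # assumed framework version
        wml_client.repository.ModelMetaNames.SOFTWARE_SPEC_UID:
            wml_client.software_specifications.get_uid_by_name("runtime-22.2-py3.10"),  # assumed runtime
    },
)
model_uid = wml_client.repository.get_model_id(model_details)

# ... and create an online deployment for the new version.
deployment = wml_client.deployments.create(
    artifact_uid=model_uid,
    meta_props={
        wml_client.deployments.ConfigurationMetaNames.NAME: "Mortgage Approval Model deployment",
        wml_client.deployments.ConfigurationMetaNames.ONLINE: {},
    },
)
```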

Tutorials for AI governance

Tutorial: Build and deploy a model
Description: Train a model, promote it to a deployment space, and deploy the model.
Expertise for tutorial: Run a notebook.

Tutorial: Test and validate a model
Description: Evaluate a model for accuracy, fairness, and explainability.
Expertise for tutorial: Run a notebook, and view results in the user interface.

Learn more