Managing the AI Lifecycle with ModelOps
Use the ModelOps tools to manage your AI assets from development to production.
ModelOps explained
ModelOps synchronizes cadences between the application and model pipelines. It builds on these practices:
- MLOps for bringing a machine learning model from creation, through training, to deployment, and into production.
- ModelOps for managing the lifecycle of a traditional machine learning model, including evaluation and retraining.
ModelOps includes not just the routine deployment of machine learning models but also the continuous retraining, automated updating, and synchronized development and deployment of more complex machine learning models. Explore these resources for more details on developing a ModelOps strategy:
- Data science and MLOps use case describes how to manage data, model building and deployment, and evaluate model fairness and performance.
- AI Governance use case provides context for how ModelOps can mesh with AI Governance to provide a comprehensive plan for tracking machine learning assets in your organization.
ModelOps tools
Depending on the platform you are using and the services you have enabled, you can design your ModelOps process using a combination of tools to help you manage assets.
- Pipelines for automating the end-to-end flow of a machine learning model through the AI lifecycle.
- AI Governance for creating a centralized repository of factsheets that track the lifecycle of a model, including the request, building, deployment, and evaluation of AI assets.
- The cpdctl command-line interface tool for managing and automating your machine learning assets that are hosted on Cloud Pak for Data as a Service. Use automatic configuration from IBM Cloud to easily connect with the cpdctl API commands.
Managing access with deployment spaces
Use deployment spaces to organize and manage access to assets as they move through the AI lifecycle. For example, you can manage access with deployment spaces in the following ways:
- Create a deployment space and assign it to Development as the deployment stage. If you are governing assets, deployments in this type of space display in the Develop stage of a use case. Assign access to the data scientists to create the assets or DevOps users to create deployments.
- Create a deployment space and assign it to Testing as the deployment stage. If you are governing assets, deployments in this type of space display in the Validate stage of a use case. Assign access to the model validators to test the deployments.
- Create a deployment space and assign it to Production as the deployment stage. If you are governing assets, deployments in this type of space display in the Operate stage of a use case. Limit access to this space to ModelOps users who manage the assets that are deployed to a production environment.
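The three stage assignments above amount to metadata attached to each space. A minimal sketch of building that metadata in Python (the payload keys and space names here are illustrative assumptions, not the authoritative spaces API schema; with the real ibm_watsonx_ai client, a payload like this would be passed to a call such as `client.spaces.store`):

```python
# Illustrative sketch: build metadata for deployment spaces at each
# lifecycle stage. The "stage" structure mirrors the Development/
# Testing/Production assignment described above; the exact keys are
# an assumption for illustration.

def space_meta(name, stage, description=""):
    """Return a metadata dict for a deployment space at a given stage."""
    assert stage in {"development", "testing", "production"}
    return {
        "name": name,
        "description": description,
        "stage": {"name": stage, "production": stage == "production"},
    }

dev = space_meta("churn-dev", "development", "Data scientists build assets here")
prod = space_meta("churn-prod", "production", "Restricted to ModelOps users")
# With the real client, each dict would be passed to client.spaces.store(...)
```

Keeping the stage in the space metadata is what lets governed deployments surface in the matching Develop, Validate, or Operate stage of a use case.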
Automating ModelOps by using Pipelines
The IBM Orchestration Pipelines editor provides a graphical interface for orchestrating an end-to-end flow of assets from creation through deployment. Assemble and configure a pipeline to create, train, deploy, and update machine learning models and Python scripts. Make your ModelOps process simpler and repeatable.
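Conceptually, a pipeline chains discrete, repeatable steps so that the same flow can be rerun whenever data or models change. A plain-Python sketch of that idea (the step functions are hypothetical stand-ins for pipeline nodes, not the Orchestration Pipelines API):

```python
# Minimal sketch of an end-to-end flow as a chain of repeatable steps.
# Each function is a hypothetical placeholder for a pipeline node
# (data preparation, training, deployment).

def prepare_data(raw):
    return [x for x in raw if x is not None]        # drop missing values

def train(data):
    # Toy "model": predicts the mean of the training data
    return {"model": "mean-predictor", "value": sum(data) / len(data)}

def deploy(model):
    return {"deployment_id": "dep-001", "model": model, "status": "ready"}

def run_pipeline(raw):
    """Run the steps in order, passing each step's output to the next."""
    return deploy(train(prepare_data(raw)))

result = run_pipeline([1, 2, None, 3])
# result["status"] == "ready"; result["model"]["value"] == 2.0
```

The value of the graphical editor is that these steps, their parameters, and their error handling are declared once and then rerun on demand or on a schedule.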
Tracking models with AI Factsheets
AI Factsheets provides the capabilities for you to track data science models across the organization and store the details in a catalog. View at a glance which models are in production and which need development or validation. Use the governance features to establish processes that manage the communication flow from data scientists to ModelOps administrators.
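The at-a-glance view described above is essentially an inventory of models keyed by lifecycle stage. A minimal sketch of such a tracking record in plain Python (the record shape and stage names are assumptions for illustration, not the Factsheets data model):

```python
from dataclasses import dataclass, field

# Hypothetical lifecycle stages, loosely mirroring a use case's
# Develop / Validate / Operate progression.
STAGES = ("develop", "validate", "operate")

@dataclass
class ModelRecord:
    """Illustrative lifecycle record, not the Factsheets schema."""
    name: str
    stage: str = "develop"
    history: list = field(default_factory=list)

    def advance(self, note=""):
        """Move the model to the next stage and log the transition."""
        i = STAGES.index(self.stage)
        if i + 1 >= len(STAGES):
            raise ValueError(f"{self.name} is already in the final stage")
        self.history.append((self.stage, note))
        self.stage = STAGES[i + 1]

record = ModelRecord("churn-model")
record.advance("handed to validators")
# record.stage == "validate"; the history preserves the audit trail
```

An auditable history of who moved a model between stages, and why, is the core of the communication flow between data scientists and ModelOps administrators.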
Evaluating model deployments
Use Watson OpenScale to analyze your AI with trust and transparency and understand how your AI models are involved in decision making. Detect and mitigate bias and drift. Increase the quality and accuracy of your predictions. Explain transactions and perform what-if analysis.
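To illustrate the kind of fairness check involved, the disparate impact ratio compares favorable-outcome rates between a monitored group and a reference group; a value well below 1.0 signals potential bias. A self-contained sketch (the groups, data, and the 0.8 threshold mentioned in the comment are assumptions for the example, not OpenScale defaults):

```python
def disparate_impact(outcomes, groups, monitored, reference):
    """Ratio of favorable-outcome rates: monitored group / reference group.

    outcomes: list of 1 (favorable) / 0 (unfavorable) predictions
    groups:   parallel list of group labels, one per prediction
    """
    def rate(g):
        favorable = [o for o, grp in zip(outcomes, groups) if grp == g]
        return sum(favorable) / len(favorable)
    return rate(monitored) / rate(reference)

outcomes = [1, 0, 0, 1, 1, 1, 0, 1]
groups   = ["a", "a", "a", "a", "b", "b", "b", "b"]
ratio = disparate_impact(outcomes, groups, monitored="a", reference="b")
# rate(a) = 0.5, rate(b) = 0.75, so ratio is about 0.67; below a common
# 0.8 threshold this would flag the deployment for bias review.
```

A production monitor computes this kind of metric continuously over scoring traffic rather than over a fixed batch, and pairs it with drift and quality checks.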
Automating asset and lifecycle management
You can automate the AI lifecycle in a notebook by using the watsonx.ai Runtime Python client.
This sample notebook demonstrates how to:
- Download an externally trained scikit-learn model and its data set
- Persist an external model in the watsonx.ai Runtime repository
- Deploy a model for online scoring by using the client library
- Score sample records by using the client library
- Update a previously persisted model
- Redeploy a model in-place
- Scale a deployment
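A central detail in the scoring steps above is the payload shape that online deployments expect: a list of `fields` and row-oriented `values` wrapped in `input_data`. A minimal sketch that builds such a payload (the column names are hypothetical; with the real client, the payload would be sent with a call such as `client.deployments.score`):

```python
# Illustrative sketch: build a scoring payload in the fields/values
# shape used for online scoring. The column names are invented for
# the example; they must match the deployed model's input schema.

def build_scoring_payload(fields, rows):
    """Wrap tabular rows in the input_data structure for online scoring."""
    assert all(len(row) == len(fields) for row in rows), "row/field mismatch"
    return {"input_data": [{"fields": fields, "values": rows}]}

payload = build_scoring_payload(
    fields=["age", "income", "tenure_months"],
    rows=[[42, 55000, 18], [31, 48000, 6]],
)
# With the real client: client.deployments.score(deployment_id, payload)
```

Validating the row/field alignment before sending avoids the most common scoring error, a payload whose columns do not match the model schema.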
Alternatively, you can use the cpdctl command-line interface tool to manage configuration settings and automate an end-to-end flow. This end-to-end flow includes training a model, saving it, creating a deployment space, and deploying the model.
Typical ModelOps scenario
A typical ModelOps scenario in Cloud Pak for Data might be:
- Organize and curate data assets
- Train a model by using AutoAI
- Save and deploy the model
- Track the model in a use case so that all collaborators can follow its progress through the lifecycle and confirm that it complies with organizational standards
- Evaluate the deployment for bias
- Update the deployment with a better-performing model
- Monitor deployments and jobs across the organization
More resources
- ModelOps Wikipedia article
- IBM blog post on ModelOps, about using ModelOps to drive value from your AI investment.
- See how IBM is addressing ModelOps.
Parent topic: Deploying and managing assets