Deploying Decision Optimization models programmatically
You can deploy your Decision Optimization prescriptive model together with any associated master data. You can then submit job requests to this deployment, including only the related transactional data. You can deploy your model by using the watsonx.ai Runtime REST API, the watsonx.ai Runtime Python client, or the IBM Cloud Pak for Data as a Service Command Line Interface.
See the REST API example for a full code example. See the Python client examples for a link to a Python notebook that is available from the Resource hub.
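For orientation, the following minimal sketch shows one way to obtain an IBM Cloud IAM bearer token with Python and the `requests` library before calling the watsonx.ai Runtime REST API. The API key is a placeholder that you supply from your own account, and the resulting headers are reused in the later sketches; refer to the REST API example for the authoritative flow.

```python
import requests

API_KEY = "<your-ibm-cloud-api-key>"  # placeholder: your IBM Cloud API key

# Exchange the API key for an IAM bearer token (standard IBM Cloud IAM endpoint).
token_response = requests.post(
    "https://iam.cloud.ibm.com/identity/token",
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    data={
        "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
        "apikey": API_KEY,
    },
)
token_response.raise_for_status()
IAM_TOKEN = token_response.json()["access_token"]

# All subsequent watsonx.ai Runtime REST calls carry this authorization header.
HEADERS = {
    "Authorization": f"Bearer {IAM_TOKEN}",
    "Content-Type": "application/json",
}
```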
Overview
The steps to deploy and submit jobs for a Decision Optimization model are as follows. These steps are detailed in later sections.
- Create a watsonx.ai Runtime service.
- Create a deployment space by using the https://dataplatform.cloud.ibm.com user interface or the REST API.
- Deploy your model with any master data that can be reused by all jobs. This deployment can be done from the user interface (see Deploying from the user interface) or by following the steps that are described in Deploying a Decision Optimization model. See also this REST API example.
- Create and monitor jobs for this deployed model (a sketch of these last two steps follows this list).
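As an illustration of the last two steps, the sketch below outlines how a stored Decision Optimization model might be deployed as a batch deployment and how a job might then be submitted and polled through the watsonx.ai Runtime REST API with Python. The endpoint paths, version date, and payload fields reflect the v4 machine learning API as assumed here; check them against the REST API example and the current API reference. All IDs, names, and data values are placeholders.

```python
import time
import requests

BASE_URL = "https://us-south.ml.cloud.ibm.com"   # your watsonx.ai Runtime endpoint (placeholder)
VERSION = "2020-08-01"                           # API version date (check the API reference)
SPACE_ID = "<deployment-space-id>"               # placeholder
MODEL_ID = "<stored-model-id>"                   # placeholder: the Decision Optimization model asset
HEADERS = {"Authorization": "Bearer <iam-token>",  # see the authentication sketch above
           "Content-Type": "application/json"}

# Deploy the stored model with a chosen T-shirt size (see the table below).
deployment = requests.post(
    f"{BASE_URL}/ml/v4/deployments",
    params={"version": VERSION},
    headers=HEADERS,
    json={
        "name": "diet-problem-deployment",
        "space_id": SPACE_ID,
        "asset": {"id": MODEL_ID},
        "hardware_spec": {"name": "S", "num_nodes": 1},
        "batch": {},
    },
).json()
deployment_id = deployment["metadata"]["id"]

# Create a job that sends only the transactional data for this run.
job = requests.post(
    f"{BASE_URL}/ml/v4/deployment_jobs",
    params={"version": VERSION},
    headers=HEADERS,
    json={
        "space_id": SPACE_ID,
        "deployment": {"id": deployment_id},
        "decision_optimization": {
            "input_data": [
                {"id": "diet_food.csv", "fields": ["name", "cost"], "values": [["bread", 2.0]]}
            ],
            "output_data": [{"id": ".*\\.csv"}],
        },
    },
).json()
job_id = job["metadata"]["id"]

# Monitor the job until it reaches a terminal state.
while True:
    status = requests.get(
        f"{BASE_URL}/ml/v4/deployment_jobs/{job_id}",
        params={"version": VERSION, "space_id": SPACE_ID},
        headers=HEADERS,
    ).json()
    state = status["entity"]["decision_optimization"]["status"]["state"]
    if state in ("completed", "failed", "canceled"):
        break
    time.sleep(5)
print("Job finished with state:", state)
```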
When you deploy your model, you select a T-shirt size, that is, one of the following predefined hardware configurations: small, medium, large, and extra large.
| Definition | Name | Description |
|---|---|---|
| 2 vCPU and 8 GB | S | Small |
| 4 vCPU and 16 GB | M | Medium |
| 8 vCPU and 32 GB | L | Large |
| 16 vCPU and 64 GB | XL | Extra Large |
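If you create the deployment programmatically, the T-shirt size is typically selected by its name in the hardware specification of the deployment request. The fragment below is only an illustration that reuses the payload fields assumed in the earlier sketch; all IDs are placeholders.

```python
# Selecting the "M" (4 vCPU and 16 GB) configuration in the deployment request body.
deployment_payload = {
    "name": "diet-problem-deployment",
    "space_id": "<deployment-space-id>",   # placeholder
    "asset": {"id": "<stored-model-id>"},  # placeholder
    "hardware_spec": {"name": "M", "num_nodes": 1},
    "batch": {},
}
```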