Deploying Decision Optimization models programmatically
With IBM Watson Machine Learning, you can deploy your Decision Optimization prescriptive model and its associated common data once, and then submit job requests to that deployment with only the related transactional data. You can create this deployment by using the Watson Machine Learning REST API, the Watson Machine Learning Python client, or the IBM Cloud Pak for Data Command Line Interface.
See REST API example (Decision Optimization) for a code example, and Python client examples (Decision Optimization) for a Python notebook example.
Overview
The steps to deploy and submit jobs for a Decision Optimization model are as follows. These steps are detailed in later sections.
- Authenticate and create a space. See REST API example.
- Deploy your model with common data. This deployment can be done from the user interface (see Deploying from the user interface) or by following the steps that are described in Model deployment. See also this REST API example.
- Create and monitor jobs to this deployed model.
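The steps above can be sketched as two request payloads: one to create the deployment, one to submit a job against it. This is an illustrative sketch only; the field names, the `WML_URL` and `token` variables, and the commented endpoint path are assumptions based on the general shape of the v4 API, so check the REST API example for the exact contract.

```python
def build_deployment_payload(model_id, space_id, size="S", nodes=1):
    """Assemble a deployment request body (illustrative field names)."""
    return {
        "space_id": space_id,
        "name": "do-deployment",
        "asset": {"id": model_id},          # the stored Decision Optimization model
        "batch": {},                        # Decision Optimization uses batch deployments
        "hardware_spec": {"name": size, "num_nodes": nodes},  # T-shirt size + node count
    }

def build_job_payload(deployment_id, space_id, input_data):
    """Assemble a job request body carrying only the transactional data."""
    return {
        "space_id": space_id,
        "deployment": {"id": deployment_id},
        "decision_optimization": {
            "input_data": input_data,               # inline tables for this job
            "output_data": [{"id": ".*\\.csv"}],    # collect all CSV outputs
        },
    }

# Example job submission (hypothetical endpoint and credentials):
# requests.post(f"{WML_URL}/ml/v4/deployment_jobs?version=2020-08-01",
#               headers={"Authorization": f"Bearer {token}"},
#               json=build_job_payload(dep_id, space_id, tables))
```

The split mirrors the deploy-once, submit-many pattern described above: the deployment payload carries the model and sizing, while each job payload carries only that job's data.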
The following flowchart shows how deployment fits into the overall Decision Optimization model lifecycle. After you develop a model by formulating, testing, and validating it, you can deploy it in Watson Machine Learning. To do so, you must first create a Watson Machine Learning instance. You can then upload the model, together with any data files that you want reused in every job. To deploy your model, you select the software specification, the maximum number of nodes, and the T-shirt size that you want to use. After your model is deployed, you can use it with data: first upload inline data or specify its location in a database or in storage, then submit jobs, poll for results, and delete unwanted jobs. Jobs run asynchronously.
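Because jobs run asynchronously, a client typically submits a job and then polls its state until it reaches a terminal value. A minimal polling sketch, assuming the caller supplies a `get_state` function (for example, a GET on the job's URL) and that the terminal state names include `completed` and `failed`:

```python
import time

def poll_job(get_state, interval=2.0, timeout=600.0):
    """Poll an asynchronous job until it reaches a terminal state.

    get_state: caller-supplied function returning the job's current
    state string (hypothetical states: queued, running, completed,
    failed, canceled).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_state()
        if state in ("completed", "failed", "canceled"):
            return state
        time.sleep(interval)  # wait before the next status request
    raise TimeoutError("job did not finish within the timeout")
```

Separating the transport (`get_state`) from the polling loop keeps the sketch independent of whether you use the REST API, the Python client, or the CLI to read job status.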
The T-shirt size refers to predefined deployment configurations: small, medium, large, and extra large.
| Name | Description | Resources |
|---|---|---|
| S | Small | 2 vCPU and 8 GB |
| M | Medium | 4 vCPU and 16 GB |
| L | Large | 8 vCPU and 32 GB |
| XL | Extra Large | 16 vCPU and 64 GB |
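The T-shirt sizes in the table can be captured in a small lookup, for example to pick the smallest configuration that meets a resource requirement. The helper below is purely illustrative (the size names and figures come from the table; the selection function is not part of any Watson Machine Learning API):

```python
# T-shirt sizes as defined in the table above.
T_SHIRT_SIZES = {
    "S":  {"vcpu": 2,  "memory_gb": 8},
    "M":  {"vcpu": 4,  "memory_gb": 16},
    "L":  {"vcpu": 8,  "memory_gb": 32},
    "XL": {"vcpu": 16, "memory_gb": 64},
}

def smallest_size(min_vcpu, min_memory_gb):
    """Return the smallest T-shirt size meeting both requirements."""
    # dicts preserve insertion order, so this scans S -> XL.
    for name, spec in T_SHIRT_SIZES.items():
        if spec["vcpu"] >= min_vcpu and spec["memory_gb"] >= min_memory_gb:
            return name
    raise ValueError("no T-shirt size satisfies the requirement")
```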