Requirements for using custom components with ML models
You can define your own transformers, estimators, functions, classes, and tensor operations, and use them in models that you deploy in IBM watsonx.ai Runtime as online deployments.
Defining and using custom components
To use custom components with your models, you need to package your custom components in a Python distribution package.
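For example, a custom component might be a scikit-learn transformer that you define in your own module. The following is a minimal, illustrative sketch; the package, module, and class names are hypothetical:

```python
# my_custom_package/transformers.py  (hypothetical module in your package)
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin

class ScaleByFactor(BaseEstimator, TransformerMixin):
    """Multiply every feature by a constant factor."""

    def __init__(self, factor=2.0):
        self.factor = factor

    def fit(self, X, y=None):
        # Nothing to learn; returning self lets the transformer be used in a Pipeline.
        return self

    def transform(self, X):
        return np.asarray(X) * self.factor
```

A model that uses this transformer (for example, inside a scikit-learn Pipeline) can be deployed only if the package that defines it is stored with the model, as described in the following sections.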
Package requirements
- The package type must be a source distribution (distributions of type Wheel and Egg are not supported)
- The package file format must be: .zip
- Any third-party dependencies for your custom components must be installable by pip and must be passed to the install_requires argument of the setup function of the setuptools library, as shown in the example after this list.
Refer to: Creating a source distribution
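For example, a minimal setup.py for such a package might look like the following sketch; the package name, version, and dependency list are illustrative:

```python
# setup.py (illustrative values)
from setuptools import setup, find_packages

setup(
    name="my_custom_package",
    version="0.1.0",
    packages=find_packages(),
    # Third-party dependencies of your custom components must be pip-installable
    # and declared here so that they are installed together with the package.
    install_requires=[
        "numpy",
        "scikit-learn",
    ],
)
```

Running python setup.py sdist --formats=zip then produces a .zip source distribution in the dist/ directory that you can store as described in the next section.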
Storing your custom package
You must take extra steps when you store your trained model in the watsonx.ai Runtime repository:
- Store your custom package in the watsonx.ai Runtime repository (use the runtimes.store_library function from the watsonx.ai Python client, or the store libraries watsonx.ai Runtime CLI command.)
- Create a runtime resource object that references your stored custom package, and then store the runtime resource object in the watsonx.ai Runtime repository (use the runtimes.store function, or the store runtimes command.)
- When you store your trained model in the watsonx.ai Runtime repository, reference your stored runtime resource in the metadata that is passed to the store_model function (or the store command), as shown in the sketch after this list.
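The following sketch shows these three steps with the watsonx.ai Python client. Only the runtimes.store_library, runtimes.store, and store_model calls come from the steps above; the metadata property names, values, and the way the client is constructed are assumptions that can differ between client versions:

```python
# Sketch of the three storage steps; metadata property names are assumptions.
client = ...  # an authenticated watsonx.ai Python client instance

# 1. Store the custom package (the .zip source distribution) as a library.
library_details = client.runtimes.store_library({
    client.runtimes.LibraryMetaNames.NAME: "my_custom_package",
    client.runtimes.LibraryMetaNames.VERSION: "0.1.0",
    client.runtimes.LibraryMetaNames.FILEPATH: "dist/my_custom_package-0.1.0.zip",
    client.runtimes.LibraryMetaNames.PLATFORM: {"name": "python", "versions": ["3.10"]},
})
library_uid = ...  # extract the library UID from library_details (client-version specific)

# 2. Store a runtime resource object that references the stored library.
runtime_details = client.runtimes.store({
    client.runtimes.ConfigurationMetaNames.NAME: "runtime_with_my_custom_package",
    client.runtimes.ConfigurationMetaNames.PLATFORM: {"name": "python", "version": "3.10"},
    client.runtimes.ConfigurationMetaNames.LIBRARIES_UIDS: [library_uid],
})
runtime_uid = ...  # extract the runtime UID from runtime_details (client-version specific)

# 3. Reference the runtime resource in the metadata when storing the trained model.
model_details = client.repository.store_model(
    model=trained_pipeline,  # for example, a scikit-learn Pipeline that uses ScaleByFactor
    meta_props={
        client.repository.ModelMetaNames.NAME: "model_with_custom_components",
        client.repository.ModelMetaNames.RUNTIME_UID: runtime_uid,
    },
)
```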
Supported frameworks
These frameworks support custom components:
- Scikit-learn
- XGBoost
- TensorFlow
- Python Functions
- Python Scripts
- Decision Optimization
For more information, see Supported frameworks.