Changing the Content Assistant large language model for an object store
You can change the large language model (LLM) that the Content Assistant uses for an object store.
About this task
If you plan to use the IBM Content Assistant with an IBM watsonx.ai account that you own, you can configure the object store to use any of the LLMs supported by IBM watsonx.ai. For more information, see the topic Supported foundation models.
Note: If you are using a custom application, you can use the FileNet APIs to configure an optional parameter that overrides the LLM settings. For more information, see the topic Generating custom AI applications.
Procedure
- In the Administration Console for Content Platform Engine (ACCE), go to the object store where you want to use the Content Assistant.
- In the left pane, go to Search.
- Click New Object Store Search.
- In Simple View, specify the Class as Gen AI
Configuration and click Run. The search results display a single Gen AI Configuration object.
- From the search results, click the object ID to access the properties for the Gen AI Configuration object.
- In the Properties tab, specify the LLM in the Gen AI Default LLM Model property and click Save. If the property is not populated, the default LLM is used. For the list of available models, see IBM models.
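If you prefer to script this change rather than use ACCE, the same property can be updated through the FileNet Content Engine Java API. The following is a minimal sketch only: the connection URI, credentials, object store name, and model ID are placeholders, and the symbolic names GenAIConfiguration and GenAIDefaultLLMModel are assumptions that you should confirm against the class and property definitions in your object store before running it.

    import com.filenet.api.collection.IndependentObjectSet;
    import com.filenet.api.constants.RefreshMode;
    import com.filenet.api.core.Connection;
    import com.filenet.api.core.Domain;
    import com.filenet.api.core.Factory;
    import com.filenet.api.core.IndependentlyPersistableObject;
    import com.filenet.api.core.ObjectStore;
    import com.filenet.api.query.SearchSQL;
    import com.filenet.api.query.SearchScope;
    import com.filenet.api.util.UserContext;

    import javax.security.auth.Subject;
    import java.util.Iterator;

    public class SetDefaultLlm {
        public static void main(String[] args) {
            // Placeholder connection details; replace with your Content Platform Engine endpoint.
            String uri = "https://cpe.example.com:9443/wsi/FNCEWS40MTOM";
            Connection conn = Factory.Connection.getConnection(uri);
            Subject subject = UserContext.createSubject(conn, "ceadmin", "password", null);
            UserContext.get().pushSubject(subject);
            try {
                Domain domain = Factory.Domain.fetchInstance(conn, null, null);
                ObjectStore os = Factory.ObjectStore.fetchInstance(domain, "TARGETOS", null);

                // Find the Gen AI Configuration object in the object store.
                // "GenAIConfiguration" and "GenAIDefaultLLMModel" are assumed symbolic names;
                // verify them in ACCE under the object store's class definitions.
                SearchSQL sql = new SearchSQL("SELECT * FROM GenAIConfiguration");
                IndependentObjectSet results =
                        new SearchScope(os).fetchObjects(sql, null, null, Boolean.FALSE);

                Iterator<?> it = results.iterator();
                if (it.hasNext()) {
                    IndependentlyPersistableObject config = (IndependentlyPersistableObject) it.next();
                    // Set the default LLM; the model ID below is only an example value
                    // from watsonx.ai. Substitute the model that you want to use.
                    config.getProperties().putValue("GenAIDefaultLLMModel", "ibm/granite-13b-chat-v2");
                    config.save(RefreshMode.REFRESH);
                }
            } finally {
                UserContext.get().popSubject();
            }
        }
    }

Because the search result contains a single Gen AI Configuration object, the sketch updates only the first row it finds; saving with RefreshMode.REFRESH refreshes the cached property values on the object after the update.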