Configuring the Content Assistant service parameters for an object store
You can edit the configuration properties for the Content Assistant service for each object store where you want to use the service.
About this task
- Gen AI Default LLM Model - The property specifies the watsonx.ai large language model (LLM) that is used to answer user queries and generate summaries for your documents.
- Gen AI Embedding Model Name - The property specifies the watsonx.ai Embedding Model that is used to generate the document embeddings that are stored in your vector database.
- Gen AI Index Status - The property specifies whether the Content Assistant index for an object store is active or inactive.
- Gen AI Advanced Options - Set advanced Content Assistant options, such as the relevancy score threshold for document chunks that are used to answer a user question (see the sketch after this list).
- Gen AI LLM Query Prompt Template - Set the LLM query prompt template for your object store.
- Gen AI LLM Document Summary Prompt Template - Set the LLM document summary prompt template for your object store.
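The relevancy score threshold and the query prompt template work together in a typical retrieval-augmented flow: chunks that score below the threshold are discarded, and the remaining chunks are inserted into the prompt template along with the user question. The following Python sketch illustrates that flow. The threshold value, template text, and all class and function names are illustrative assumptions, not the Content Assistant implementation.

```python
# Minimal sketch (not the Content Assistant implementation) of how a relevancy
# score threshold and a query prompt template could be applied. All names and
# values are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Chunk:
    text: str
    score: float  # similarity score returned by the vector database (0.0-1.0)


# Hypothetical values standing in for the object store properties described above.
RELEVANCY_SCORE_THRESHOLD = 0.7          # stands in for Gen AI Advanced Options
QUERY_PROMPT_TEMPLATE = (                # stands in for Gen AI LLM Query Prompt Template
    "Answer the question using only the context below.\n"
    "Context:\n{context}\n\n"
    "Question: {question}\nAnswer:"
)


def build_query_prompt(question: str, retrieved: list[Chunk]) -> str:
    """Keep only chunks at or above the threshold, then fill the template."""
    relevant = [c for c in retrieved if c.score >= RELEVANCY_SCORE_THRESHOLD]
    context = "\n---\n".join(c.text for c in relevant)
    return QUERY_PROMPT_TEMPLATE.format(context=context, question=question)


# Example: the chunk scoring below the threshold is excluded from the prompt.
chunks = [
    Chunk("Invoices are retained for 7 years.", 0.82),
    Chunk("Unrelated onboarding checklist.", 0.41),
]
print(build_query_prompt("How long are invoices retained?", chunks))
```

Raising the threshold makes answers stricter (fewer, more relevant chunks reach the LLM); lowering it provides more context at the risk of including less relevant material.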
Note: Editing the Content Assistant service parameters is optional. In most environments, you do not need to change these parameters; the default values provide good performance. The other configuration properties are either read-only or can be set only after you consult IBM Support.