---

copyright:
  years: [{CURRENT_YEAR}]
lastupdated: "[{LAST_UPDATED_DATE}]"

subcollection: watson-assistant

---

Configuring Base LLM

The Base large language model (LLM) section on the Generative AI page helps you configure large language models for your assistants. The LLMs in AI assistant builder enable your customers to interact with your assistants seamlessly, without any custom-built conversational steps. You can enable Base LLM features for existing actions in your assistants to improve their conversational capability.

Figure: Base LLM configuration

You can perform the following actions in the Base LLM configuration:

Selecting a large language model for your assistant

To select the LLM that suits your enterprise ecosystem, complete the following steps:

  1. Go to Home > Generative AI.

  2. In the Base large language model (LLM) section, select the large language model from the Select a model dropdown. For more information about models, see Supported foundation models for different components.

Adding prompt instructions

You can instruct the LLM in your assistant to give refined responses by adding prompt instructions. Prompt instructions guide the LLM to conduct conversations with clarity and specificity so that an action achieves its end goal. Complete the following steps to add a prompt instruction:

  1. Go to Home > Generative AI.

  2. In the Add prompt instructions section, click Add instructions to see the Prompt instruction field.

  3. Enter the prompt instructions in the Prompt instruction field.

    You can enter a maximum of 1,000 characters in the Prompt instruction field.
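
For example, a prompt instruction for a customer-care assistant might look like the following. This text is illustrative only; write instructions that match your own use case.

```
Answer in a friendly and concise tone. If the customer asks about
billing, collect the account number before you answer. Do not provide
legal or medical advice.
```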

Selecting the answering behavior of your assistant

You can configure the answering behavior of your assistant to provide responses that are based on the preloaded content or general content. The answering behaviors that you can configure are:

  • General-purpose answering

    In general-purpose answering, the LLM responds to customer queries about general topics.

    Important: If you are in the agentic experience of IBM watsonx Orchestrate, general-purpose answering might not be available.
  • Conversational search

    To use conversational search, you must configure a search integration and enable conversational search.

    Toggling off Conversational search disables the process that calls it in the routing priority path; it does not disable the search capability itself.

The following Conversational search behavior applies:

  • If you enable both General-purpose answering and Conversational search, Conversational search takes precedence over General-purpose answering.

  • If the response from conversational search has a low search confidence score, the assistant falls back to General-purpose answering.

  • The LLM uses content that is preloaded during the search integration to respond to customer queries.
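
The precedence rules above can be sketched as follows. This is an illustrative model only, not the product's actual routing code; the function name and the `confidence_threshold` parameter are assumptions for the sketch.

```python
def choose_answering_behavior(
    general_purpose_enabled: bool,
    conversational_search_enabled: bool,
    search_confidence: float,
    confidence_threshold: float = 0.5,  # assumed value, for illustration only
) -> str:
    """Illustrative sketch of the documented routing precedence.

    Conversational search takes precedence when it is enabled and the
    search confidence is high enough; otherwise the assistant falls
    back to general-purpose answering, if that behavior is enabled.
    """
    if conversational_search_enabled and search_confidence >= confidence_threshold:
        return "conversational_search"
    if general_purpose_enabled:
        return "general_purpose_answering"
    return "no_generative_answer"


# Both behaviors enabled with high confidence: conversational search wins.
print(choose_answering_behavior(True, True, 0.9))  # conversational_search
# Low search confidence: falls back to general-purpose answering.
print(choose_answering_behavior(True, True, 0.2))  # general_purpose_answering
```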