Building agent-driven workflows with the chat API

Use the watsonx.ai chat API with foundation models that support tool calling to build agent-driven applications.

Ways to develop

You can build agent-driven workflows by using these programming methods:

  • REST API
  • Python

Overview

Agentic applications allow a foundation model to function as an agent that controls the flow of interaction with the user. You define the parameters of the interaction, including the tools that the foundation model can use, but you allow the foundation model to decide the next best step based on the current state of the interaction. Tool calling is sometimes referred to as function calling.
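
At its core, an agent-driven workflow is a loop: submit the conversation and the available tools to the chat endpoint, run whatever tool the model requests, append the tool result to the conversation, and repeat until the model returns a plain text answer. The following Python sketch outlines that loop against the /ml/v1/text/chat endpoint that is shown later in this topic. It is a minimal illustration, not a complete client: the cluster URL, bearer token, project ID, and the run_tool dispatcher are placeholders that you supply, and the tool-result message (role tool with tool_call_id) follows the OpenAI-style schema that the example responses suggest, so confirm the exact shape in the API reference.

import json

import requests

# Placeholders: supply your own cluster URL, bearer token, and project ID.
CHAT_URL = "https://{cluster_url}/ml/v1/text/chat?version=2024-10-08"
HEADERS = {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "Authorization": "Bearer eyJraWQiOi...",
}

def run_tool(name, arguments):
    """Hypothetical dispatcher that runs a tool locally and returns its result."""
    raise NotImplementedError

def agent_loop(messages, tools, model_id, project_id, max_turns=5):
    """Let the foundation model decide the next step until it answers in plain text."""
    for _ in range(max_turns):
        body = {
            "model_id": model_id,
            "project_id": project_id,
            "messages": messages,
            "tools": tools,
            "tool_choice_option": "auto",
            "max_tokens": 300,
        }
        response = requests.post(CHAT_URL, headers=HEADERS, json=body)
        message = response.json()["choices"][0]["message"]
        tool_calls = message.get("tool_calls")
        if not tool_calls:
            return message.get("content")  # plain answer: the agent is done
        messages.append(message)  # keep the model's tool request in the history
        for call in tool_calls:
            result = run_tool(call["function"]["name"],
                              json.loads(call["function"]["arguments"]))
            messages.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": str(result),
            })
    return None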

Supported foundation models

When you build an agentic workflow, choose a foundation model that meets the following requirements:

  • Handles chat tasks
  • Supports tool calling
  • Can choose the next action

Before you can use the API, your administrator must install a foundation model that supports function_calling tasks.

To programmatically get a list of foundation models that support tool calling from the chat API, specify the filters=task_function_calling parameter when you call the List the available foundation models method.

curl -X GET \
  'https://{cluster_url}/ml/v1/foundation_model_specs?version=2024-10-10&filters=task_function_calling'
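
If you want to work with the list programmatically, you can extract the model IDs from that response. The following sketch assumes that the response lists matching models in a resources array where each entry has a model_id field; the cluster URL and bearer token are placeholders.

import requests

# Placeholders: supply your own cluster URL and bearer token.
url = "https://{cluster_url}/ml/v1/foundation_model_specs"
params = {"version": "2024-10-10", "filters": "task_function_calling"}
headers = {"Authorization": "Bearer eyJraWQiOi..."}

specs = requests.get(url, params=params, headers=headers).json()

# Assumption: each entry in "resources" carries a "model_id" field.
for spec in specs.get("resources", []):
    print(spec["model_id"])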

REST API

You can specify how foundation models choose a tool to generate a response in your API request as follows:

Use a specific tool
You can specify a tool that the foundation model must use to generate a response with the tool_choice parameter in your API request. See the example that manually specifies a tool.

Automatically select a tool
The foundation model can decide which tool to call from the list of provided tools when you specify the tool_choice_option parameter in your API request. Different foundation models handle tool calling in different ways. See the examples with mistral-large and granite-3-2b-instruct for some of the differences.

For details, see the watsonx.ai API reference documentation.

Example tool-calling request that manually specifies a tool

The following example defines a tool that retrieves the current weather. The example submits user input to the foundation model and instructs the model to use only the pre-defined tool to answer the question.

curl -X POST \
  'https://{cluster_url}/ml/v1/text/chat?version=2024-10-08' \
  --header 'Accept: application/json' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer eyJraWQiOi...' \
  --data '{
      "model_id": "mistralai/mistral-large",
      "project_id": "4947c695-a374-428c-acca-332c1a1dc9e9",
      "messages": [
        {
          "role": "user",
          "content": [
            {
              "type": "text",
              "text": "What is the weather like in Boston today?"
            }
          ]
        }
      ],
      "tools": [
        {
          "type": "function",
          "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location.",
            "parameters": {
              "type": "object",
              "properties": {
                "location": {
                  "type": "string"
                  "description": "The city and state, for example, San Francisco, CA",
                },
                "unit": {
                  "type": "string"
                  "enum": [
                    "celsius",
                    "fahrenheit"
                  ]
                }
              },
              "required": [
                "location"
              ]
            }
          }
        }
      ],
      "tool_choice": {
        "type:": "function",
        "function": {
          "name": "get_current_weather"
        }
      },
      "max_tokens": 300,
      "time_limit": 1000
      }'
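
Because the request forces the get_current_weather tool, the response contains a tool call rather than a text answer. The following sketch shows one way to read the returned arguments and run a local implementation of the tool; the get_current_weather function and the handle_weather_call helper are stand-ins for your own code, not part of the API.

import json

def get_current_weather(location, unit="fahrenheit"):
    """Stand-in implementation; replace with a real weather lookup."""
    return {"location": location, "temperature": 72, "unit": unit}

def handle_weather_call(chat_response):
    """Read the forced tool call out of the chat response and run it locally."""
    call = chat_response["choices"][0]["message"]["tool_calls"][0]
    arguments = json.loads(call["function"]["arguments"])  # for example, {"location": "Boston, MA"}
    if call["function"]["name"] == "get_current_weather":
        return get_current_weather(**arguments)
    return None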

Example tool-calling request with mistral-large

The following example defines two tools, a function for addition and a function for multiplication. The example submits user input to the foundation model and lets the model pick which tool to use to answer the question.

curl -X POST \
  'https://{cluster_url}/ml/v1/text/chat?version=2024-10-08' \
  --header 'Accept: application/json' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer eyJraWQiOi...' \
  --data '{
      "model_id": "mistralai/mistral-large",
      "project_id": "4947c695-a374-428c-acca-332c1a1dc9e9",
      "messages": [
        {
          "role": "user",
          "content": [
            {
              "type": "text",
              "text": "What is 2 plus 4?"
            }
          ]
        }
      ],
      "tools": [
        {
          "type": "function",
          "function": {
            "name": "add",
            "description": "Adds the values a and b to get a sum.",
            "parameters": {
              "type": "object",
              "properties": {
                "a": {
                  "description": "A number value",
                  "type": "float"
                },
                "b": {
                  "description": "A number value",
                  "type": "float"
                }
              },
              "required": [
                "a",
                "b"
              ]
            }
          }
        },
        {
          "type": "function",
          "function": {
          "name": "multiply",
            "description": "Multiplies the values a and b.",
            "parameters": {
              "type": "object",
              "properties": {
                "a": {
                  "description": "A number value",
                  "type": "float"
                },
                "b": {
                  "description": "A number value",
                  "type": "float"
                }
              },
              "required": [
                "a",
                "b"
              ]
            }
          }
        }
      ],
      "tool_choice_option": "auto",
      "max_tokens": 300,
      "time_limit": 1000
      }'

The sample output shows that the model, mistral-large in this case, chooses the correct tool for the task: the add function.

{
  "id": "chatcmpl-2f47da4026950db321698cb733b25e89",
  "model_id": "mistralai/mistral-large",
  "model": "mistralai/mistral-large",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "tool_calls": [
          {
            "id": "H6KoCbaZV",
            "type": "function",
            "function": {
              "name": "add",
              "arguments": "{\"a\": 2, \"b\": 4}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ],
  "created": 1739311926,
  "model_version": "2.0.0",
  "created_at": "2025-02-11T22:12:07.243Z",
  "usage": {
    "completion_tokens": 25,
    "prompt_tokens": 189,
    "total_tokens": 214
  },
  "system": {
    "warnings": [
      {
        "message": "This model is a Non-IBM Product governed by a third-party license that may impose use restrictions and other obligations. By using this model you agree to its terms as identified in the following URL.",
        "id": "disclaimer_warning",
        "more_info": "https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/fm-models.html?context=wx"
      }
    ]
  }
}
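
To finish the exchange, your application runs the requested add function and sends the result back to the model in a follow-up chat request. The following sketch extends the conversation with the assistant's tool call and a tool-result message; the answer_tool_call helper is illustrative, and the role tool and tool_call_id fields follow the OpenAI-style message schema, so confirm the exact shape in the watsonx.ai API reference.

import json

def add(a, b):
    """Local implementation of the add tool."""
    return a + b

def answer_tool_call(messages, assistant_message):
    """Run the requested tool and extend the conversation for the next chat request."""
    call = assistant_message["tool_calls"][0]
    result = add(**json.loads(call["function"]["arguments"]))  # 2 + 4 -> 6
    return messages + [
        assistant_message,  # keep the model's tool request in the history
        {
            "role": "tool",
            "tool_call_id": call["id"],
            "content": str(result),
        },
    ]

# Resubmit the returned list as "messages" in another POST to /ml/v1/text/chat
# to get the final natural-language answer.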

Example tool-calling request with granite-3-2b-instruct

The following request defines two tools, a function for addition and a function for multiplication, and asks the granite-3-2b-instruct foundation model which tool to use.

The Granite models can call tools better when you provide a system prompt with the request. The following system prompt is used:

You are Granite, developed by IBM. You are a helpful AI assistant with access to the following tools. When a tool is required to answer the user's query, respond with <|tool_call|> followed by a JSON list of tools used. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request.

curl -X POST \
  'https://{cluster_url}/ml/v1/text/chat?version=2024-10-08' \
  --header 'Accept: application/json' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer eyJraWQiOi...'

The body of the request contains the following JSON snippet:

{
  "model_id": "ibm/granite-3-2b-instruct",
  "project_id": "4947c695-a374-428c-acca-332c1a1dc9e9",
  "messages": [
    {
      "role":"system",
      "content":[
        {
          "type":"text",
          "text":"You are Granite, developed by IBM. You are a helpful AI assistant with access to the following tools. When a tool is required to answer the user's query, respond with <|tool_call|> followed by a JSON list of tools used. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request."
      }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "What is 2 times 4?"
        }
      ]
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "add",
        "description": "Adds the values a and b to get a sum.",
        "parameters": {
          "type": "object",
          "properties": {
            "a": {
              "description": "A number value",
              "type": "float"
            },
            "b": {
              "description": "A number value",
               "type": "float"
            }
          },
          "required": [
            "a",
            "b"
          ]
        }
      }
    },
    {
      "type": "function",
      "function": {
        "name": "multiply",
        "description": "Multiplies the values a and b.",
        "parameters": {
          "type": "object",
          "properties": {
            "a": {
              "description": "A number value",
              "type": "float"
            },
            "b": {
              "description": "A number value",
               "type": "float"
            }
          },
          "required": [
            "a",
            "b"
          ]
        }
      }
    }
  ],
  "tool_choice_option": "auto",
  "max_tokens": 300,
  "time_limit": 10000
}

The granite-3-2b-instruct foundation model is able to choose the correct tool to answer the query.

The response is as follows:

{
  "id": "chatcmpl-ac80d63a85209b48592435687086e1c2",
  "model_id": "ibm/granite-3-2b-instruct",
  "model": "ibm/granite-3-2b-instruct",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "tool_calls": [
          {
            "id": "chatcmpl-tool-3297286588a24459b0042943cae55629",
            "type": "function",
            "function": {
              "name": "multiply",
              "arguments": "{\"a\": 2, \"b\": 4}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ],
  "created": 1739311794,
  "model_version": "1.1.0",
  "created_at": "2025-02-11T22:09:55.048Z",
  "usage": {
    "completion_tokens": 28,
    "prompt_tokens": 348,
    "total_tokens": 376
  }
}

Python

See the ModelInference class of the watsonx.ai Python library.
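
As a starting point, the following sketch shows how the auto tool-choice request from the earlier example might look with the ModelInference class. The constructor arguments and the chat parameter names mirror the REST API fields used above but can differ by library version, so treat this as an outline and confirm the signatures in the ModelInference documentation.

from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

# Placeholders: supply your own cluster URL, bearer token, and project ID.
credentials = Credentials(url="https://{cluster_url}", token="eyJraWQiOi...")

model = ModelInference(
    model_id="mistralai/mistral-large",
    credentials=credentials,
    project_id="4947c695-a374-428c-acca-332c1a1dc9e9",
)

messages = [{"role": "user", "content": "What is 2 plus 4?"}]
tools = [
    {
        "type": "function",
        "function": {
            "name": "add",
            "description": "Adds the values a and b to get a sum.",
            "parameters": {
                "type": "object",
                "properties": {
                    "a": {"description": "A number value", "type": "float"},
                    "b": {"description": "A number value", "type": "float"},
                },
                "required": ["a", "b"],
            },
        },
    },
]

# Assumption: chat() accepts tools and tool_choice_option keyword arguments
# that map to the REST API fields of the same names.
response = model.chat(messages=messages, tools=tools, tool_choice_option="auto")
print(response["choices"][0]["message"].get("tool_calls"))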

To get started, see the following sample notebooks:

Learn more