Tool calling
Overview
Tool calling enables a chat model to respond to a prompt by invoking an external tool. With watsonx.ai, you can use the API or SDKs with supported chat models and pass a list of tools; the model then suggests which tool to call to answer your question.
Here are the foundation models that you can use with the chat API for tool calling:
ibm/granite-3-8b-instruct
meta-llama/llama-3-3-70b-instruct
meta-llama/llama-3-2-1b-instruct
meta-llama/llama-3-2-3b-instruct
meta-llama/llama-3-2-11b-vision-instruct
meta-llama/llama-3-2-90b-vision-instruct
meta-llama/llama-guard-3-11b-vision-instruct
mistralai/mistral-large
Support for tool calling will be added to more models over time.
Example
You can generate text by sending a structured list of messages, together with a list of tools.
The following example uses the mistralai/mistral-large foundation model, which works well for text generation and also supports tool calling. The model is given a single tool called add and a message with the question “What is 2 plus 4?”. The model responds with a new message that identifies the tool to call and the arguments to pass to it.
Replace {token}, {watsonx_ai_url}, and {project_id} with your information.
curl -X POST \
-H "Authorization: Bearer {token}" \
-H "Content-Type: application/json" \
"{watsonx_ai_url}/ml/v1/text/chat?version=2024-05-31" \
--data-raw '{
  "model_id": "mistralai/mistral-large",
  "project_id": "{project_id}",
  "messages": [{
    "role": "user",
    "content": [{
      "type": "text",
      "text": "What is 2 plus 4?"
    }]
  }],
  "tools": [{
    "type": "function",
    "function": {
      "name": "add",
      "description": "Adds the values a and b to get a sum.",
      "parameters": {
        "type": "object",
        "properties": {
          "a": {
            "description": "A number value",
            "type": "number"
          },
          "b": {
            "description": "A number value",
            "type": "number"
          }
        },
        "required": [
          "a",
          "b"
        ]
      }
    }
  }],
  "tool_choice_option": "auto",
  "max_tokens": 300,
  "time_limit": 1000
}'
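If you prefer to make the same call from code rather than curl, the following is a minimal Python sketch that posts the identical payload to the chat endpoint with the requests library. The token, watsonx_ai_url, and project_id placeholders are the same values you would substitute into the curl command; the watsonx.ai SDKs offer a higher-level interface, but this sketch sticks to the plain REST call shown above.

import requests

# Replace these placeholders with your own values, as in the curl example.
token = "{token}"
watsonx_ai_url = "{watsonx_ai_url}"
project_id = "{project_id}"

payload = {
    "model_id": "mistralai/mistral-large",
    "project_id": project_id,
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "What is 2 plus 4?"}]}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "add",
                "description": "Adds the values a and b to get a sum.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "a": {"description": "A number value", "type": "number"},
                        "b": {"description": "A number value", "type": "number"},
                    },
                    "required": ["a", "b"],
                },
            },
        }
    ],
    "tool_choice_option": "auto",
    "max_tokens": 300,
    "time_limit": 1000,
}

# Same endpoint and version parameter as the curl command above.
response = requests.post(
    f"{watsonx_ai_url}/ml/v1/text/chat",
    params={"version": "2024-05-31"},
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json=payload,
)
print(response.json())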
For more information and examples, see the API reference.
The response of the model will look something like the following JSON:
{
  "id": "chat-a00942a130e84f83bc0090c38c2f419f",
  "model_id": "mistralai/mistral-large",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "tool_calls": [
          {
            "id": "chatcmpl-tool-77cbe4e94d88489383a0c6ed1b644674",
            "type": "function",
            "function": {
              "name": "add",
              "arguments": "{\"a\": 2, \"b\": 4}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ]
}
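To act on this response, your application extracts the tool name and arguments from choices[0].message.tool_calls, runs the matching function, and keeps the tool call id for the follow-up message. The following is a minimal Python sketch; the local add function and the available_tools lookup table are illustrative assumptions, not part of the API.

import json

# The relevant part of the chat response shown above, reduced to the tool call.
response_json = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "tool_calls": [
                    {
                        "id": "chatcmpl-tool-77cbe4e94d88489383a0c6ed1b644674",
                        "type": "function",
                        "function": {"name": "add", "arguments": '{"a": 2, "b": 4}'},
                    }
                ],
            },
            "finish_reason": "tool_calls",
        }
    ]
}

# Local implementations of the tools that were offered to the model.
def add(a, b):
    return a + b

available_tools = {"add": add}

tool_call = response_json["choices"][0]["message"]["tool_calls"][0]
arguments = json.loads(tool_call["function"]["arguments"])  # '{"a": 2, "b": 4}' -> {"a": 2, "b": 4}
result = available_tools[tool_call["function"]["name"]](**arguments)
print(tool_call["id"], result)  # the id is needed for the follow-up tool message; the result is 6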
You would then call a function named add with the arguments {a: 2, b: 4}. The function returns the sum of the two numbers, which is 6. This value must be passed back to the model as part of the next request, together with the tool call identifier. The messages array should look something like this:
[
  {
    "role": "user",
    "content": [
      {
        "type": "text",
        "text": "What is 2 plus 4?"
      }
    ]
  },
  {
    "role": "assistant",
    "tool_calls": [
      {
        "id": "chatcmpl-tool-77cbe4e94d88489383a0c6ed1b644674",
        "type": "function",
        "function": {
          "name": "add",
          "arguments": "{\"a\": 2, \"b\": 4}"
        }
      }
    ]
  },
  {
    "role": "tool",
    "tool_call_id": "chatcmpl-tool-77cbe4e94d88489383a0c6ed1b644674",
    "content": [
      {
        "type": "text",
        "text": "6"
      }
    ]
  }
]
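To complete the exchange, you send this messages array back to the same chat endpoint. Below is a minimal Python sketch using the requests library; the placeholders are the same as in the curl example, and whether you pass the tools list again is up to your use case (include it if the model might need to make further tool calls).

import requests

token = "{token}"                    # replace with your bearer token
watsonx_ai_url = "{watsonx_ai_url}"  # replace with your watsonx.ai URL
project_id = "{project_id}"          # replace with your project ID

# The messages array shown above: the original question, the model's tool call,
# and the tool result, linked by the tool call id.
messages = [
    {"role": "user", "content": [{"type": "text", "text": "What is 2 plus 4?"}]},
    {
        "role": "assistant",
        "tool_calls": [
            {
                "id": "chatcmpl-tool-77cbe4e94d88489383a0c6ed1b644674",
                "type": "function",
                "function": {"name": "add", "arguments": '{"a": 2, "b": 4}'},
            }
        ],
    },
    {
        "role": "tool",
        "tool_call_id": "chatcmpl-tool-77cbe4e94d88489383a0c6ed1b644674",
        "content": [{"type": "text", "text": "6"}],
    },
]

follow_up = requests.post(
    f"{watsonx_ai_url}/ml/v1/text/chat",
    params={"version": "2024-05-31"},
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={
        "model_id": "mistralai/mistral-large",
        "project_id": project_id,
        "messages": messages,
        # Optionally pass the "tools" list again here so the model can make further tool calls.
        "max_tokens": 300,
        "time_limit": 1000,
    },
)
print(follow_up.json()["choices"][0]["message"])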
The model then responds with a natural language answer that incorporates the result of the tool call.