
Build and deploy your first agent using watsonx.ai

This guide will walk you through building and deploying an AI agent using watsonx.ai. The example project is a research agent that can search the web to summarize research papers using external tools.

Video: Use Agent Templates in watsonx.ai to kickstart your agent development (4:33 min)

1. Visit Developer Access page

To begin using the CLI, you will need three values: a project or space ID, an endpoint URL for your region, and an API key. You can visit the Developer Access page to get these values. Make sure to also follow the link on the Developer Access page to get your IBM Cloud API key separately.

2. Start building

Prerequisites

  • Python 3.11
  • An IBM Cloud account with watsonx.ai access

1: Install the CLI

Create a new directory on your machine and install the CLI by running:

pip install ibm-watsonx-ai-cli

2: Set up a template

Run the command below to pick one of the supported agent templates:

watsonx-ai template new

Then select one of the available templates, for example community/langgraph-arxiv-research.

3: Set the environment variables

Once the template is set up, move into the newly created directory and copy the example configuration file to config.toml:

cd langgraph-arxiv-research
cp config.toml.example config.toml

In this file, you need to add values for:

  • deployment.watsonx_apikey
  • deployment.watsonx_url
  • deployment.space_id
  • deployment.online.parameters.space_id
  • deployment.online.parameters.url

Go to the Developer Access page to find your environment variables.
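
For reference, the sketch below shows one way these keys could look in config.toml. The values are placeholders (the URL shown is the Dallas region endpoint), and the exact layout may vary between templates, so keep the structure from config.toml.example and only fill in your own values.

# Illustrative sketch of config.toml; all values are placeholders.
[deployment]
watsonx_apikey = "<your IBM Cloud API key>"
watsonx_url = "https://us-south.ml.cloud.ibm.com"   # endpoint for your region
space_id = "<your deployment space id>"

[deployment.online.parameters]
url = "https://us-south.ml.cloud.ibm.com"
space_id = "<your deployment space id>"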

Everything is now set up to run the agent locally in the next step.

4: Run the agent locally

You can run the agent locally and interact with it from the terminal:

watsonx-ai template invoke "show me the latest arxiv papers on the model context protocol"

This should return a list of arXiv papers regarding the Model Context Protocol.
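
Under the hood, this template wires a watsonx.ai chat model to an arXiv search tool with LangGraph. The snippet below is not the template's actual code; it is a rough sketch of that pattern, and the model ID, credential names, and tool choice are illustrative assumptions.

# Rough sketch of a LangGraph ReAct agent with an arXiv search tool.
# Not the template's code: the model ID, credentials, and tool are illustrative.
import os

from langchain_community.tools import ArxivQueryRun   # requires the "arxiv" package
from langchain_ibm import ChatWatsonx
from langgraph.prebuilt import create_react_agent

llm = ChatWatsonx(
    model_id="ibm/granite-3-8b-instruct",              # placeholder model choice
    url=os.environ["WATSONX_URL"],
    apikey=os.environ["WATSONX_APIKEY"],
    project_id=os.environ["WATSONX_PROJECT_ID"],       # or space_id=...
)

agent = create_react_agent(llm, tools=[ArxivQueryRun()])

result = agent.invoke(
    {"messages": [("user", "show me the latest arxiv papers on the model context protocol")]}
)
print(result["messages"][-1].content)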

5: Deploy to IBM Cloud

Deploy the agent by running:

watsonx-ai service new

You can test your deployment using the command:

watsonx-ai service invoke 'Hello, how can you help me?'

Alternatively, follow the link to view the deployed agent in the dashboard.
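
You can also call the deployed agent programmatically. The sketch below is only an illustration under stated assumptions: it assumes the deployment exposes an AI-service scoring endpoint of the form <watsonx_url>/ml/v4/deployments/<deployment_id>/ai_service and accepts a chat-style messages payload. Check the exact endpoint URL and request schema shown on your deployment's details page.

# Hedged sketch: call a deployed agent over REST.
# The endpoint path, version date, and payload schema are assumptions;
# use the values shown on your deployment's details page.
import os
import requests

WATSONX_URL = os.environ["WATSONX_URL"]        # e.g. https://us-south.ml.cloud.ibm.com
DEPLOYMENT_ID = os.environ["DEPLOYMENT_ID"]    # from the deployment details page
API_KEY = os.environ["WATSONX_APIKEY"]

# Exchange the IBM Cloud API key for an IAM bearer token.
token = requests.post(
    "https://iam.cloud.ibm.com/identity/token",
    data={"grant_type": "urn:ibm:params:oauth:grant-type:apikey", "apikey": API_KEY},
).json()["access_token"]

response = requests.post(
    f"{WATSONX_URL}/ml/v4/deployments/{DEPLOYMENT_ID}/ai_service",
    params={"version": "2021-05-01"},
    headers={"Authorization": f"Bearer {token}"},
    json={"messages": [{"role": "user", "content": "Hello, how can you help me?"}]},
)
print(response.json())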

Troubleshooting

Issue: IBM Cloud API errors
Solution: Double-check the credentials in config.toml

3. Preview the agent in the dashboard

Once an agent is deployed to watsonx.ai, you can find the API endpoint to connect to it from a remote location, or interact with it directly using the built-in chat interface. To use the agent via the chat interface, follow these steps:

1. Find your newly deployed agent

In the watsonx.ai dashboard, go to “Deployments” and open the space you just deployed the agent to.

2. Open the space

Clicking on the space (in this case “agent test”) shows the list of deployments. After running the deployment command from the previous section, you should see a new deployment in this list.

List of deployed agents on watsonx.ai

Note: Make sure the deployment has the tag wx-agent, as this is needed to interact with it via the “Preview” tab.

3. Find the preview tab

After clicking on the latest deployment, you can see the private and public API endpoints of the agent. You will also see a “Preview” tab, which you need to open to test the agent in the chat interface.

Deployed agent on watsonx.ai

4. Ask the agent

In the chat interface you can ask the same questions as you did when running the agent locally, for example: “show me a list of arXiv papers about the model context protocol”. This should (again) return a list of papers published on arXiv that mention the Model Context Protocol (MCP). You can also see which tools the agent called to retrieve this information.

Preview a deployed agent on watsonx.ai

You can ask follow-up questions, for example, to summarize any of the papers.

Next Steps