In this tutorial, you’ll build a simple Model Context Protocol (MCP) server that exposes a single tool for searching IBM tutorials. By using the fastmcp framework and the requests library, the script downloads a JSON index of tutorials from a remote URL. It then searches for matches to a user’s query and returns a neatly formatted list of results. You’ll also add error handling for network issues, bad JSON and unexpected problems, making the tool robust and beginner-friendly. Finally, you’ll run the MCP server so it can be connected to and tested with a client like Cursor.
Enterprise and startup developers alike are increasingly building solutions driven by generative artificial intelligence (AI). To make these solutions more useful, they need up-to-date information and context. For that to happen, machine learning models need to interoperate with tools, application programming interfaces (APIs), software development kits (SDKs) and front-end systems.
The MCP standardizes how context is passed between AI models and systems. It simplifies coordination between a large language model (LLM) and external data sources and tools. A common analogy is to think of MCP as a USB-C port for an LLM. This access makes an LLM much more useful because the model gains capabilities and data that weren't part of its training. This ability is especially useful when building AI agents.
MCP was developed by Anthropic and was adopted by major AI providers including OpenAI, Google DeepMind and the wider industry. It provides a secure and standardized way for AI models to access and use external data, resources (such as prompt templates) and tools.
Furthermore, integrated development environments (IDEs) such as Cursor and Visual Studio Code have also adopted MCP, allowing their AI assistants to access MCP servers and making them more context-aware and developer friendly. Built as an open standard, MCP acts as a bridge between the stochastic world of generative AI and the deterministic world of most of today's enterprise systems. MCP provides an LLM with contextual information, similar to other emerging design patterns such as retrieval-augmented generation (RAG), tool calling and AI agents.
Some advantages to using MCP in comparison with these other solutions include:
- Scale: MCP servers can be defined and hosted once and used by many AI systems. This reduces the need to define access to the same source data, resources and AI tools separately for each generative AI system.
- Data retrieval: Unlike RAG, where data must be preprocessed and vectorized before it can be queried, MCP is dynamic and reflects fluctuations and updates from information sources in real time.
- Complexity: MCP is fairly simple to set up and incorporate into AI applications, as we demonstrate here. Config files make it easy to port an MCP server across environments.
- Platform-independent: Beyond the fact that you can build MCP servers with Python, TypeScript or other languages, they are also not coupled to a specific LLM solution.
- Debugging through a client/server model: The MCP client sends requests to the MCP server, which then fetches the necessary data from various external systems and sources, be it APIs, databases or local files. This structured approach ensures that the AI model receives consistent and relevant context, leading to more accurate and reliable outputs. MCP uses JSON-RPC to encode messages and supports two transport mechanisms: stdio and streamable HTTP. In previous iterations of the protocol, it also supported HTTP with server-sent events (SSE).
The need to constantly keep your enterprise's information up to date can be daunting. MCP can help build context and incorporate new information: contracts as they are executed, legacy information that is being digitized but not necessarily made digestible, and more. That information can be both internal and external, and adding context this way bypasses the time-consuming need to retrain an LLM to keep it useful.
There are many remote MCP servers available, as well as plenty of reference implementations on github.com.
This step-by-step guide can be found on our GitHub repository along with the server.py script you'll reference when creating your MCP server. In this tutorial, we’ll walk through building a basic custom MCP server that can search IBM tutorials and return formatted results.
To make this tutorial easier to follow, we have created a mechanism by which the server you build can consume our tutorial content with no authentication required.
- Python 3.11 or newer installed on your computer (check by running python3 --version in your terminal).
- The built-in venv module is available (it comes with Python on most systems; on some Linux® distributions you might need to install it separately with sudo apt install python3-venv).
- A command line terminal (CLI):
- macOS or Linux: use your Terminal app (these environments are Unix-like).
- Windows: use PowerShell or Command Prompt, with small syntax differences explained in the next step.
- A text editor or IDE of your choice
Make a new directory and cd into it
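For example, from your terminal (the directory name ibmtutorialmcpserver matches the path used in the client configuration later, but any name works):

```shell
# Create a project directory and move into it
mkdir ibmtutorialmcpserver
cd ibmtutorialmcpserver
```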
Ensuring that you are in the directory, create a virtual environment.
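A common way to create the virtual environment, assuming you name the environment directory venv:

```shell
# Create a virtual environment in a directory named venv
python3 -m venv venv
```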
Note: On Windows, you might be able to replace python3 with python.
Once you have created the virtual environment, you need to activate it by using the following command
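On macOS or Linux, the activation command is typically the following (shown here with the environment creation step included so the snippet is self-contained; on Windows PowerShell, the equivalent is venv\Scripts\Activate.ps1):

```shell
python3 -m venv venv        # create the environment if you haven't already
source venv/bin/activate    # activate it (macOS / Linux)
```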
Once activated, your shell prompt will likely show the name of the virtual environment, such as (venv).
Now you need to install the Python package for fastMCP. Install it with pip.
After this step completes, you can check that fastMCP is installed correctly by running the following command
If you get output similar to below, you have fastMCP installed in your virtual environment.
FastMCP version: 2.10.1
MCP version: 1.10.1
Python version: 3.11.13
Platform: macOS-15.5-arm64-arm-64bit
FastMCP root path: /opt/homebrew/lib/python3.11/site-packages
Now that you have fastMCP installed, let’s get our MCP server created.
Create a new file in the directory and let’s give it the name server.py.
Once you have that file, open it and copy and paste the following code snippet into it
Let’s go through the preceding code and explain what the key parts are doing.
The script starts by importing FastMCP, which provides the framework for creating an MCP server, and requests, which is used to download data over HTTP. We’ve also defined a constant that holds the URL of the remote JSON index of tutorials.
We then create an MCP server instance with the FastMCP class, giving the server a name.
The script then registers a single search tool. When called with a query, the tool downloads the JSON index of tutorials and searches it for matches.
The search is case-insensitive: the query is converted to lowercase, and each tutorial’s title and URL are also lowered for matching. If a tutorial contains the search term in either the title or the URL, it’s added to a list of relevant results.
If no tutorials match the search, the function returns a friendly message indicating that nothing was found. If there are matches, the function builds a formatted, numbered list showing each tutorial’s title, URL, date, and—if available—the author. The formatting uses Markdown-style bold for titles so they stand out in clients that support it. The final formatted text is returned as a single string.
The function includes targeted exception handling:
- A network error, such as a timeout or connection failure, is caught when downloading the index.
- A JSON parsing error is caught if the downloaded data isn’t valid JSON.
- A general exception handler catches any other unexpected problems.
Each error returns a descriptive message to the caller instead of stopping the program.
At the bottom, the script starts the server by calling its run method inside the standard if __name__ == "__main__" guard, so the server only starts when the file is executed directly.
Now that you have built your server, you need to enable it in your IDE before you can use it. There are many clients that support MCP with various levels of integration with the protocol. The official MCP website provides an exhaustive list of example clients.
If you have Cursor installed, you can add the MCP server within Cursor by following these instructions.
Open Cursor settings and navigate to Tools & Integrations. Select New MCP Server and paste the following configuration into the mcp.json file that Cursor opens in a new tab. Make sure that you replace <YOUR PATH> with the directory you’re in. You can run pwd in your terminal to get the full path. For further information about Cursor and MCP, see the Cursor docs.
{
  "mcpServers": {
    "tutorials": {
      "command": "fastmcp",
      "args": ["run", "<YOUR PATH>/ibmtutorialmcpserver/server.py"],
      "env": {}
    }
  }
}
If you are a user of Microsoft VS Code, you can add the MCP server by using the instructions linked here. Make sure that Copilot is set up in VS Code before proceeding.
If you'd like to enable the MCP server by using a file, create a .vscode/mcp.json file in this project's directory and copy and paste this code into the file. Make sure that you replace <YOUR PATH> with the directory you're in. You can run pwd in your terminal to get the full path.
{
  "servers": {
    "IBM Tutorials": {
      "type": "stdio",
      "command": "fastmcp",
      "args": [
        "run",
        "<YOUR PATH>/ibmtutorialmcpserver/server.py"
      ]
    }
  }
}
Now that you’ve enabled your MCP server, let’s get the server running so you can use the tool created in server.py. If you’re using VS Code, check out these docs.
In the IDE chat, I’ll ask “what are some IBM time series tutorials?” The following shows the output received; your response can vary depending on the model used and your IDE.
Output:
Here are some IBM time series tutorials:
Time series forecasting with Lag-Llama (zero-shot learning)
Tutorial link
Predict overnight low temperatures using the Lag-Llama model in a zero-shot learning scenario.
Using the watsonx.ai Time Series Forecasting API to predict energy demand
Tutorial link
Predict energy demand with the watsonx.ai Time Series Forecasting API.
Authors: Aleksandra Kłeczek and Meredith Syed
Let me know if you want details or help with a specific tutorial.
Great! The agent was able to use the tool we created in server.py to find relevant tutorials.
In this tutorial, you learned how to build an MCP server to search through all our tutorials by using any MCP client you prefer. You created an MCP server with a single search tool that retrieves a remote JSON index of tutorials, filters results based on a search term and returns them in a readable format. It uses fastmcp to register and run the tool, requests to fetch the data, and includes error handling for network, parsing and unexpected issues. When run, the server can be connected to by MCP clients for live querying of all our tutorials.