Build a custom research agent with Langflow by using watsonx Orchestrate ADK and IBM Cloud

Introduction

As the landscape of AI agents develops, some enterprises remain hesitant about widespread adoption of these autonomous agents.1 Challenges include governance, ethics, human-AI collaboration, deployment and scalability. However, building trustworthy AI agents does not need to be complex. With IBM watsonx Orchestrate, these concerns can be addressed all in one place. In this tutorial, you will learn how to use Langflow and watsonx Orchestrate to build reliable, scalable and enterprise-ready agents.

What is Langflow?

Langflow is a Python-based, open-source framework for building AI agents and other AI applications. Langflow was originally built on LangChain, its parent framework. LangGraph, another platform in the same family, is also used to build agentic systems by using graph-based architectures. What sets Langflow apart is its easy-to-use, drag-and-drop interface, in which users can connect agent components to design personalized workflows or start with the prebuilt templates. As an alternative to this low-code or no-code approach, developers can build custom components and embed step-by-step agent flows into existing application code by using the Langflow API. In this Langflow tutorial, we explore how to build and deploy agents by:

  1. Importing agentic Langflow flows as MCP servers by using IBM® watsonx Orchestrate® as a Software as a Service (SaaS) along with IBM Cloud®.
  2. Importing basic Langflow flows by using the IBM watsonx Orchestrate Agent Development Kit (ADK) locally.

Both approaches correspond to a section of this tutorial. Note, you can also access this tutorial on GitHub.
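Before diving into the visual builder, here is a small taste of the API route mentioned above. This is a minimal sketch, assuming a locally running Langflow instance and a placeholder flow ID; the `/api/v1/run/{flow_id}` endpoint and payload shape come from the Langflow API documentation, so verify them against your Langflow version.

```python
import json
from urllib import request

LANGFLOW_URL = "http://localhost:7860"  # assumption: a locally running Langflow instance
FLOW_ID = "YOUR_FLOW_ID"                # placeholder: copy the ID from your flow's API pane

def build_run_payload(message: str) -> dict:
    # Minimal chat-style payload for the POST /api/v1/run/{flow_id} endpoint
    return {"input_value": message, "input_type": "chat", "output_type": "chat"}

def run_flow(message: str) -> dict:
    # Send one chat message through the flow and return the parsed JSON response
    req = request.Request(
        f"{LANGFLOW_URL}/api/v1/run/{FLOW_ID}",
        data=json.dumps(build_run_payload(message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

The rest of this tutorial uses the drag-and-drop interface, but the same flows remain callable over HTTP in this fashion.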

Let’s begin!

Prerequisites

This tutorial requires:

  • The latest version of Python installed.
  • An IBM Cloud® account to create a watsonx.ai™ project ID and API key. Lite and Free service plans are available.
    • You can get your project ID from within your project. Click the Manage tab. Then, copy the project ID from the Details section of the General page. You need this ID for the ADK portion of this tutorial.
  • watsonx Orchestrate account (can be a trial account). If you do not have an account already, click here to sign up for a free 30-day trial. Check out the documentation for detailed instructions on how to receive trial access on IBM Cloud.
  • The IBM watsonx Orchestrate ADK installed. Follow the official documentation here for setting up and installing the ADK.
    • Note: If you installed watsonx Orchestrate Developer edition before ADK version 2.0, first remove all containers by running orchestrate server reset before upgrading. watsonx Orchestrate Developer edition no longer has a dependency on an external container engine. Failing to reset before upgrading can leave multiple installations of the application consuming unnecessary system resources and can cause port conflicts.
  • The IBM Cloud CLI installed. The installation commands for MacOS, Linux and Windows can be found in the getting started guide.

Please note that without these requirements, this tutorial will not be reproducible.

Steps: IBM Cloud approach

Step 1. Configure your IBM Cloud environment

In your terminal, run the following command. You will be prompted to log in to your IBM Cloud account by using your IBMid. If you have several accounts, you will need to select one.

ibmcloud login

Note: Credentials rejected? You might be a federated user. Log in again using the --sso  flag to use a corporate or enterprise single sign-on ID. For more information about logging in with federated IDs, refer to the documentation. In short, when prompted, allow the URL to open in your default browser and paste the one-time code that appears back into your terminal.

If your sign-on was successful, you will see output similar to the following.

Output:

API endpoint: https://cloud.ibm.com
Region: us-south
User: your.email@email.com
Account: itz-watsonx-event-001 (f1zzz9a2e11b432ea5316227cb901888) <-> 3021952
Resource group: No resource group targeted, use 'ibmcloud target -g RESOURCE_GROUP'

Note: If you notice that the region is not correct, run ibmcloud target -r  followed by the correct region. For example, if your regional service endpoint is us-east , run ibmcloud target -r us-east .

To see your Cloud resources, run ibmcloud resource groups . This command retrieves your resource groups and produces output similar to the following (the names and IDs of your resources will differ).

Output:

Retrieving all resource groups under account f1zzz9a2e11b432ea5316227cb901888 as your.email@email.com...
OK
Name                             ID                                 Default Group   State
watsonx                          93018fa55c342de104afb8jje20c222c   false           ACTIVE
itz-wxo-69305f32086a49ee3736ff   48bbeb07ec5a4994b2fd39beb6027090   false           ACTIVE

Next, target the specific resource, by running ibmcloud target -g RESOURCE_GROUP . In this example, the command would be ibmcloud target -g itz-wxo-69305f32086a49ee3736ff .

The output should be similar to the following.

Output:

Targeted resource group itz-wxo-69305f32086a49ee3736ff
API endpoint: https://cloud.ibm.com
Region: us-south
User: your.email@email.com
Account: itz-watsonx-event-001 (f1zzz9a2e11b432ea5316227cb901888) <-> 3021952
Resource group: itz-wxo-69305f32086a49ee3736ff

Step 2. Install the IBM Cloud Code Engine CLI

With IBM Cloud® Code Engine, you can run nearly any containerized workload without managing servers or infrastructure. The platform supports everything from microservices and web apps to batch jobs and event-driven functions. It also offers built-in image creation from your source code. Because all workloads share the same Kubernetes environment, they integrate naturally. Code Engine is designed to keep infrastructure out of your way so you can stay focused on building applications. Installing the Code Engine CLI is our next step. Run the following command in your terminal.

ibmcloud plugin install code-engine -f

Output:

Looking up 'code-engine' from repository 'IBM Cloud'...
Plug-in 'code-engine[ce] 1.57.0' found in repository 'IBM Cloud'
Attempting to download the binary file...
74.08 MiB / 74.08 MiB [============================================] 100.00% 1s
77680050 bytes downloaded
Installing binary...
OK
Plug-in 'code-engine 1.57.0' was successfully installed into /your/path/to/code-engine. Use 'ibmcloud plugin show code-engine' to show its details.

Great! Now, let’s target a project within Code Engine. First, display the list of projects by running ibmcloud ce project list .

Output:

Getting projects...
OK

Name                                ID                                     Status   Enabled   Selected   Tags   Region     Resource Group                   Age
ce-itz-wxo-69305f32086a49ee3736ff   8991a30c-944f-422d-9e00-00789043e90e   active   true      false             us-south   itz-wxo-69305f32086a49ee3736ff   7m32s

A Code Engine project groups entities like applications, jobs and builds. It serves as the unit for managing resources and governing access to those entities. If you do not have any active Code Engine projects, run ibmcloud ce project create --name CE_PROJECT_NAME  and replace CE_PROJECT_NAME  with any project name of your choice (for example, “code-engine-project”).

To target the specific project, run ibmcloud ce project select --name CE_PROJECT_NAME . In this example, the command would be ibmcloud ce project select --name ce-itz-wxo-69305f32086a49ee3736ff .

Output:

Selecting project 'ce-itz-wxo-69305f32086a49ee3736ff'...
OK

Note: If you experience an error here, ensure that you have targeted the correct environment with ibmcloud target -c ACCOUNT_ID -r REGION_NAME -g RESOURCE_GROUP_NAME .

Step 3. Set up Langflow with the Code Engine CLI

To use Langflow while using the Code Engine CLI, run the following command.

ibmcloud ce app create \
--name langflow \
--image langflowai/langflow:latest \
--port 7860

This command can run for several minutes. Allow it to complete without interruption.

Output:

Creating application 'langflow'...
Configuration 'langflow' is waiting for a Revision to become ready.
Ingress has not yet been reconciled.
Waiting for load balancer to be ready.
Run 'ibmcloud ce application get -n langflow' to check the application status.
OK

https://langflow.23h82g3y09cp.us-south.codeengine.appdomain.cloud

Step 4. Connect wxO and activate Langflow

While the Langflow setup command runs, we can add an environment to interface between our local machine and our SaaS-hosted solution.

In a new terminal window, activate a virtual environment in your chosen directory. You can change my-env  to any environment name of your choice.

python -m venv my-env

Activate the virtual environment with the following command and replace my-env  with your environment name if you used a different one:

MacOS/Linux:

source my-env/bin/activate

Windows:

my-env\Scripts\activate

Now, in your preferred browser, access your IBM Cloud resource list, expand the AI/Machine Learning dropdown and select the active wxO resource. Its name should resemble “Watson Orchestrate-itz,” for example. Copy the URL found in the Credentials window and keep this page open in your browser as you will need it shortly. In the following command, replace YOUR_WXO_RESOURCE_URL  with the retrieved URL and run the command in the activated virtual environment in your terminal.

orchestrate env add \
-n langflow \
-u YOUR_WXO_RESOURCE_URL \
--type ibm_iam \
--activate

Output:

[INFO] - Environment 'langflow' has been created
Please enter WXO API key:

When prompted for the wxO API key, return to the resource page that is open in your browser. Do not input the API key located above the URL you copied. Instead, click the Launch watsonx Orchestrate button. Then, click the circular icon with your initials in the upper right corner of the screen and open the Settings. Select the API details tab and click the Generate API key button. Next, input any name and description for your API key and select “Disable the leaked key” in the “Leaked action” section. Most importantly, select “Yes” in the “Session management” section to enable session management for CLI logins and click “Create.” Your API key should appear. Copy the key and paste it into the terminal prompt to satisfy the request for the wxO API key.

Output:

[INFO] - Environment 'langflow' is now active

Great! The langflow environment is now active.

Step 5. Configure your Code Engine resources

To establish a stable Code Engine environment that does not delete our Langflow application after a fixed timeout period, open your browser once more. Access your IBM Cloud containers overview. Your Code Engine project should appear first in the list as the most recently created resource. Open the project. Next, open your Langflow application. In the Configuration tab, open the Resources and scaling component. The only change that we need to make here is to increase the minimum number of instances from 0 to 1. Finally, click the Deploy button to apply this configuration revision.

Once you’ve completed this step, click the Test application button and click the Application URL hyperlink. This action will open our IBM Cloud instance of Langflow.

Customizing Code Engine configurations

Step 6. Build your flow

There are many ways to build a Langflow flow: either use prebuilt templates or create your own from scratch. In this tutorial, we explore the latter. To start, click + Blank Flow. This example demonstrates one flow that you can build, but feel free to explore the plethora of components and integrations built into Langflow.

  1. Add the following built-in component nodes from the menu:
  • Chat Input - Receives the user input from the chat.

  • Chat Output - Returns the flow output back to the user in the chat.

  • Agent - Uses a large language model (LLM) integration to respond to user input and can be connected to several tools.2

  • MCP Tools - Connects to a Model Context Protocol (MCP) server and exposes the MCP server’s functions as tools for agents to use to respond to input.2

  • IBM watsonx.ai - Provides access to IBM watsonx.ai® models for text generation.3

  • News Search - Pulls Google News content and produces a structured DataFrame containing each article’s title, link, publication date, and summary.4

  • arXiv - Searches arXiv.org for relevant papers and outputs the results in a DataFrame format.5

    Lastly, select + New Custom Component at the bottom of the menu.

    For ease of visualization, arrange your flow in the following manner.

Final Langflow flow

2.    Connect the Chat Input component to the “Input” field of the Agent component.

3.    Connect the Chat Output component to the “Response” field of the Agent component.

4.    In the Agent component, set the “Model Provider” to “Custom” from the dropdown. Depending on the Langflow version that you are using, you might see “Connect other models” instead. Either option is acceptable.

5.    In the IBM watsonx.ai component, select the appropriate watsonx.ai API endpoint for your API credentials. Next, paste your watsonx.ai project ID and API key in the appropriate fields. Then, select the large language model name of your choice. In this tutorial, we can select openai/gpt-oss-120b . Ensure that the component is set to “Language Model,” rather than “Model Response.” This setting is important because we want to use this model as the language model for our agent. Hence, we can now connect the IBM watsonx.ai component to the “Language Model” field of the Agent component.

  • Note: If you prefer to use global variables rather than pasting your API credentials directly, click your profile icon in the upper right corner of the screen and select Settings. In the Global Variables section, add your WATSONX_PROJECT_ID  and WATSONX_APIKEY  for your watsonx.ai connection that you generated as a prerequisite to this tutorial. Once you head back into your flow, you should see a globe icon in the “watsonx.ai Project ID” and “API key” text fields. Click the icon and select the appropriate key from the dropdown.

6.    Enable Tools Mode by using the toggle for the arXiv, News Search and Custom Component nodes. You will see this toggle appear in the header menu when you click anywhere on each of these components. With this mode enabled, you can now connect these components to the “Tools” field of the Agent component. The arXiv and News Search components are configured and ready to use. We can focus on configuring the remaining components now.

7.    In the header menu of the Custom Component, select <> Code. Here, we can customize the component’s behavior by editing the Python code that defines it.6 As a simple example, we can create a tool that returns today’s date, information that is otherwise inaccessible to an LLM. Replace the boilerplate code with the following:

from langflow.custom.custom_component.component import Component
from langflow.io import Output
from langflow.schema.data import Data
from datetime import date

class CustomComponent(Component):
    display_name = "Date"
    description = "Returns today's date."
    documentation: str = "https://docs.langflow.org/components-custom-components"
    icon = "calendar-check"
    name = "CustomDateComponent"

    inputs = []  # No input needed

    outputs = [
        Output(display_name="Today's Date", name="output", method="build_output"),
    ]

    def build_output(self) -> Data:
        today = date.today()
        data = Data(value=today)
        self.status = data
        return data

Go ahead and save your changes. You should now see the component reflecting its new name, description and icon.

8.    The number of tools you provide to your agent is up to you. However, remember not to overwhelm your agent with too many tools as it might hinder performance and accuracy. The last tool that we will enable is an MCP server. You can connect to any server of your choice. In this tutorial, we can connect to the Alpha Vantage MCP server.7 The official Alpha Vantage MCP server makes it easy for LLMs and agents to pull real-time and past stock data by using the Model Context Protocol. To connect to this server, open the “MCP Server” dropdown in your MCP Tools component and click + Add MCP Server. In the STDIO tab, provide any name to your server, for example, “av_mcp,” and paste the following command: uvx av-mcp YOUR_API_KEY . To generate a free Alpha Vantage API key, visit the official Alpha Vantage website and paste it into the command, replacing the YOUR_API_KEY  placeholder. Once you add the server, enable the Tools Mode toggle in the component node’s header menu. You will see that a vast list of tools appear as “Actions.” This result means that your connection to the MCP server is successful. You can now connect this final component to the “Tools” field of the Agent component.

Adding Alpha Vantage MCP server

Nice work! Your flow is complete and should resemble the following screen capture.

Final Langflow flow

To verify that the research pipeline is functioning as expected, open the Playground and chat with your newly built agent! Ask the agent questions that require an invocation of one of the connected tools. Some sample input values include:

  • Simple:
    • “What is today’s date?”
    • “Find me 5 research papers on quantum computing.”
  • Moderate:
    • “Analyze the stock price of IBM over the last 30 days.”
  • Complex:
    • “Is there any recent news that would indicate that IBM stock is bullish or bearish?”

You should see the agent invoke its available tools and produce the correct output. If you experience any issues at this stage, please return to your flow and ensure that your credentials are correct and that you have followed each of the steps.
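As background on the arXiv tool the agent invokes here: arXiv exposes a public query API, and a prompt like “Find me 5 research papers on quantum computing” corresponds roughly to a request such as the following sketch. The endpoint shown is the public arXiv export API; the Langflow component’s internal parameters are an implementation detail and may differ.

```python
from urllib.parse import urlencode

ARXIV_API = "http://export.arxiv.org/api/query"  # public arXiv export API endpoint

def arxiv_query_url(topic: str, max_results: int = 5) -> str:
    # Build a search URL equivalent to "find me N papers on <topic>"
    params = {"search_query": f"all:{topic}", "start": 0, "max_results": max_results}
    return f"{ARXIV_API}?{urlencode(params)}"

print(arxiv_query_url("quantum computing"))
# http://export.arxiv.org/api/query?search_query=all%3Aquantum+computing&start=0&max_results=5
```

Fetching that URL returns an Atom feed of matching papers, which the component reshapes into a DataFrame for the agent.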

Step 7. Import your flow to wxO as an MCP server

One way to connect this flow to watsonx Orchestrate is as an MCP server. Click the Share dropdown in the upper-right corner and select “MCP Server.” Click into the “JSON” tab. You should see code similar to this example:

{
    "mcpServers": {
        "lf-starter_project": {
            "command": "uvx",
            "args": [
                "mcp-proxy",
                "https://langflow.23h82g3y09cp.us-south.codeengine.appdomain.cloud/api/v1/mcp/project/b797fbc9-cd21-46e9-bc23-8fa813f94810/sse"
            ]
        }
    }
}

Copy the URL in your JSON snippet. Note that it will differ from the preceding example. Return to your terminal and paste your MCP server URL in place of the MCP_SERVER_URL  placeholder. Running the following command in the watsonx Orchestrate CLI imports this MCP server to the platform as a toolkit.

orchestrate toolkits add \
--kind mcp \
--name langflow_mcp \
--description "LangFlow MCP Server" \
--command "uvx mcp-proxy MCP_SERVER_URL" \
--tools "*"

Output:

[INFO] - Successfully imported tool kit langflow_mcp
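If you prefer to script this import rather than copy the URL by hand, the extraction from the Share dialog’s JSON can be sketched as follows. The URL below is an example placeholder, not your real endpoint.

```python
import json

# Example of the JSON shown in Langflow's Share > MCP Server dialog (placeholder URL)
config_json = """
{
  "mcpServers": {
    "lf-starter_project": {
      "command": "uvx",
      "args": ["mcp-proxy", "https://example.codeengine.appdomain.cloud/api/v1/mcp/project/abc/sse"]
    }
  }
}
"""

config = json.loads(config_json)
server = next(iter(config["mcpServers"].values()))
sse_url = server["args"][-1]  # the SSE endpoint is the last mcp-proxy argument

# Assemble the same orchestrate CLI command shown in the tutorial
cmd = (
    "orchestrate toolkits add --kind mcp --name langflow_mcp "
    f'--description "LangFlow MCP Server" --command "uvx mcp-proxy {sse_url}" --tools "*"'
)
print(cmd)
```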

Step 8. Create an agent and test the tool calls

In your browser, head over to watsonx Orchestrate and build a new agent from scratch. Enter any name and description for your agent. Once created, open the Toolset tab and click the Add tool button. From there, select to import tools from an MCP server. From the Select MCP server dropdown, select the server we imported, activate the tools by using the toggle and close the window. Next, click Deploy. Once deployed, you can chat with your agent in the Preview chat window or in the chat interface found in the collapsed page menu.

Let’s ask our agent a query! For example, “Find 5 research papers on quantum computing.”

Research paper query output

Awesome! The agentic chatbot is behaving as expected by not only outputting the correct response, but also invoking the correct arXiv tool. Feel free to experiment with different prompts.

Steps: ADK approach (local)

With this approach, Code Engine is not required. Instead, you set up a local development environment by using watsonx Orchestrate Developer Edition, a lightweight version of watsonx Orchestrate that functions as a local development server.

Prerequisites

  • Machine specs:
    • 16GB RAM
    • 8 cores
    • 25GB disk space

Step 1. Install the wxO Developer Edition SDK

Before you start building locally with Langflow, install the Developer Edition of the wxO ADK. Note that this Developer Edition was not required for the first approach in this tutorial.

  1. In your preferred IDE, set up your environment. Create a folder named wxo-langflow-agent  to store all agents and tools. You can find this project on GitHub to use as a reference. The folder structure should be:

    wxo-langflow-agent/
    ├── .env
    ├── tools/
    └── agents/
    

2.  Open the terminal and activate a virtual environment. You can change my-env  to be any environment name of your choice.

python -m venv my-env

Activate the virtual environment with the following command and replace my-env  with your environment name if you used a different one:

MacOS/Linux:

source my-env/bin/activate

Windows:

my-env\Scripts\activate

3.  In the .env file, set the following environment variables. For more information, see the setup guide.

WO_DEVELOPER_EDITION_SOURCE=orchestrate
WO_INSTANCE=<service_instance_url>
WO_API_KEY=<wxo_api_key>

4.  Run the following command to install the watsonx Orchestrate Developer Edition server. We do not need to install Langflow because it is included as part of the ADK Developer Edition. The --with-langflow  command flag enables Langflow support by pulling the necessary container images and performing the initial configuration to make Langflow accessible locally.

orchestrate server start -e <path-.env-file> --with-langflow

This command can run for several minutes if it is your first time activating a server.

Troubleshooting: If you installed watsonx Orchestrate Developer Edition before ADK version 2.0 and you experience errors when starting containers, run:

orchestrate server reset
orchestrate server purge
pip install --upgrade ibm-watsonx-orchestrate

The tail-end of your output should resemble the following example.

Output:

[INFO] - Migration ran successfully.
[INFO] - Waiting for orchestrate server to be fully initialized and ready...
[INFO] - Orchestrate services initialized successfully
[INFO] - no local tenant found. A default tenant is created
[INFO] - You can run `orchestrate env activate local` to set your environment or `orchestrate chat start` to start the UI service and begin chatting.
[INFO] - Langflow has been enabled, the Langflow UI is available at http://localhost:7861
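As an aside, you can sanity-check that the .env file from step 3 defines the three required variables before starting the server. This is a minimal sketch, not part of the ADK (which performs its own validation):

```python
REQUIRED = ("WO_DEVELOPER_EDITION_SOURCE", "WO_INSTANCE", "WO_API_KEY")

def load_env(path: str = ".env") -> dict:
    # Minimal .env parser: KEY=VALUE lines; blank lines and '#' comments are ignored
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

def check_env(env: dict) -> list:
    # Return the names of any required variables that are missing or empty
    return [name for name in REQUIRED if not env.get(name)]
```

Calling `check_env(load_env())` from the folder containing your .env file should return an empty list; any names it returns are still unset.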

Step 2. Activate local wxO chat UI

  1. The watsonx Orchestrate ADK defines environments as the watsonx Orchestrate instances you can connect to. Your environment will be a Developer Edition instance running on your laptop. You can use the orchestrate env list command to list all environments currently available to your CLI. By default, you have a local one. You can run the following command to activate the local environment.

    orchestrate env activate local
    

    Output:

    [INFO] - local tenant found
    [INFO] - Environment 'local' is now active
    
  2. Next, run this command to start the chat UI in your default browser.

    orchestrate chat start
    

    Output:

    [INFO] - Chat UI Service started successfully.
    [INFO] - Waiting for UI component to be initialized...
    [INFO] - Opening chat interface at http://localhost:3000/chat-lite
    

Step 3. Create a Langflow flow

The Langflow editor is available through the watsonx Orchestrate Developer Edition on port 7861, as seen in the previous output.

  1. In your web browser, navigate to http://localhost:7861.
  2. Build your flow. You can repurpose the flow built in the first portion of this tutorial or build your own. You might find it helpful later in this tutorial to give your flow a custom name and description instead of the default ones. To do so, hover over the flow name at the top of your screen and click the pencil icon.
  • Example flow name: Research agent
  • Example flow description: Access to news search, arXiv, today’s date and the Alpha Vantage API.

Step 4. Import the flow to wxO

We can explore two options for importing Langflow flows to our local watsonx Orchestrate server:

a) Import the flow as a local MCP server.

b) Import the flow as JSON.

Option 1: Import as a local MCP server

This step resembles step 7 in the first half of this tutorial with some minor changes. 

  1. Click the Share dropdown in the upper-right corner of your Langflow and select "MCP Server." Click into the "JSON" tab. You should see code similar to the following example:

    {
        "mcpServers": {
            "lf-starter_project": {
            "command": "uvx",
            "args": [
                "mcp-proxy",
                "http://localhost:7861/api/v1/mcp/project/41c9434f-67bf-439e-8dac-b7bb09b1d1ca/sse"
                ]
            }
        }
    }
    
  2. Copy the URL in your JSON snippet. Note that it will differ from the preceding example. Return to your terminal and run the following command, substituting your URL for the example one. Unlike the command we ran when using IBM Cloud, here you must replace localhost with host.docker.internal. Here is an example:

    orchestrate toolkits add \
        --kind mcp \
        --name langflow_research_mcp \
        --description "LangFlow MCP Server" \
        --command "uvx mcp-proxy http://host.docker.internal:7861/api/v1/mcp/project/41c9434f-67bf-439e-8dac-b7bb09b1d1ca/sse" \
        --tools "*"
    

    Output:

    [INFO] - Successfully imported tool kit langflow_research_mcp
    
  3. In the local instance of watsonx Orchestrate running on your browser, click "Create new agent" and enter a name and description for your new agent. Then, click the Create button.

Create agent locally in UI

4.    In the Toolset tab, click the Add tool button. Select “local instance” to add our already imported MCP server, check the box corresponding to the imported MCP server and click Add to agent.

Add MCP server as a tool

5.    Start chatting!

Agent chat with MCP server as a tool
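The localhost-to-host.docker.internal substitution from step 2 of this option can be wrapped in a small helper if you script your imports. A minimal sketch:

```python
from urllib.parse import urlparse, urlunparse

def to_container_url(url: str) -> str:
    # Rewrite a localhost URL so the Developer Edition containers can reach
    # the host machine; non-localhost URLs pass through unchanged.
    parts = urlparse(url)
    if parts.hostname == "localhost":
        netloc = f"host.docker.internal:{parts.port}" if parts.port else "host.docker.internal"
        parts = parts._replace(netloc=netloc)
    return urlunparse(parts)

print(to_container_url("http://localhost:7861/api/v1/mcp/project/abc/sse"))
# http://host.docker.internal:7861/api/v1/mcp/project/abc/sse
```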

Option 2: Import as JSON

As an alternative to importing flows as MCP servers, we can use the ADK to import flows as exported JSON files.

1. This approach works best for simple flows. For demonstrative purposes, let’s use this flow:

arXiv tool in Langflow

Export the flow to JSON by clicking the Share button and selecting Export. Enter any name and description for the tool/flow that you would like. The tool name must contain only alphanumeric characters and underscores and must not start with a number or underscore.

2.  Add the newly exported JSON file to the tools  folder. 

3.  Run the following command to import your flow to watsonx Orchestrate.

orchestrate tools import -k langflow -f tools/arxiv.json

4.  After importing the Langflow flow as a tool, the next step is to connect it to an agent system. You could do this step by creating a new agent in the watsonx Orchestrate UI or by copying the following agent definition to a new arxiv_agent.yml  file in your agents  folder.

kind: native
name: arxiv_agent
display_name: ArXiv Agent
description: Access to arXiv tool.
context_access_enabled: true
context_variables: []
llm: watsonx/ibm/granite-4-h-small
style: default
instructions: ''
guidelines: []
collaborators: []
tools:
- arxiv
knowledge_base: []
spec_version: v1

Now, import the simple agent by running:

orchestrate agents import -f agents/arxiv_agent.yml

5.  Refresh the browser of the locally running watsonx Orchestrate chat UI to see our changes reflected. In the Agents dropdown menu, select the “ArXiv Agent,” and ask questions that would require the use of the arXiv tool!

Example prompt: “Find me 5 research papers on quantum computing.”

Output:

Chat with local arXiv agent

Great! The agent determined it was necessary to invoke the arxiv  tool for this user query. The tool output is displayed in the collapsed reasoning flow and in the chat window as a response.
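If you end up writing several agent definitions like the one in step 4, a small templating helper can generate them. This sketch is a hypothetical convenience covering only a subset of the fields shown in the example YAML, with the same model and spec version assumed:

```python
def agent_yaml(name: str, display_name: str, description: str, tools: list) -> str:
    # Emit a minimal native-agent definition matching the example in this tutorial
    tool_lines = "\n".join(f"- {t}" for t in tools)
    return (
        "kind: native\n"
        f"name: {name}\n"
        f"display_name: {display_name}\n"
        f"description: {description}\n"
        "llm: watsonx/ibm/granite-4-h-small\n"
        "style: default\n"
        "tools:\n"
        f"{tool_lines}\n"
        "spec_version: v1\n"
    )

print(agent_yaml("arxiv_agent", "ArXiv Agent", "Access to arXiv tool.", ["arxiv"]))
```

Write the result to a file in your agents folder and import it with `orchestrate agents import` as shown above.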

Conclusion

This tutorial has equipped you with the essential skills to use Langflow and watsonx Orchestrate to build robust, scalable and enterprise-ready agents. You learned how to import agentic Langflow flows as MCP servers by using watsonx Orchestrate as a Software as a Service (SaaS) along with IBM Cloud, and how to import basic Langflow flows by using the IBM watsonx Orchestrate Agent Development Kit (ADK) locally. By following the step-by-step instructions, you’ve learned to design, develop and deploy agents that invoke custom and prebuilt tools to solve user queries. With Langflow’s intuitive visual interface, you created intricate workflows, and watsonx Orchestrate empowered you to manage and scale these agents efficiently.

As a next step, apply the knowledge gained from this tutorial by working on a real-world use case. Choose a specific business problem or process within your organization that could benefit from automation and design a Langflow and watsonx Orchestrate-based solution to address it. This hands-on experience will solidify your understanding and help you identify areas for further improvement or exploration.

If you encounter issues or have questions, check the documentation. Most common issues are covered in the troubleshooting guide. You can also review GitHub issues to see if others have experienced similar problems.

Anna Gutowska

AI Engineer, Developer Advocate

IBM

Footnotes

1 Satyadhar Joshi. “Review of Autonomous Systems and Collaborative AI Agent Frameworks.” International Journal of Science and Research Archive, vol. 14, no. 2, 28 Feb. 2025, pp. 961–972, https://doi.org/10.30574/ijsra.2025.14.2.0439.

2 “Agents | Langflow Documentation.” Langflow.org, 2025, docs.langflow.org/components-agents.

3 “IBM | Langflow Documentation.” Langflow.org, 2025, docs.langflow.org/bundles-ibm.

4 “Data | Langflow Documentation.” Langflow.org, 2025, docs.langflow.org/components-data.

5 “ArXiv | Langflow Documentation.” Langflow.org, 2025, docs.langflow.org/bundles-arxiv.

6 “Components Overview | Langflow Documentation.” Langflow.org, 2025, docs.langflow.org/concepts-components.

7 “Alpha Vantage MCP for Stock Market Data.” Alphavantage.co, 2025, mcp.alphavantage.co/.