2025: The year open, agentic AI took center stage

Anabelle Nicoud

Staff Writer

IBM

This article was featured in the Think newsletter.

From agentic capabilities in experimental AI models to real-world deployment, 2025 was the year open, agentic AI captured attention. For IBM, this evolution is embodied in BeeAI and Agent Stack, two open-source initiatives that have reshaped how enterprises build and deploy intelligent agents.

BeeAI began as an internal project in early 2024, when IBM Research anticipated a pivotal shift toward agentic systems. “We started working on that as 2024 ramped up and it feels all very quaint now,” said Kate Blair, Director of Incubation and Technology Experiences at IBM Research, in an interview with IBM Think. “But we were focused on how end users might consume agents.”

Initially, the team explored conversational systems and launched an open beta called BeeAI for Business. But momentum didn't build around it, so the team pivoted, open-sourced BeeAI and refocused on developers.

“We were starting to get feedback from developers,” said Blair. “Maybe instead of looking at the end user interface for consuming agents, we should look at the interface for developers building agents. And that’s where the framework came out of.”

Today, the BeeAI Framework is an open-source toolkit for building agents, emphasizing developer experience and interoperability. It supports multi-agent orchestration, rule-based reasoning and integrations with leading AI models. Under Linux Foundation governance, BeeAI has gained traction; it now has more than 3,000 GitHub stars.

Agent Stack: From GUI to deployment platform

As BeeAI grew, IBM discovered the biggest challenge wasn't building agents: it was deploying them. Originally, the platform included a graphical user interface (GUI), but feedback revealed GUIs had become a commodity rather than a differentiator.

“The biggest piece of feedback we got was that GUIs have become largely commoditized,” said Blair. “Offering the UI as part of the stack is really not the core component. So, we’re focused on the server module. People want to focus on their agent logic. The killer function is ‘Let me deploy my agent quickly.’ That’s it.”

This insight drove a major pivot. The platform evolved into Agent Stack, an open infrastructure for fast, framework-agnostic deployment. Built on the Agent2Agent (A2A) protocol, Agent Stack lets developers take an agent built anywhere—BeeAI, LangGraph, CrewAI or custom code—and deploy it in minutes.
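To make the framework-agnostic deployment model concrete, here is a minimal sketch of the "agent card" metadata an A2A-compatible agent publishes so that a platform like Agent Stack can discover it and route requests to it. The field names follow the public A2A specification's Agent Card document (typically served at `/.well-known/agent.json`), but the agent name, URL and skill here are hypothetical, and field details may differ across protocol versions.

```python
import json

# Hedged sketch of an A2A Agent Card: the self-describing metadata
# document an agent publishes so clients and platforms can discover
# its endpoint, capabilities and skills. All values are hypothetical.
agent_card = {
    "name": "ticket-triage-agent",               # hypothetical agent
    "description": "Classifies and routes support tickets.",
    "url": "https://agents.example.com/triage",  # hypothetical endpoint
    "version": "1.0.0",
    "capabilities": {"streaming": True},
    "skills": [
        {
            "id": "triage",
            "name": "Ticket triage",
            "description": "Assigns a priority and a team to a ticket.",
        }
    ],
}

# Serialize the card as it would be served over HTTP.
print(json.dumps(agent_card, indent=2))
```

Because the card describes the agent rather than the framework that built it, a deployment platform can treat a BeeAI, LangGraph, CrewAI or custom-code agent identically: it only needs the card and the A2A endpoint behind it.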

“Agent Stack is for people who need to experiment across frameworks behind a firewall,” said Blair. “They need to support multiple teams with multiple frameworks and compliance. They need to be self-hosted.”

Agent Stack addresses enterprise realities—compliance, security and scalability—while remaining open source, Blair explained. It offers a self-hosted server, command-line interface (CLI) tools, and integrations for observability, authorization and auto-scaling. Its goal is to help teams move from prototype to production without vendor lock-in.


2026 will be interoperable

Agent Stack isn’t just about deployment—it’s part of a broader movement toward interoperability. By leveraging A2A, it enables agents from different frameworks to communicate seamlessly.

The Linux Foundation recently announced the formation of the Agentic AI Foundation with Anthropic’s contribution of its Model Context Protocol (MCP). “We’re excited that MCP has come under open governance,” said Blair. “Openly governed, community standards are what is going to unlock more creativity, more innovation and more solutions.”

The A2A project is about to hit its first major release. “We’re already seeing collaboration between A2A and MCP to standardize on a single card to describe an entity, whether it’s a tool or resource in MCP or an agent in A2A,” she said. Blair sees this unified card as a catalyst for interoperability and the opportunity to share registries, discovery and utilization across agents and agentic systems.
