As the AI market matures, one theme is becoming increasingly clear: customization is the new currency of enterprise value. Organizations want models that reflect their specific knowledge, tone, and compliance needs. At IBM, we believe the future of AI is not just open source; it's open customization. As AI becomes more embedded in enterprise decision-making, the ability to customize and control your models will be a competitive advantage, and IBM and Red Hat are leading the charge on that future for enterprises of all sizes.
At Red Hat Summit 2025, the momentum behind that vision is accelerating, driven by two powerful solutions announced by IBM.
The first wave of enterprise AI revolved around foundation model access. The next wave will be defined by how well businesses can adapt those models to fit their unique needs. That's why Red Hat AI InstructLab on IBM Cloud is such a critical innovation. It brings the power of InstructLab into a fully managed service, exclusive to IBM. As the only provider currently offering InstructLab as-a-service, IBM gives enterprises a strategic advantage when building AI models that can be flexibly consumed and customized to fit the needs of your clients.
InstructLab allows enterprises to fine-tune LLMs without massive retraining, teaching models new behaviors and facts through synthetically generated instruction data. The result is faster customization, better data control, and significantly lower costs than traditional fine-tuning. And because it's delivered as-a-service on IBM Cloud, clients can skip the operational complexity and get straight to results.
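To make this concrete, here is a minimal sketch of what a contribution to an InstructLab taxonomy might look like: a handful of human-written seed examples that the pipeline expands into a much larger synthetic training set. This is illustrative only; the exact schema fields vary by InstructLab version, and the policy questions and answers shown are hypothetical.

```yaml
# Hypothetical qna.yaml entry in an InstructLab taxonomy.
# A few seed question/answer pairs are enough to teach a new skill;
# the synthetic data generation step expands these into many more
# training examples before fine-tuning.
version: 2
task_description: Answer questions about a company travel reimbursement policy
created_by: example-contributor
seed_examples:
  - question: What is the daily meal allowance for domestic travel?
    answer: The daily meal allowance for domestic travel is 50 USD.
  - question: Do I need receipts for expenses under 25 USD?
    answer: No, receipts are only required for expenses of 25 USD or more.
  - question: How soon must expense reports be submitted after a trip?
    answer: Expense reports must be submitted within 30 days of the trip's end date.
```

In the open-source InstructLab workflow, an entry like this seeds synthetic data generation and a subsequent fine-tuning run; the managed service on IBM Cloud handles that pipeline for you.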
Enterprises need a trusted, scalable foundation for developing, testing, and running their LLMs, whether in the cloud, on-prem, or at the edge. Red Hat Enterprise Linux AI is a purpose-built AI platform that includes the open-source Granite family of LLMs, along with support for additional open models for inference workloads and integrated InstructLab tooling. Think of it as an out-of-the-box environment for running and customizing large language models, grounded in the same enterprise-grade security and lifecycle management that RHEL is known for.
On IBM Cloud, clients can take advantage of this foundation with enhanced scalability, reliability, and governance. It’s the ideal landing zone for organizations looking to operationalize their AI efforts, while maintaining full visibility and control.
IBM's work with Red Hat is helping define an open AI stack, where open models, open tools, and open infrastructure come together to give enterprises control over their AI future across a hybrid environment.
At Red Hat Summit, 19-22 May 2025 in Boston, we are diving deeper into how customers are using these technologies to deliver real outcomes, and where we see the AI stack going next. If your organization is exploring how to build, tune, and run your own models with confidence, we'd love to talk. Meet us at IBM Booth 1133!