IBM announces new AI assistant and feature innovations at Think 2024

As organizations integrate artificial intelligence (AI) into their operations, AI assistants that merge generative AI with automation are proving to be key productivity drivers. Even where barriers to AI adoption remain, these assistants help transform how we work: they offload repetitive tasks, enable self-service actions and provide guidance on completing end-to-end processes.

AI assistants from IBM facilitate enterprise adoption of AI to modernize business operations. They are purpose-built for specific use cases, automate end-to-end workflows and integrate with enterprise processes, data and applications.

Today, IBM announces the general availability of several IBM watsonx™ assistant innovations initially previewed at our annual Think conference in May 2024, including the new IBM watsonx™ Assistant for Z.

Several new enhancements to the family of watsonx assistants are now generally available, including new features within IBM watsonx™ Code Assistant for Z, IBM watsonx™ Code Assistant for Red Hat® Ansible® Lightspeed, IBM watsonx™ Orchestrate, IBM watsonx™ Assistant and the newest AI assistant introduced at Think, watsonx Assistant for Z.

watsonx Assistant for Z is now available

IBM watsonx Assistant for Z is a new generative AI assistant that supports teams as they engage with and manage the mainframe. The software puts decades of experience at the user’s fingertips, codifying the knowledge of IBM Z® experts into a set of automations that can assist system programmers, operators and developers of all experience levels.

Watsonx Assistant for Z, built on IBM watsonx Orchestrate in partnership with IBM Research®, uses the chat-focused granite.13b.labrador model, an enterprise-level model trained on IBM Z-specific content. The assistant uses an IBM Z domain-specific retrieval augmented generation (RAG) framework to provide curated content.

Users can also import existing automations created with Ansible, JCL and REXX as skills into the product catalog. These skills can be used through a contextually aware chat experience that uses the watsonx Orchestrate extensible automation framework. Learn more about IBM watsonx Assistant for Z.

New low-code AI assistant builder capability debuts in watsonx Orchestrate

The AI assistant builder, a new feature within the IBM watsonx Orchestrate platform, is now generally available and empowers users to create new AI assistants using natural language.

Watsonx Orchestrate powers the development of purpose-built assistants to help boost productivity across the enterprise. Built on our open and transparent large language models (LLMs), the Orchestrate platform offers builders new capabilities such as content grounding, AI-guided actions and a conversational interface to help scale the enterprise’s investment in automation and AI adoption. Enterprises can now use watsonx Orchestrate to deploy business domain-specific, LLM-powered assistants that are grounded in business context, data and automation.

Watsonx Orchestrate also features a new user experience that enables multi-assistant routing with support for asynchronous skill execution: when a user submits an inquiry to one assistant, watsonx Orchestrate helps route the inquiry to the assistant best suited to accomplish that task.
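To make the routing idea concrete, here is a minimal, purely illustrative sketch of how an inquiry might be matched to one of several assistants and its skill run asynchronously. The class names, keyword-scoring logic and sample skills below are assumptions for illustration only; they are not the watsonx Orchestrate API.

```python
# Illustrative sketch only: NOT the watsonx Orchestrate API.
# Shows the general idea of routing one inquiry to the most
# appropriate assistant and executing its skill asynchronously.
import asyncio
from dataclasses import dataclass
from typing import Awaitable, Callable

@dataclass
class Assistant:
    name: str
    keywords: set[str]                      # topics this assistant handles
    skill: Callable[[str], Awaitable[str]]  # async skill execution

def route(inquiry: str, assistants: list[Assistant]) -> Assistant:
    """Pick the assistant whose keywords best match the inquiry."""
    words = set(inquiry.lower().split())
    return max(assistants, key=lambda a: len(a.keywords & words))

async def hr_skill(inquiry: str) -> str:
    return "Opened a PTO request."

async def it_skill(inquiry: str) -> str:
    return "Created an IT support ticket."

async def main() -> None:
    assistants = [
        Assistant("hr", {"vacation", "pto", "benefits"}, hr_skill),
        Assistant("it", {"laptop", "vpn", "password"}, it_skill),
    ]
    inquiry = "I need to reset my vpn password"
    chosen = route(inquiry, assistants)
    print(chosen.name, await chosen.skill(inquiry))

asyncio.run(main())
```

In practice, the routing decision would typically be made by an LLM or an intent classifier rather than simple keyword overlap, but the overall flow of route-then-execute is the same.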

Conversational search now available in watsonx Orchestrate and watsonx Assistant

In early June, IBM announced the general availability of conversational search, a new feature designed to deliver RAG by using IBM® watsonx.ai™ LLMs. Combined with the advanced search capabilities of IBM watsonx™ Discovery, powered by Elasticsearch, users can deliver accurate, high-quality responses to customers within minutes, grounded in business-relevant content and powered by enterprise-ready LLMs. This accelerates the build and authoring process while offering traceability of sources that helps ensure trust and transparency in answers.
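For readers unfamiliar with the pattern, the sketch below shows the general shape of RAG over an Elasticsearch index: retrieve relevant passages, then ground the model’s answer in them. It is a simplified illustration, not IBM’s conversational search implementation; the index name, document fields and the `generate()` placeholder are assumptions.

```python
# Generic retrieval augmented generation (RAG) pattern, sketched for
# illustration; it is not IBM's conversational search implementation.
# `generate()` stands in for any LLM call (for example, a watsonx.ai model).
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

def generate(prompt: str) -> str:
    # Placeholder: replace with a call to the LLM of your choice.
    raise NotImplementedError

def answer(question: str, index: str = "enterprise-docs") -> str:
    # 1. Retrieve business-relevant passages from the search index.
    hits = es.search(index=index, query={"match": {"body": question}}, size=3)
    passages = [h["_source"]["body"] for h in hits["hits"]["hits"]]

    # 2. Ground the LLM's answer in the retrieved content, keeping the
    #    retrieved passages so the response stays traceable to sources.
    prompt = (
        "Answer the question using only the context below.\n\n"
        "Context:\n" + "\n---\n".join(passages) + f"\n\nQuestion: {question}"
    )
    return generate(prompt)
```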

IBM watsonx large speech models (LSMs) were also made available to watsonx Assistant users earlier this month as an add-on and will be available on watsonx Orchestrate later this year. LSMs require vast amounts of training data and large parameter counts to deliver accurate speech recognition, so IBM research and development teams have designed enterprise-ready models built specifically to support customer service use cases.

In addition to English, Japanese and French are available in beta, with plans to expand to Spanish and Portuguese later this year. Based on internal benchmarking from earlier this year, the new English-language LSM outperforms OpenAI’s Whisper model on short-form English use cases.

New enhancements in watsonx Code Assistant for Z

IBM watsonx Code Assistant for Z accelerates the end-to-end mainframe application lifecycle with generative AI and automation. Earlier this year, it introduced a unified experience for its capabilities within Visual Studio Code (VS Code).

In May 2024, IBM previewed a new feature, code explanation, now generally available as an add-on to watsonx Code Assistant for Z. This capability uses generative AI to create natural language explanations of COBOL code, helping developers understand applications, simplifying the creation and maintenance of documentation, and making it easier for new developers to learn the codebase.
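As a rough illustration of the underlying prompt pattern, the sketch below pairs a small COBOL snippet with an instruction asking a model to explain it in plain English. This is an assumption about the general technique, not the watsonx Code Assistant for Z implementation.

```python
# Hedged sketch of the general "code explanation" prompt pattern; this is
# not the watsonx Code Assistant for Z implementation, just an illustration
# of pairing a COBOL snippet with a plain-English explanation request.
cobol_snippet = """\
       PERFORM VARYING WS-I FROM 1 BY 1 UNTIL WS-I > 10
           ADD WS-I TO WS-TOTAL
       END-PERFORM.
"""

prompt = (
    "Explain what the following COBOL paragraph does, in plain English, "
    "for a developer who is new to the codebase:\n\n" + cobol_snippet
)

# In practice, `prompt` would be sent to a generative model; the model's
# response becomes the documentation-ready explanation.
print(prompt)
```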

Capabilities supporting the understanding, refactoring, transformation and validation stages of the lifecycle are now fully available for on-premises deployments as well. Read more about the new code explanation capabilities.

Playbook generation in watsonx Code Assistant for Red Hat Ansible Lightspeed

In early 2024, IBM watsonx Code Assistant for Red Hat Ansible Lightspeed introduced model customization, a feature that enables users to further tune the underlying IBM® Granite™ code model with their Ansible playbooks and content for a more personalized experience.

Building on this momentum, IBM announced the on-premises release of watsonx Code Assistant for Red Hat Ansible Lightspeed in early May, which is now generally available.

The latest as-a-service release of watsonx Code Assistant for Red Hat Ansible Lightspeed also features new capabilities, including a chat-style experience for full playbook generation and playbook explanation. These capabilities enable users to create new Ansible content from natural language prompts and to explain existing content. The assistant receives a user prompt and returns an outline of a complete Ansible playbook, simplifying the process of creating and understanding Ansible content. Read the details on the new playbook generation capabilities.
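To give a feel for what such an outline might contain, here is a hypothetical example: the prompt and the generated structure below are assumptions for illustration only, not actual watsonx Code Assistant output. The sketch simply builds a plausible playbook outline in Python and renders it as YAML.

```python
# Hypothetical illustration of the kind of playbook outline a natural
# language prompt might yield; the structure below is an assumption,
# not actual watsonx Code Assistant for Red Hat Ansible Lightspeed output.
import yaml  # pip install pyyaml

prompt = "Install nginx on web servers and make sure the service is running"

playbook_outline = [
    {
        "name": "Install and start nginx",  # derived from the prompt
        "hosts": "webservers",
        "become": True,
        "tasks": [
            {
                "name": "Install nginx",
                "ansible.builtin.package": {"name": "nginx", "state": "present"},
            },
            {
                "name": "Ensure nginx is running",
                "ansible.builtin.service": {
                    "name": "nginx", "state": "started", "enabled": True,
                },
            },
        ],
    }
]

print(yaml.safe_dump(playbook_outline, sort_keys=False))
```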

Purpose-built for enterprise

The models that power each watsonx AI assistant are purpose-built for specific domains, ranging from customer service to software development and more.

The IBM Granite model is one of the most transparent LLMs in the world, offering a clearer view of how the model works, the data it’s trained on and how that data is weighted. Transparency into the models is as important as ensuring that they’re built to be used across different data sources, workflows, applications and any cloud or on-premises environment.

Every business is unique. That is why IBM AI assistants automate and orchestrate complex tasks across business processes and IT estates, tailored to specific business needs. With the help of IBM Consulting® experts and a strong network of ecosystem partners, organizations can co-create frameworks that help identify the most impactful use cases for deploying the technology and transform how work gets done.

Transform your business with watsonx AI assistants. Contact an IBM expert to start your AI journey.

Author

Parul Mishra

Vice President, Product Management, watsonx Orchestrate

IBM

Keri Olson

VP, Product Management, AI for Code, watsonx Code Assistant

IBM