
IBM AI Roadmap

Large-scale, self-supervised neural networks, known as foundation models, multiply the productivity and multimodal capabilities of AI. More general forms of AI are emerging to support reasoning and commonsense knowledge.

Strategic milestones

All information being released represents IBM’s current intent, is subject to change or withdrawal, and represents only goals and objectives.


2023

Extend foundation models beyond natural language processing.

In 2023, we will expand enterprise foundation model use cases beyond natural language processing (NLP). 100B+ parameter models will be operationalized for bespoke, targeted use cases, opening the door for broader enterprise adoption.

Why this matters for our clients and the world

The expansion of AI foundation models will lower the barrier to entry, broaden the range of use cases, reduce labeling requirements for training by 10-100x, and deliver greater efficiency through the reuse of models across use cases.
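The reduction in labeling requirements comes from reusing one pretrained model across tasks rather than training a supervised model per task. A minimal sketch of the idea, with a hypothetical `complete` stub standing in for a real foundation-model inference endpoint (not an IBM API):

```python
# Hypothetical sketch: reusing one foundation model across use cases via
# few-shot prompting, instead of training a separate labeled model per task.
# `complete` is a stand-in for a hosted foundation-model completion call.

def complete(prompt: str) -> str:
    """Stub for a foundation-model completion endpoint (hypothetical).

    A real deployment would call a hosted 100B+ parameter model here;
    this stub applies a trivial keyword rule purely for demonstration.
    """
    query = prompt.split("Review:")[-1].lower()
    return "positive" if "great" in query else "negative"

def few_shot_prompt(task: str, examples: list[tuple[str, str]], query: str) -> str:
    """Build a prompt from a handful of labeled examples.

    A few in-context examples replace the thousands of labeled samples
    a conventional supervised model would need for the same task.
    """
    lines = [f"Task: {task}"]
    for text, label in examples:
        lines.append(f"Review: {text}\nLabel: {label}")
    lines.append(f"Review: {query}\nLabel:")
    return "\n".join(lines)

# Two labeled examples stand in for an entire supervised training set.
examples = [("The product was great", "positive"),
            ("It broke after a day", "negative")]
prompt = few_shot_prompt("sentiment classification", examples, "Works great so far")
print(complete(prompt))  # the stub labels this review "positive"
```

The same pretrained model, with different prompts, can serve sentiment analysis, entity extraction, or summarization, which is the reuse-across-use-cases efficiency described above.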

The technologies and innovations that will make this possible

Prebuilt models, workflows, toolchains, and multimodal neural architectures will leverage foundation models over diverse domain-specific data such as code, IT, security, geospatial, and materials. OpenShift-based cloud-native middleware will help scale foundation model workloads to thousands of GPUs.

How these advancements will be delivered to IBM clients and partners

Watsonx will be launched with three elements: watsonx.data, watsonx.ai, and watsonx.governance. The infrastructure will include resource- and topology-aware OpenShift clusters, with advanced networking between nodes and between GPUs within a node.