IBM Research has partnered with Red Hat to bring iter8 into Kiali. Iter8 lets developers automate the progressive rollout of new microservice versions. From Kiali, developers can launch these rollouts interactively, watch their progress as iter8 shifts user traffic to the best microservice version, gain real-time insight into how two or more competing versions perform, and uncover trends in service metrics across versions.
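The progressive traffic shifting described above can be sketched in a few lines. This is only an illustrative toy, not the actual iter8 API: the function name `shift_traffic`, the weight dictionary, and the fixed 10% step are all assumptions for the sake of the example.

```python
# Minimal sketch of progressive rollout traffic shifting, in the spirit of
# iter8. All names and the step size are hypothetical illustrations.

def shift_traffic(weights, metrics, step=10):
    """Move `step` percent of traffic from the worst- to the best-performing version."""
    best = max(metrics, key=metrics.get)
    worst = min(metrics, key=metrics.get)
    if best == worst:
        return weights
    moved = min(step, weights[worst])
    weights = dict(weights)
    weights[worst] -= moved
    weights[best] += moved
    return weights

# Two competing versions: v2 performs better, so traffic drifts toward it.
weights = {"v1": 90, "v2": 10}
metrics = {"v1": 0.92, "v2": 0.97}  # e.g., request success rates
for _ in range(5):
    weights = shift_traffic(weights, metrics)
print(weights)  # {'v1': 40, 'v2': 60}
```

In a real rollout the metrics would be re-read from live telemetry on each iteration rather than held fixed, and the experiment would stop once one version wins decisively.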
New IBM-Fujifilm prototype breaks world record, delivering 27X more areal density than today's tape drives
Our recent MIT-IBM research, presented at NeurIPS 2020, deals with hacker-proofing deep neural networks; in other words, improving their adversarial robustness.
IEDM 2020: Advances in memory, analog AI and interconnects point to the future of hybrid cloud and AI
At this year’s IEEE International Electron Devices Meeting, IBM researchers will describe a number of breakthroughs aimed at advancing key hardware infrastructure components, including Spin-Transfer Torque Magnetic Random-Access Memory (STT-MRAM), analog AI hardware, and advanced interconnect scaling designed to meet the hardware demands of hybrid cloud and AI.
Today, we are announcing a challenge for the computer vision community to develop robust models for object recognition, demonstrating accurate predictions on ObjectNet images.
The competition, called NLC2CMD for ‘Natural Language to Command,’ ran as part of the NeurIPS 2020 program until December, and this Saturday we’ll finally see what the winners have come up with.
Deep learning may have revolutionized AI, boosting progress in computer vision and natural language processing and impacting nearly every industry. But even deep learning isn’t immune to hacking.
Our hybrid cloud predictions for 2021: we expect businesses to apply new resources and strategies to address challenges and drive business outcomes, in a world that will continue to require new advances in cloud and AI research.
IBM researchers have created AI-powered software to help doctors develop personalized treatments for different patients with the exact same diagnosis.
Enter the microcontrollers of the future: the simplest of small computers. They run on batteries for months or years and control the functions of the systems embedded in our home appliances and other electronics.
Our latest breakthrough in AI training, detailed in a paper presented at this year’s NeurIPS conference, is expected to dramatically cut AI training time and cost. So considerably, in fact, that it could help erase the blurry border between cloud and edge, offering a key technological upgrade for hybrid cloud infrastructures.
As we looked closer at the kinds of jobs our systems execute, we noticed a richer structure of quantum-classical interactions, including multiple domains of latency. These domains include real-time computation, where calculations must complete within the coherence time of the qubits, and near-time computation, which tolerates larger latency but should be more generic. The constraints of these two domains are sufficiently different that they demand distinct solutions.
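The distinction between the two latency domains can be made concrete with a small sketch. The threshold value, function name, and domain labels below are illustrative assumptions, not IBM Quantum's actual scheduler logic.

```python
# Hypothetical sketch of the two latency domains described above.
# The coherence budget and classification rule are assumptions for illustration.

COHERENCE_TIME_US = 100.0  # assumed qubit coherence budget, in microseconds

def latency_domain(compute_time_us):
    """Classify a classical computation by the latency constraint it must meet."""
    if compute_time_us <= COHERENCE_TIME_US:
        # Must finish while the qubits remain coherent (e.g., mid-circuit feedback).
        return "real-time"
    # Tolerates larger latency; can run on more generic classical infrastructure.
    return "near-time"

print(latency_domain(5.0))     # real-time
print(latency_domain(5000.0))  # near-time
```

The point of the sketch is simply that the same classical workload lands on very different infrastructure depending on which side of the coherence budget it falls.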