IBM researchers propose guidelines for novel analog memory devices to enable fast, energy-efficient and accurate AI hardware accelerators.
Scientists publish a new approach to phase-change memory using only a single chemical element—antimony—in Nature Materials.
A capacitor-based cross-point array for analog neural networks offers potential orders-of-magnitude improvements in deep learning computations.
IBM scientists develop an artificial synaptic architecture, a significant step towards large-scale, energy-efficient neuromorphic computing technology.
IBM scientists present a programmable AI core for multi-domain deep learning training and inference on-chip at the 2018 VLSI Circuits Symposium.
Analog memory for accurate, faster, lower-power neural network training: a major step on the path to hardware accelerators for the next AI breakthroughs.
In a paper appearing today in the peer-reviewed journal Nature Electronics, IBM scientists introduce a novel hybrid concept called mixed-precision in-memory computing, which combines a von Neumann machine with a computational memory unit...
Today, at IBM THINK in Las Vegas, my colleagues and I are reporting a breakthrough in AI hardware performance using POWER9 with NVIDIA V100 GPUs.