AI Hardware

Unveiling Analog Memory-based Technologies to Advance AI at VLSI

At the 2019 VLSI Symposia, IBM researchers will present three papers offering novel analog-device-based solutions for AI computing.

Ultra-Low-Precision Training of Deep Neural Networks

IBM researchers introduce accumulation bit-width scaling, addressing a critical need in ultra-low-precision hardware for training deep neural networks.

IBM Launches Research Collaboration Center to Drive Next-Generation AI Hardware

The IBM Research AI Hardware Center is a global research hub developing next-generation AI hardware to help achieve AI's true potential.

8-Bit Precision for Training Deep Learning Systems

IBM scientists show, for the first time, successful training of deep neural networks using 8-bit floating point numbers while fully maintaining accuracy.

Dual 8-Bit Breakthroughs Bring AI to the Edge

IBM researchers showcase new 8-bit hardware breakthroughs that take AI further than it has gone before: right to the edge.

Steering Material Scientists to Better Memory Devices

IBM researchers propose guidelines for novel analog memory devices to enable fast, energy-efficient, and accurate AI hardware accelerators.

Keep it Simple: Towards Single-Elemental Phase Change Memory

Scientists publish a new approach to phase change memory using only a single chemical element—antimony—in Nature Materials.

Capacitor-Based Architecture for AI Hardware Accelerators

A capacitor-based cross-point array for analog neural networks offers potential orders of magnitude improvements in deep learning computations.

Novel Synaptic Architecture for Brain-Inspired Computing

IBM scientists developed an artificial synaptic architecture, a significant step toward large-scale, energy-efficient neuromorphic computing technology.

Unlocking the Promise of Approximate Computing for On-Chip AI Acceleration

IBM scientists developed a digital accelerator core for AI hardware that uses approximate computing to improve compute efficiency.

Machine Learning for Analog Accelerators

IBM researchers present a machine learning technique for evaluating materials used to build analog accelerators, whose lower power and faster speed can drive deep learning.
