IBM Research AI: Advancing AI for industry and society

Boosting our understanding of the microbial world with software repurposing

In our latest paper published in the Microbiome Journal, we propose a way to improve the speed, sensitivity and accuracy of what’s known as microbial functional profiling – determining what microbes in a specific environment are capable of.

Continue reading

Moving beyond the self-reported scale: Objectively measuring chronic pain with AI

Together with Boston Scientific, we are presenting research detailing the feasibility of, and progress toward, our new pain measurement method at the 2021 North American Neuromodulation Society Annual Meeting.

Continue reading

Peeking into AI’s ‘black box’ brain — with physics

Our team has developed Physics-Informed Neural Network (PINN) models, in which physics is integrated into the neural network’s learning process – dramatically boosting the AI’s ability to produce accurate results. Described in our recent paper, PINN models are constrained to respect physical laws, which bound the results and yield realistic outputs.
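To make the idea concrete, here is a minimal, hypothetical sketch of a physics-informed fit – not IBM’s actual model. The “physics” is the ODE dy/dx = -y with y(0) = 1 (exact solution exp(-x)), and the model is a degree-4 polynomial whose coefficients are chosen so that both the boundary condition and the ODE residual are satisfied. Because the residual is linear in the coefficients, the physics-informed objective reduces to a least-squares solve:

```python
import numpy as np

# Hypothetical illustration of a physics-informed loss: the model must
# satisfy the governing law dy/dx + y = 0 at collocation points, plus
# the boundary condition y(0) = 1. Model: y(x) = sum_k c_k * x^k.
deg = 4
xs = np.linspace(0.0, 1.0, 32)  # collocation points on [0, 1]
basis = np.stack([xs**k for k in range(deg + 1)], axis=1)
dbasis = np.stack(
    [k * xs**(k - 1) if k > 0 else np.zeros_like(xs) for k in range(deg + 1)],
    axis=1,
)

# Physics term: the residual dy/dx + y should vanish everywhere.
A_physics = dbasis + basis
b_physics = np.zeros(len(xs))

# Boundary term: y(0) = 1, weighted so the constraint is respected.
w = 10.0
A_bc = w * np.array([[1.0] + [0.0] * deg])
b_bc = np.array([w * 1.0])

# The physics-informed objective is quadratic in the coefficients,
# so it can be minimized exactly with least squares.
A = np.vstack([A_physics, A_bc])
b = np.concatenate([b_physics, b_bc])
coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)

# The physics constraint keeps the output close to the true solution.
y_hat = basis @ coeffs
max_err = float(np.max(np.abs(y_hat - np.exp(-xs))))
```

In a real PINN the polynomial is replaced by a neural network and the residual is computed by automatic differentiation, but the principle – adding a physics-residual term to the training loss – is the same.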

Continue reading

Who. What. Why. New IBM algorithm models how the order of prior actions impacts events

To address the problem of ordinal impacts, our team at IBM T. J. Watson Research Center has developed OGEMs – or Ordinal Graphical Event Models – new dynamic, probabilistic graphical models for events. These models are part of the broader family of statistical and causal models called graphical event models (GEMs) that represent temporal relations where the dynamics are governed by a multivariate point process.
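As a toy illustration of the ordinal idea – hypothetical, and not the OGEM learning algorithm itself – consider an event type C whose occurrence rate depends not just on whether parent events A and B happened recently, but on the order in which they happened:

```python
def rate_of_C(history, window=5.0, now=0.0):
    """Toy intensity for event C given recent A/B occurrences.

    history: time-sorted list of (timestamp, label) pairs.
    The rate differs depending on whether A preceded B inside the
    window -- the kind of ordinal dependence OGEMs are built to model.
    """
    recent = [(t, e) for t, e in history
              if now - window <= t <= now and e in ("A", "B")]
    labels = [e for _, e in recent]
    if "A" in labels and "B" in labels:
        # the order of the most recent A vs. most recent B drives the rate
        last_A = max(t for t, e in recent if e == "A")
        last_B = max(t for t, e in recent if e == "B")
        return 2.0 if last_A < last_B else 0.5  # A-then-B excites C
    return 0.1  # baseline rate when the ordinal condition is absent
```

A standard graphical event model would condition only on which parents occurred in the window; the ordinal variant also distinguishes A-then-B from B-then-A, which is exactly the extra structure described above.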

Continue reading

IBM’s Squawk Bot AI helps make sense of financial data flood

In our recent work, we detail an AI and machine learning mechanism that helps correlate a large body of text with the numerical data series used to describe financial performance as it evolves over time. Our deep learning-based system extracts, from large amounts of textual data, potentially relevant and useful descriptions that explain the performance of a financial metric of interest – without the need for human experts or labeled data.

Continue reading

COVID-19 a year later: What have we learned?

We’ve learned a lot during the past year about how to address global crises, but in my mind, one lesson cannot be ignored: The need for more strategic collaborations across institutions and sectors.

Continue reading

IBM’s innovation: Topping the US patent list for 28 years running

A patent is evidence of an invention, protecting it through legal documentation and, importantly, publishing it for all to read. The number of patents IBM produces each year – more than 9,130 US patents in 2020 alone – demonstrates our enduring commitment to research and innovation.

Continue reading

Light and in-memory computing help AI achieve ultra-low latency

Ever noticed that annoying lag that sometimes happens when streaming, say, your favorite football game over the internet? Called latency, this brief delay between a camera capturing an event and the event being shown to viewers is merely annoying during the decisive goal at a World Cup final. But it could be deadly for a […]

Continue reading

IBM-Stanford team’s solution of a longstanding problem could greatly boost AI

Continue reading

Preparing deep learning for the real world – on a wide scale

Our recent MIT-IBM research, presented at NeurIPS 2020, deals with hacker-proofing deep neural networks – in other words, improving their adversarial robustness.
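To see what adversarial robustness defends against, here is a generic sketch of the classic fast gradient sign method (FGSM) attack – a standard textbook example, not the method from the paper – applied to a simple logistic classifier with made-up weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A fixed linear classifier p(y=1|x) = sigmoid(w.x + b); the weights
# are hypothetical, chosen only for illustration.
w = np.array([2.0, -3.0, 1.0])
b = 0.1

x = np.array([0.2, 0.1, 0.3])      # clean input, classified as class 1
clean_score = sigmoid(w @ x + b)

# FGSM: take one signed step along the input gradient of the loss.
# For logistic loss with true label 1, the gradient w.r.t. x is
# (p - 1) * w; the attack perturbs x by eps * sign(gradient).
eps = 0.2
grad_x = (clean_score - 1.0) * w
x_adv = x + eps * np.sign(grad_x)
adv_score = sigmoid(w @ x_adv + b)  # the small perturbation flips the decision
```

A tiny, bounded perturbation (each coordinate moves by at most `eps`) is enough to flip the classifier’s decision – the failure mode that adversarially robust training is designed to prevent.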

Continue reading

IEDM 2020: Advances in memory, analog AI and interconnects point to the future of hybrid cloud and AI

At this year’s IEEE International Electron Devices Meeting, IBM researchers will describe a number of breakthroughs aimed at advancing key hardware infrastructure components – including Spin-Transfer Torque Magnetic Random-Access Memory (STT-MRAM), analog AI hardware, and advanced interconnect scaling – designed to meet the hardware infrastructure demands of hybrid cloud and AI.

Continue reading