Meet an IBM scientist who’s using cognitive computing to discover new materials

Imagine trying to discover a new type of metal that is malleable and can withstand heat and rust, drawing not only on decades of internal research and patents, but also on external sources such as peer-reviewed papers. Now imagine this being possible at the click of a button. In the near future, the processes behind discovery and invention could become routine.

This is one of several promising applications of cognitive computing being developed by IBM scientist Dr. Peter Staar. Staar and several of his colleagues at IBM’s Zurich Lab were recognised today with the best paper award at the IEEE International Parallel & Distributed Processing Symposium (IPDPS) for their research titled Stochastic Matrix-Function Estimators: Scalable Big-Data Kernels with High Performance.

Staar writes one of his favourite algorithms.

Staar’s work focuses on three areas of research: the creation of knowledge graphs for specific scientific fields, the development and implementation of new cognitive algorithms, and the application of these algorithms to concrete problem sets to discover new knowledge.
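As a purely illustrative sketch of the first area, the snippet below builds a tiny knowledge graph linking hypothetical documents, materials and properties, then queries it for materials with a desired property profile. All node names are made up for illustration; the real graphs cover entire scientific fields extracted from papers and patents.

```python
import networkx as nx

# Illustrative only: a toy knowledge graph of documents, materials and
# properties. Node names are hypothetical, not from the actual project.
kg = nx.MultiDiGraph()
kg.add_edge("patent:US-1234", "alloy:X", key="mentions")
kg.add_edge("paper:doi-10.0000/abc", "alloy:X", key="mentions")
kg.add_edge("alloy:X", "property:corrosion-resistant", key="has_property")
kg.add_edge("alloy:X", "property:heat-resistant", key="has_property")

# Query: which materials are linked to both target properties?
targets = {"property:corrosion-resistant", "property:heat-resistant"}
for node in kg.nodes:
    if node.startswith("alloy:") and targets <= set(kg.successors(node)):
        print(node, "matches the desired property profile")
```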

This past year Staar has been working with a global materials manufacturer to accelerate the discovery of new materials using an algorithm he re-engineered. The result is an essential cognitive kernel, a “subgraph centrality” algorithm, which speeds up a computer’s ability to find patterns in millions of documents by a factor of 100 to 500.
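The award-winning paper concerns stochastic estimators of matrix functions; subgraph centrality is one such function, defined for node i of a graph with adjacency matrix A as the diagonal entry [exp(A)]_ii. The sketch below is not the authors’ implementation, only a minimal illustration of the idea on a toy graph: it estimates diag(exp(A)) with random probe vectors and a truncated Taylor series, and compares the result with the exact value.

```python
import numpy as np
from scipy.linalg import expm

# Toy undirected graph as an adjacency matrix.
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 1],
    [0, 1, 0, 0, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

# Exact subgraph centrality: the diagonal of the matrix exponential exp(A).
exact = np.diag(expm(A))

def stochastic_subgraph_centrality(A, num_probes=200, num_terms=12, seed=0):
    """Estimate diag(exp(A)) with Rademacher probe vectors and a truncated
    Taylor series -- a simplified stand-in for the stochastic matrix-function
    estimators the paper scales up on large graphs."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    num = np.zeros(n)
    den = np.zeros(n)
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)   # random probe vector
        term, acc = v.copy(), v.copy()
        for k in range(1, num_terms):         # acc approximates exp(A) @ v
            term = A @ term / k
            acc += term
        num += v * acc                        # Hutchinson-style diagonal estimate
        den += v * v
    return num / den

estimate = stochastic_subgraph_centrality(A)
print("exact   :", np.round(exact, 3))
print("estimate:", np.round(estimate, 3))
```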

The re-engineered algorithm was also easy to port to standard computer circuit boards. This porting is currently being patented and could prove very valuable, since it cuts the energy consumption of running the cognitive kernels by a factor of 10.

“This paper and the research it introduces is exactly why we need cross-disciplinary research teams,” said Dr. Costas Bekas, Staar’s manager and the manager of Foundations of Cognitive Computing at IBM Research – Zurich. “We were able to optimise the kernel because we had both hardware and software experts tuning both pieces together. The results are truly extraordinary.”

Stochastic Matrix-Function Estimators: Scalable Big-Data Kernels with High Performance
Peter Staar, Panagiotis Barkoutsos, Roxana Istrate, A. Cristiano I. Malossi, Ivano Tavernelli, Nikolaj Moll, Heiner Giefers, Christoph Hagleitner, Costas Bekas and Alessandro Curioni (IBM Research, Switzerland)