
Open standards for deep learning to simplify development of neural networks


Among the various fields of exploration in artificial intelligence, deep learning is an exciting and increasingly important area of research that holds great potential for helping computers understand and extract meaning from data such as images and sounds.

To help further the creation and adoption of interoperable deep learning models, IBM joined the Open Neural Network Exchange (ONNX), a new industry ecosystem established by Facebook and Microsoft in September 2017. ONNX provides a common open format for representing deep learning models. The ONNX initiative aims to let data scientists move deep learning models seamlessly between open-source frameworks, accelerating development.

ONNX can free developers from the burden of committing to a specific deep learning framework during the research and development phase. It gives engineers and researchers the freedom to explore a variety of possibilities, move more easily between different deep learning frameworks and computational back ends, and choose the option whose features best suit their project.

We have already begun exploring the possibilities that ONNX can provide to developers and quickly recognized that our work on a common tensor output format could have a broader and more rapid impact if incorporated into ONNX. Our deep learning research team is already exploring other ways that ONNX can help data scientists bring their deep learning models to market.

IBM has long supported and actively encouraged the adoption of open standards and collaborative innovation. ONNX will help encourage interoperability and foster an ecosystem of deep learning systems that spurs AI innovation and accelerates the development and use of neural networks in a wide range of research projects, products and solutions.

More AI stories

IBM RXN for Chemistry: Unveiling the grammar of the organic chemistry language

In our paper “Extraction of organic chemistry grammar from unsupervised learning of chemical reactions,” published in the peer-reviewed journal Science Advances, we extract the “grammar” of organic chemistry’s “language” from a large number of organic chemistry reactions. To do so, we used RXNMapper, a cutting-edge, open-source atom-mapping tool we developed.


From HPC Consortium’s success to National Strategic Computing Reserve

Founded in March 2020, just as the pandemic’s wave was starting to wash over the world, the Consortium has brought together 43 members with supercomputing resources: private and public enterprises, academia, government and technology companies, many of them typically rivals. “It is simply unprecedented,” said Dario Gil, Senior Vice President and Director of IBM Research, one of the founding organizations. “The outcomes we’ve achieved, the lessons we’ve learned, and the next steps we have to pursue are all the result of the collective efforts of this Consortium’s community.” The next step? Creating the National Strategic Computing Reserve to help the world be better prepared for future global emergencies.


Simplifying data: IBM’s AutoAI automates time series forecasting

In our recent paper “AutoAI-TS: AutoAI for Time Series Forecasting,” which we’ll present at ACM SIGMOD 2021, we describe how AutoAI Time Series for Watson Studio incorporates the best-performing models from all possible classes, since there is often no single technique that performs best across all datasets.
