The International Semantic Web Conference (ISWC) 2020, the premier international forum for the Semantic Web and Linked Data Community, is being held November 1 - 6, 2020. IBM Research AI is proud to participate in this conference as a platinum sponsor.
In our recent ICASSP 2020 paper, we successfully shortened the training time on the 2000-hour Switchboard dataset, one of the largest public ASR benchmarks, from over a week to less than two hours on a 128-GPU IBM high-performance computing cluster. To the best of our knowledge, this is the fastest training time recorded on this dataset.
To assist in the fight against the COVID-19 pandemic, prominent research institutes led by the Allen Institute for AI (AI2) released the COVID-19 Open Research Dataset (CORD-19) earlier this year. Composed of scientific articles related to COVID-19, SARS-CoV-2, and related coronaviruses, the dataset (which at the time of writing contains more than 75,000 full-text scientific papers) is […]
The 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), the premier annual conference on AI and language, takes place July 5-10. As is the case with most events currently, ACL will be virtual this year due to COVID-19. At IBM Research AI, we’re excited to share with you — wherever you might be in the world — all the work […]
The field of Natural Language Processing (NLP) has made great strides over the last decade. In fact, NLP is so common in today’s AI applications that whether consumers are communicating with a virtual assistant, asking for travel directions or searching for weather predictions, chances are they’re interacting with some form of NLP. This technology, however, […]
IBM Research AI is leading the push to develop new tools that enable AI to process and understand natural language. Our goal: empower enterprises to deploy and scale sophisticated AI systems that leverage natural language processing (NLP) with greater accuracy and efficiency, while requiring less data and human supervision.
Convex optimization problems, which involve the minimization of a convex function over a convex set, can be approximated in theory to any fixed precision in polynomial time. However, practical algorithms are known only for special cases. An important question is whether it is possible to develop algorithms for a broader subset of convex optimization problems that are efficient in both theory and practice.
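To make the definition concrete, here is a minimal sketch (not from the post) of one classic practical approach for a special case: projected gradient descent, which minimizes a convex function over a convex set by alternating gradient steps with projection back onto the set. The function, interval, and step size below are illustrative choices.

```python
def projected_gradient_descent(grad, project, x0, lr=0.1, steps=1000):
    """Minimize a convex function over a convex set.

    grad:    gradient of the convex objective
    project: projection onto the convex feasible set
    """
    x = x0
    for _ in range(steps):
        # Take a gradient step, then project back into the feasible set.
        x = project(x - lr * grad(x))
    return x

# Example: minimize f(x) = (x - 3)^2 over the convex set [0, 2].
# The unconstrained minimum (x = 3) lies outside the set, so the
# constrained minimizer is the boundary point x = 2.
x_star = projected_gradient_descent(
    grad=lambda x: 2.0 * (x - 3.0),
    project=lambda x: min(max(x, 0.0), 2.0),
    x0=0.0,
)
print(x_star)  # converges to 2.0
```

For simple sets such as intervals or balls the projection is cheap, which is exactly why such special cases admit efficient practical algorithms while the general problem remains hard in practice.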
At ACL 2019, IBM researchers released a paper detailing the model they trained to answer complex questions using Neural Program Induction, which allows an AI model to be taught to procedurally decompose a complex task into a program.
IBM Fellow Salim Roukos provides some specifics on IBM Research’s enterprise NLP work by highlighting four papers IBM Research AI is presenting at the ACL 2019 conference.
Language-independent entity linking makes sense of any text on the web. Let’s take a hypothetical sports headline: “Philadelphia Beats New York in 34-26 Nail Biter!” For applications like search or question answering, we’d like today’s computers to analyze this piece of text (based on the context, of course) and make sense out of the “entities” […]
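The headline above can illustrate the core idea of context-based entity disambiguation. The toy sketch below (my own illustration, not IBM’s model: the candidate lists, cue words, and scoring are all invented for the example) picks the candidate entity whose context cues best overlap the surrounding words, so “Philadelphia” in a sports headline resolves to the football team rather than the city.

```python
# Candidate entities each ambiguous mention could refer to (illustrative).
CANDIDATES = {
    "philadelphia": ["Philadelphia (city)", "Philadelphia Eagles"],
    "new york": ["New York City", "New York Giants"],
}

# Context words that hint at each candidate (illustrative cue sets).
CUES = {
    "Philadelphia (city)": {"mayor", "museum", "downtown"},
    "Philadelphia Eagles": {"beats", "nail", "biter", "touchdown"},
    "New York City": {"subway", "broadway", "borough"},
    "New York Giants": {"beats", "nail", "biter", "quarterback"},
}

def link_entity(mention, context_tokens):
    """Return the candidate whose cue words best match the context."""
    best, best_score = None, -1
    for cand in CANDIDATES[mention.lower()]:
        score = len(CUES[cand] & context_tokens)
        if score > best_score:
            best, best_score = cand, score
    return best

headline = "philadelphia beats new york in 34-26 nail biter"
tokens = set(headline.split())
print(link_entity("Philadelphia", tokens))  # Philadelphia Eagles
print(link_entity("New York", tokens))      # New York Giants
```

Real entity linkers replace the hand-written cue sets with learned representations of entities and context, but the disambiguation-by-context structure is the same.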
Koichi Kamijo, a scientist at IBM’s research lab in Tokyo, was only 20 questions into a 300-question life insurance form. And he was already exhausted. There had to be an easier way to answer these questions. So, he got together with his colleagues, Ryo Kawahara, Takayuki Osogami, Masaki Ono and Shunichi Amano to come up […]