A variety of plankton seen through IBM’s autonomous microscope.
Imagine being able to crowdsource environmental protection: a world where we could watch how plankton move, the tiny creatures that supply two-thirds of the oxygen we breathe. Working together, we have invented a microscopic reality system that lets scientists see life at a scale where a human hair is as big as a tree, and that can be combined with AI to detect plankton behavior and predict the health of the environment.
We are excited to unveil a new ecosystem of smart microscopes, including the microscopic reality system, at the IBM Think conference as part of IBM Research’s “5 in 5” technology predictions. We believe the invention’s future is a small autonomous microscope that could be placed in bodies of water to create 3D models of plankton and track their behavior in their natural environment. This could help in situations such as oil spills and in predicting threats such as red tides. We envision a network of these devices monitoring the health of the environment, using plankton as the sensor. By embedding AI in the microscopes, the plankton’s shape, size and behavior can be analyzed to determine their health. Collecting and sharing this information in the cloud can give us tremendous insight into the health and operation of the ocean’s complex ecosystem.
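To make the idea of using plankton behavior as a health signal concrete, here is a minimal sketch of our own, not the system's actual analysis code: it assumes hypothetical per-plankter tracks of body length and swimming speed, and flags individuals whose mean speed falls far below the population's, since a sustained slowdown is one of the behavioral changes such a system might watch for. A real pipeline would use much richer shape and motion features.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Track:
    # Simplified plankton track (hypothetical data model): body lengths
    # in micrometers and swimming speeds in micrometers/second,
    # sampled over time by the microscope's tracker.
    lengths_um: list
    speeds_um_s: list

def health_flags(tracks, speed_drop=0.5):
    """Flag tracks whose mean speed is below speed_drop times the
    population's mean speed -- a toy stand-in for behavioral
    anomaly detection."""
    pop_speed = mean(mean(t.speeds_um_s) for t in tracks)
    return [mean(t.speeds_um_s) < speed_drop * pop_speed for t in tracks]
```

For example, two tracks swimming near 100 µm/s alongside one near 30 µm/s would flag only the slow individual.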
Simone Bianco in the lab with the autonomous AI microscope designed to continually monitor in real time the health of our water (Tony Avelar/Feature Photo Service for IBM)
Simone: We began working together through my work in cellular engineering. One of my outstanding research problems was to track the movement of plankton in 3D fast enough that I could mathematically model changes in their shape and behavior. With these models I could use plankton as chemical and environmental sensors, like a canary in a coal mine. The problem is that with conventional microscopes, moving plankton go in and out of focus. Enter Tom! A couple of years ago, he gave a talk on a 3D lensless microscope he invented. I was in the audience and realized his microscope could solve my big problem. Tom’s microscope has no lens, so no focusing is required. It relies on an imager chip, like the one in cell phone cameras, to capture the shadows of plankton as they swim above it. This is the imaging system used in the microscopic reality system.
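The shadow-imaging idea can be illustrated with a toy geometric model (our own simplification, not the device's actual reconstruction algorithm): under a single point light source, an object floating higher above the chip casts a proportionally larger shadow, so shadow size encodes depth and no focusing is needed. Diffraction, which a real lensless microscope must account for, is ignored here, and all names and parameters are hypothetical.

```python
def shadow_diameter(d_obj, z_obj, h_src):
    """Geometric shadow cast on the imager chip by an opaque object of
    diameter d_obj floating z_obj above the chip, lit by a point
    source h_src above the chip. Magnification = h_src / (h_src - z_obj)."""
    return d_obj * h_src / (h_src - z_obj)

def depth_from_shadow(d_obj, d_shadow, h_src):
    """Invert the projection: recover the object's height above the chip
    from its measured shadow diameter."""
    return h_src * (1.0 - d_obj / d_shadow)
```

A 100 µm plankter 500 µm above the chip, lit from 10 mm up, casts a roughly 105 µm shadow, and inverting that measurement recovers the 500 µm depth.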
Tom Zimmerman (Tony Avelar/Feature Photo Service for IBM)
Tom: Partnering with Simone let me bring together my interests in the environment, image processing, inventing and getting kids excited about science and engineering. Working with my mentor Barton Smith, I initially invented the 3D lensless microscope to look at the tiny creatures in my fish tank. I quickly became more interested in the plankton than my fish. Before I met Simone, I had no idea that half a billion years ago plankton tripled the oxygen in the air, which led to an explosion of life, and eventually to us. Now that we can explore the microscopic world in high definition, I want to share with young people the amazing world of “life in a drop of water”, and teach them how vital these little creatures are to our life and our future.
Engineers and scientists have so much to learn from nature. Biology is a profound motivator and teacher. The brain shows us what’s possible with AI. Cells show us what’s possible in chemical synthesis, sensing, data storage and learning. A cell’s shape and behavior indicate its health and the health of its environment. Our challenge is to learn how to interpret and understand the signals, chemistry and structure nature uses to control, modify and create life.
Our vision for microscopic reality is to detect, in real time, the interactions and subtle changes in shape and behavior of the many organisms that populate our environment. By combining microscopes with VR, AR and AI, we hope to gain a better understanding of how the ecosystem operates, to better manage the health of the environment.