Millimeter-wave radar imaging experimental set-up
I have been an electronics enthusiast ever since elementary school. Putting together electronic devices that interact with the physical world has been my passion, and I still remember the excitement I felt when I built my first circuit in 6th grade – even though it was simply something that periodically turned an LED on and off.
After earning an undergraduate degree in electronic systems engineering in my home country of Mexico, I came to the U.S. for a PhD in electrical engineering, then joined IBM in 2006 to work on silicon-integrated millimeter wave circuits and systems. I had the honor of joining a team of IBM scientists who pioneered the first monolithic millimeter wave radio, which exploited portions of the radio spectrum to boost wireless communications. Since then I have been researching how to engineer ever more complex millimeter wave systems. At IBM Research I have also had the opportunity to collaborate on multi-disciplinary endeavors, such as developing the first graphene-based RF circuits and participating in wireless standardization committees and university-industry research programs.
Millimeter waves are part of the electromagnetic spectrum, similar to light and X-rays, but with much longer wavelengths. They are a band of radio spectrum between ~30 GHz and 300 GHz that can be used for high-speed wireless communications and could become part of 5G. One aspect of our research explores how we can use compact millimeter wave systems for imaging.
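The band's name follows directly from the free-space wavelength, λ = c/f: across 30–300 GHz, wavelengths run from about 10 mm down to 1 mm. A minimal Python sketch (the 94 GHz point is just an illustrative frequency within the band):

```python
# Free-space wavelength of a radio wave: lambda = c / f.
C = 299_792_458  # speed of light, m/s

def wavelength_mm(freq_ghz):
    """Return the free-space wavelength in millimeters for a frequency in GHz."""
    return C / (freq_ghz * 1e9) * 1e3

for f in (30, 94, 300):
    print(f"{f} GHz -> {wavelength_mm(f):.2f} mm")
# 30 GHz is ~10 mm and 300 GHz is ~1 mm -- hence "millimeter wave".
```

These short wavelengths are what make compact antennas and antenna arrays practical on a single chip.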
We are on a quest to build a new type of imaging technology that uses separate portions of the electromagnetic spectrum to make the invisible visible. More than 99.9 percent of the electromagnetic environment we live in cannot be observed by the naked eye. Over the last 100 years, scientists have built instruments that can emit and sense energy at different wavelengths, and today we rely on some of them to take medical images of our bodies, spot a cavity inside a tooth, check our bags at the airport, or land a plane in fog. All these tools illuminate objects, even through opaque environmental conditions, using different frequencies of the electromagnetic spectrum, such as radio waves, microwaves, millimeter waves, infrared and X-rays, and return the reflected energy to us in the form of an identifiable image. However, these instruments are incredibly specialized and expensive, and each sees across only a specific portion of the electromagnetic spectrum.
We are building a portable hyperimaging platform that “sees” across separate portions of the electromagnetic spectrum in one platform to potentially enable a host of practical and affordable devices and applications that are part of our everyday experiences. We anticipate that the ability to leverage information from two or more separate portions of the spectrum will tell us a lot more about objects in the world around us.
This effort is one initiative of the IBM Research Frontiers Institute, a consortium built on open and collaborative research in which member companies from diverse industries will leverage IBM’s research talent and cutting-edge infrastructure to spur world-changing innovations with global impact.
What is our prediction?
In five years, emerging portable imaging devices will help us see beyond the domain of visible light to reveal valuable insights or potential dangers that would otherwise be unknown or hidden from view.
Why will this change the world?
Our ability to “see” beyond visible light will reveal new insights that help us understand the world around us. This technology will be widely available throughout our daily lives, giving us the ability to perceive or see through objects and opaque environmental conditions anytime, anywhere.
A view of the invisible or vaguely visible physical phenomena all around us could help make road and traffic conditions clearer for drivers and self-driving cars. For example, using millimeter wave imaging, a camera and other electromagnetic sensors, hyperimaging technology could help a car see through fog or rain, detect hazardous and hard-to-see road conditions such as black ice, or tell us if there is some object up ahead, as well as its distance and size. Cognitive computing technologies will reason about this data and recognize what might be a tipped-over garbage can versus a deer crossing the road, or a pothole that could result in a flat tire.
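To give a sense of the distance measurement mentioned above: a radar estimates range from the round-trip delay of the echo, R = cτ/2 (the factor of two accounts for the signal traveling out and back). A back-of-the-envelope sketch, with an illustrative delay value rather than a figure from any real system:

```python
# Radar range from round-trip echo delay: R = c * tau / 2.
C = 299_792_458  # speed of light, m/s

def range_m(round_trip_s):
    """Return the distance in meters to a reflector whose echo
    arrives round_trip_s seconds after transmission."""
    return C * round_trip_s / 2

# An echo returning after 400 ns corresponds to an object ~60 m ahead.
print(f"{range_m(400e-9):.1f} m")
```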
What are the underlying technologies?
A blend of high-performance, compact sensors packed into a single platform will capture different non-visible properties of an object. One of the sensors uses a silicon chip and an array of antennas to form and electronically steer a beam of millimeter wave energy to precisely capture the object’s distance, location and reflectivity. When combined with machine intelligence technologies, these new imaging devices will recognize and reason about invisible objects, properties or situations to inform us and warn us about what may be hiding out of sight.
Read all of IBM’s 2016 technology predictions at IBM 5 in 5.