Immersive analytics: the reality of IoT and digital twin
In the Internet of Things (IoT), I’ve come across the phrase ‘immersive analytics’ used actively by analytics vendors to describe virtual reality (VR), augmented reality (AR) and other new display technologies that support analytical reasoning over sensor data. While these vendors hope to add an interactive view, they don’t understand the elements of an asset, its lifespan, or its relationships within a system of systems. Immersive analytics vendors can fall short of what’s required to fully, and digitally, represent ‘things’; it takes far more than VR glasses or large flat-screen displays. Businesses and their engineers see the digital twin as the complete bridge between the physical and digital worlds.
The digital twin is the virtual representation of a physical object or system across its life-cycle (design, build, operate), using real-time operational data and other sources to enable understanding, learning, reasoning, and dynamic re-calibration for improved decision making. The physical object can be anything from a building to a ball bearing, while the system may be electrical, mechanical or software-based and, more importantly, a combination of all of these interoperating (i.e. a system of systems).
Life cycle, not goggles
The common approach of immersive analytics is to present a view of a product in operation, missing any comparison with the product as-designed and as-built, along with the collaboration and communication between product stakeholders. Engineers are involved throughout the life-cycle (design, build, operate), and each identifies a component differently. A digital twin therefore requires a platform that aligns the definitions of the virtual object and brings operational/physical data into the twin, allowing users to understand, for example, how a product is performing compared with its as-designed intent.
While an immersive analytics vendor might provide a virtual image of an asset in use, along with some prediction of remaining life, it’s not ‘inclusive analytics.’ These tools cannot connect back to product simulations, design models, and the lifespan predicted in the design and test phases. What if the immersive analytics show that a newly installed component will fail in three years, when the design engineer’s intent was a lifespan of five years? Does this failure stem from work done by the designer, the builder, or the service technician? Immersive analytics cannot tell you that. It can only predict the need to replace that component again and again.
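The design-intent comparison above can be sketched in code. This is a minimal, hypothetical illustration (all names such as `Component` and `design_lifespan_years` are made up, not from any real digital twin platform) of how a twin might reconcile an operational remaining-life prediction with the lifespan captured in the design phase and route the gap back to stakeholders:

```python
# Hypothetical sketch: reconciling an in-operation lifespan prediction
# with the as-designed intent recorded earlier in the life-cycle.
# All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Component:
    part_id: str
    design_lifespan_years: float     # intent captured in the design phase
    predicted_lifespan_years: float  # estimate from operational analytics

    def lifespan_gap(self) -> float:
        """Shortfall between as-designed intent and observed prediction."""
        return self.design_lifespan_years - self.predicted_lifespan_years


def flag_for_review(component: Component, tolerance_years: float = 0.5):
    """Route a significant gap back to design/build/service stakeholders."""
    gap = component.lifespan_gap()
    if gap > tolerance_years:
        return (f"{component.part_id}: predicted life falls {gap:.1f} years "
                f"short of design intent; review design, build, and service history")
    return None


# A part designed for five years but predicted to fail after three:
print(flag_for_review(Component("bearing-042", 5.0, 3.0)))
```

The point is not the arithmetic but the connection: the comparison is only possible because the twin holds both the as-designed record and the operational prediction in one model.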
Analytics near, not far
The approach of immersive analytics vendors often involves collecting and sending all relevant data to render a virtual image of a physical asset. If you’re not in IT, you might not recognize the flood of structured and unstructured data this creates. You’re likely sending back all the data from faraway devices when you only need data sent at certain intervals, or once a certain tolerance has been exceeded.
Just one connected car can send 25GB of data back to the cloud every hour. What you need is analytics-at-the-edge: running the data through an analytics algorithm as it’s created, at the edge of the corporate network. Companies can set parameters on what information is worth sending to a cloud or on-premises data store for later use, and what isn’t. Using analytics and/or cognition to recognize and deliver only relevant data from devices, before it jams the network, is not an IoT capability found in most immersive analytics offerings.
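To make the edge-filtering idea concrete, here is a minimal sketch, under assumed names and thresholds, of the two parameters described above: forward a reading only when it drifts beyond a tolerance from the last transmitted value, or when a heartbeat interval has elapsed:

```python
# Minimal analytics-at-the-edge sketch (illustrative names and values):
# suppress readings that carry no new information instead of streaming
# everything to the cloud.

class EdgeFilter:
    def __init__(self, tolerance: float, heartbeat_s: float):
        self.tolerance = tolerance      # e.g. degrees C of allowed drift
        self.heartbeat_s = heartbeat_s  # max seconds between transmissions
        self.last_sent_value = None
        self.last_sent_time = 0.0

    def should_send(self, value: float, now: float) -> bool:
        if self.last_sent_value is None:
            send = True   # always forward the first reading
        elif abs(value - self.last_sent_value) > self.tolerance:
            send = True   # tolerance exceeded: a meaningful change
        elif now - self.last_sent_time >= self.heartbeat_s:
            send = True   # periodic heartbeat so the twin stays fresh
        else:
            send = False  # suppress: not worth the bandwidth
        if send:
            self.last_sent_value, self.last_sent_time = value, now
        return send


# Example: a temperature stream where only two readings justify network use.
f = EdgeFilter(tolerance=2.0, heartbeat_s=60.0)
readings = [(0, 70.0), (5, 70.5), (10, 73.5), (15, 73.9)]
sent = [v for t, v in readings if f.should_send(v, t)]
print(sent)  # [70.0, 73.5]
```

In practice the “algorithm at the edge” can be far richer (trend detection, cognition), but even this simple rule shows how most of a device’s chatter never needs to leave the edge.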
More analytic options
The approach of immersive analytics is simplistic visuals through VR. Users see only limited modalities of a device, such as its temperature, time in operation, or location. As any engineer will tell you, far more is required to ‘see’ the quality and performance of an asset through the test, build and operate stages. Beyond devices that collect temperature or time, a digital twin identifies what a good part should look and sound like.
IBM Watson IoT launched Cognitive Visual Inspection last year to let firms connect HD cameras, see details of objects, and predict quality. The solution uses IBM Watson to review and analyze parts, components and products, identifying defects by matching patterns to images of defects it has previously encountered, analyzed and classified. If you want to determine the performance of a faraway asset in operation, you can also detect quality by sound (i.e. vibration or acoustic signatures).
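Detecting quality by sound can be illustrated with a deliberately simple example. This is a generic sketch, not the Watson product: it compares the RMS energy of a vibration sample against a baseline taken from a known-good part, with all signals and margins invented for illustration:

```python
# Generic acoustic/vibration quality check (illustrative, not a real
# product): a part whose vibration energy exceeds the good-part envelope
# is flagged for inspection.

import math


def rms(signal):
    """Root-mean-square amplitude of a sampled vibration signal."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))


def inspect_by_sound(signal, good_rms: float, margin: float = 1.5):
    """Pass a part whose vibration energy stays within the good-part envelope."""
    return "pass" if rms(signal) <= good_rms * margin else "fail"


good_part = [0.1, -0.1, 0.12, -0.09, 0.11, -0.1]   # quiet, regular signature
worn_part = [0.4, -0.5, 0.45, -0.38, 0.5, -0.42]   # louder, noisier signature
baseline = rms(good_part)

print(inspect_by_sound(good_part, baseline))  # pass
print(inspect_by_sound(worn_part, baseline))  # fail
```

Real acoustic inspection would use frequency-domain features and learned models rather than a single RMS threshold, but the principle is the same: the twin knows what a good part sounds like.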
Hanging on the digital thread
The approach of immersive analytics vendors ignores the requirement of building a digital thread, and therefore creates another silo of users (easily identified as those wearing goggles). A digital thread is the framework that connects engineering teams and their data, providing an integrated view of an asset throughout its life-cycle; it allows multiple views to be created for different stakeholders. Without it, how are designers or factory engineers supposed to learn of product use, component failures, or the overall user experience? Immersive analytics has limited benefit if there are no connections through a single source of component definitions, product relationships, sensors, and suppliers.
Digital twin reality
There is no question that VR is useful technology when applied correctly. IBM has relationships with VR vendors such as Daqri, which enable companies to see deeply into machines, systems and spaces. Visually and remotely understanding a component’s status is useful, but when disconnected from the entire product life-cycle, immersive analytics leaves engineering staff no better off. Digital twin solutions offer more value because:
- People across the organization have the exact digital twin view they need of a product at every stage in its life cycle;
- Each stakeholder group is contributing its set of data and applications, allowing teams to talk to each other;
- Analytics are applied to data streams at every stage of the product life-cycle.
To learn more, visit ibm.co/digitaltwin