August 11, 2017 | Written by: Cole Stryker
Categorized: New Thinking
Companies can learn a lot about you by examining how you interact with a simple banner ad. They know where you came from and how long you’ve spent scrolling. With a pre-roll video spot, they can determine more. Now that brands are dipping their toes into VR experiences, they are poised to unleash rivers of behavioral data.
Imagine you are in a VR experience and presented with an array of microwavable dinners. One of these, a chicken carbonara, looks especially tasty. You lock your gaze onto the meal. A motion sensor affixed to your headset measures the movement of your head, if not your retina. This technology creates heat maps indicating the items or areas within an experience that users look at most. Analysts can use this data to tweak the experience and make it more enjoyable for the user, or to reposition, say, product placements so that they’re easily viewed within the experience.
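The heat-map idea above can be sketched in a few lines: bucket gaze hit points into grid cells and count how often each cell is looked at. The sample format and cell size here are illustrative assumptions, not any vendor's actual schema.

```python
# Hypothetical sketch: aggregating gaze hit points into a heat map.
# The (x, y) sample format and 0.5-unit cell size are assumptions.
from collections import Counter

def gaze_heatmap(samples, cell=0.5):
    """Bucket (x, y) gaze hit points into grid cells and count fixations."""
    counts = Counter()
    for x, y in samples:
        counts[(int(x // cell), int(y // cell))] += 1
    return counts

# Three samples clustered near one spot, one stray glance elsewhere.
samples = [(1.2, 0.40), (1.3, 0.45), (3.0, 2.0), (1.15, 0.42)]
heat = gaze_heatmap(samples)
hottest = max(heat, key=heat.get)  # the cell users looked at most
```

An analyst would then overlay the hottest cells on the scene to see which items (the chicken carbonara, say) draw the most attention.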
Or maybe you’re viewing a test screening of an upcoming virtual ad campaign. An analyst might watch sensors that monitor subtle muscle movements to determine when you get bored and lose interest, or whether specific imagery makes you laugh, excites you, or makes you anxious. Imagine a political campaign testing speeches on an audience wearing sensor-fitted headsets that feed data back to researchers, showing how specific messaging performed.
We spoke with Rob Merki of CognitiveVR about how they are approaching analytics in the virtual, augmented, and mixed reality spaces. CognitiveVR started with mobile analytics technology and added a layer of VR-specific metrics. Although that approach won them a few happy customers, they eventually decided to shift away from 360° video into the more immersive 3D world of VR.
Rob’s team built a product called Scene Explorer, which compiles all the user data drawn from a VR experience. These metrics, for now, are built on three components of user data: positional tracking (where the user moves around in the virtual space), gaze tracking (where the user looks, typically determined by the direction of the headset), and controller engagement (button presses, or other tactile manipulation of a controller). Their platform also supports third-party data from biometric sensors. Theoretically such sensors could provide user data on body temperature, heart rate, and other clues that give analysts an idea of the user’s emotional state.
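The three metric streams described above (position, gaze, and controller input), plus the optional biometric layer, can be pictured as a single per-frame event record. The field names and types below are illustrative assumptions, not Scene Explorer's actual schema.

```python
# Illustrative event record for the data streams the article describes.
# Field names are assumptions, not Scene Explorer's actual format.
from dataclasses import dataclass, field

@dataclass
class VRSample:
    t: float                          # seconds since session start
    position: tuple                   # (x, y, z) of the headset in the scene
    gaze_dir: tuple                   # forward vector of the headset
    buttons: frozenset = frozenset()  # controller buttons held this frame
    biometrics: dict = field(default_factory=dict)  # optional third-party data

# One frame of a session: user standing at origin, looking straight ahead,
# squeezing the trigger, with a heart-rate sensor attached.
s = VRSample(t=0.016, position=(0, 1.6, 0), gaze_dir=(0, 0, 1),
             buttons=frozenset({"trigger"}), biometrics={"heart_rate": 72})
```

A session then becomes a time-ordered list of such records, which is what downstream analysis (heat maps, interaction funnels, emotional inference) would consume.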
VR experiences are much more complex than even 360° video. Web and mobile environments restrict users to certain inputs. In a browser with a mouse, for example, your only choices are to click on an ad or not. “Because you are forced into that abstraction,” says Rob, “you’re forced into behaving a certain way, which allows the analytics to make very obvious deductions based on your activity.” By removing that binary abstraction, VR designers have introduced a difficult challenge for analytics.
But emotional analysis is a more difficult nut to crack.
“There are a lot of people who assume you can just start directly tracking emotion right away with our current technology. I think that’s frankly bull****. So if I’m watching VR, there’s no way to tell I’m happy unless I tell you…Everybody’s different, so me acting a certain way is different than someone else acting a certain way. I have a different body.”
Rob describes this as a misconception: that there is a one-size-fits-all way to track people, and that once we crack the code of what happiness looks like in VR, we can apply that formula in a blanket way and the machine will spit out a tidy list of who was happy and who was sad during an experience. Understanding how people react emotionally is an exponentially difficult challenge, especially while our primary sources of input data are position, rotation, and controller interactions. He expects us to get there within the next five years, but the sheer variability in how different people react to the same circumstances presents a big hurdle for the medium’s analysts.
The experience falls apart when a VR system misjudges a user’s emotional state. A classic use case for emotional feedback entails a system determining that a user is bored and then hitting them with a jolt within the experience: a snake, say, or a monster. But this concept ignores the variability in user behaviors and profiles. Some people are fitter than others and have lower resting heart rates. Some might be deathly afraid of a VR experience, yet their bodies won’t register that fear in biometric data as readily as other users’ bodies do. Getting this wrong breaks the immersion, or even the narrative clarity, of an experience, and can leave users disappointed or confused. Inevitably, though, biometric analysis is becoming more sophisticated and data sets are growing. It may take a huge company with the capability to collect user data at scale (a Google, a Facebook, or an Amazon) to take this analysis to the next level.
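The variability problem above is why a blanket threshold fails: the same absolute heart rate can be "calm" for one user and "aroused" for another. A minimal sketch, assuming each user has a recorded resting baseline, is to score readings relative to that user's own baseline rather than a global cutoff. The numbers and model here are illustrative, not validated.

```python
# A minimal sketch of per-user baseline normalization for a biometric signal.
# Resting samples and the z-score model are illustrative assumptions.
from statistics import mean, stdev

def arousal_z(resting_bpm_samples, current_bpm):
    """Score current heart rate relative to this user's own resting baseline."""
    mu = mean(resting_bpm_samples)
    sigma = stdev(resting_bpm_samples)
    return (current_bpm - mu) / sigma

# The same 90 bpm reading means very different things for an athlete with a
# low resting rate and a sedentary user whose resting rate is already near 80.
athlete = arousal_z([52, 54, 55, 53, 56], 90)
sedentary = arousal_z([78, 82, 80, 79, 81], 90)
```

A blanket rule like "over 85 bpm means scared" would treat both users identically; the per-user score separates them, which is the kind of individual calibration Rob argues current systems still lack.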
Rob suggests that one next phase will be retinal tracking. For now, most VR headsets track only head movement, not eye movement. But retinal tracking is an inevitable next step as users demand more immersive experiences, and the technology already exists: HTC offers it as an add-on device for its Vive headsets for only $220.
Brands and advertisers are looking to create new VR experiences that are essentially branded content, and they’re also figuring out ways to insert their brands into existing VR experiences. Google has envisioned an “ad cube,” which displays videos when a user engages with a floating cube within a VR experience. I asked Rob what challenges we can expect on that front. He suggested that we need to think bigger than product placement, and told me about a coffee shop he once visited in VR, built in Unity (one of the most popular engines for creating VR experiences).
“You’re in the experience, and you don’t just see a [drink] on the countertop. You actually go inside another world, like a Starbucks coffee shop… You’re in this nice coffee shop and there’s a nice latte on the counter…It entices you into the environment where you’re gonna be spending money, as opposed to just showing you a logo.”
VR is still very much in a wild-west phase, so it’s difficult to cast too far into the future. VR developers need to make money to keep creating their virtual worlds. Advertisers are already responding with a variety of solutions, and analytics platforms like CognitiveVR’s Scene Explorer will continue to evolve to match the medium’s expanding sophistication.