Cars that care, chapter 3: The remedy – understanding drivers


While we anticipate autonomous cars taking over on-road decision making, some technical and legal challenges may be more difficult to solve than expected. Self-driving cars will be available in about a decade, but people will continue driving for many years to come. Drivers remain a critical component of road safety.

Increasingly, the overall experience in vehicles will be defined by the car’s digital capabilities. Understanding the driver and the vehicle’s occupants is a critical ingredient in a differentiating personalization strategy. Active safety measures help, but they are limited because they rely mainly on vehicle technology that operates independently of who the driver is. We expect that drivers will opt in to these technologies as they are shown to heighten the experience and improve safety. How cars interact with and respond to their users becomes a selection criterion when people choose which vehicle to buy.

Driver Behavior Triggers Safety Alerts

We can start by considering safety where driver behavior acts as the initial trigger. Many people get in their cars and drive off every day without a second thought. Often, they have other things on their mind. Paying attention to the road and surrounding traffic isn’t always where their head is at. They may be excited, upset, angry or anywhere in between. Drivers are 10 times more likely to get into an accident when emotional.

IoT systems in vehicles compare real-time driving to historical norms and flag behavior that is erratic. From there, we need to diagnose what is going on with the driver and provide relevant remedies to keep everyone safe. Information about the driver can be obtained from several sources: the vehicle’s sensors, other IoT sensors the car interacts with, the drivers themselves and other contextual information brought in through the car’s connection to the cloud.
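The comparison of real-time driving to a driver’s historical norms can be sketched as a simple deviation check. This is an illustrative toy, not a production system: the metric (normalized braking force), the baseline values and the three-sigma threshold are all assumptions for the example.

```python
from statistics import mean, stdev

def flag_erratic(history, current, threshold=3.0):
    """Flag a reading as erratic when it deviates from the driver's
    historical norm by more than `threshold` standard deviations.
    Illustrative sketch only; real systems use far richer models."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return False  # no variation recorded yet; nothing to compare against
    return abs(current - mu) / sigma > threshold

# Hypothetical braking-force readings (normalized 0-1) from past trips
baseline = [0.20, 0.22, 0.18, 0.21, 0.19, 0.23, 0.20]
print(flag_erratic(baseline, 0.21))  # typical braking -> False
print(flag_erratic(baseline, 0.85))  # hard braking    -> True
```

In practice a vehicle would track many such signals at once (speed, steering, braking, following distance) and fuse them before raising a flag.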

Understanding Drivers

Beyond directly monitoring driver behavior, sensors in vehicles can also help understand the driver’s frame of mind. For example, some automakers are investing in personal health monitoring systems in vehicles that can track electrocardiography through the seat sensors or heart rate through the steering wheel among other biometrics. Cars can also track secondary indicators such as the force of the car door being closed, the volume of the radio and how many others are in the car with the driver as clues to what may be going on.

Vehicles are increasingly tapping into other information sources as they connect to the Internet of Things. This includes information from other cars, from the surrounding traffic infrastructure and from wearables the driver may bring into the car. Wearables can monitor drivers more directly than in-vehicle sensors, which could miss red flags if a driver is wearing gloves or a heavy coat in the winter. Monitoring how heart rate or other vital signs change while someone is driving can indicate a shift in emotional state.
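Detecting a change in vital signs while driving can be as simple as comparing recent readings against the ones just before them. A minimal sketch, assuming a stream of heart-rate samples in beats per minute; the window size and the 15 bpm rise are hypothetical rules of thumb, not clinical thresholds.

```python
def sustained_rise(samples, window=5, rise_bpm=15):
    """Return True if the average of the most recent `window` heart-rate
    samples exceeds the average of the preceding `window` samples by
    more than `rise_bpm`. Hypothetical threshold for illustration."""
    if len(samples) < 2 * window:
        return False  # not enough history to compare
    recent = samples[-window:]
    prior = samples[-2 * window:-window]
    return (sum(recent) / window) - (sum(prior) / window) > rise_bpm
```

A car that sees `sustained_rise` fire alongside erratic driving has a stronger clue that emotions, rather than road conditions, are at play.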

The best information will be taken directly from the driver while in the car. Cameras on the driver can read facial expressions and interpret hand gestures through image and video recognition. Microphones inside the car can monitor the content, tone and volume of voices. Video, image and audio recognition go a long way to understanding the driver’s frame of mind when a car is being driven erratically.


Finally, contextual information can both narrow down the specific emotional cause and provide clues to appropriate remedies to help keep the car’s occupants safe. Contextual data can be driver specific, such as who the driver is calling, their social media profiles or the media content that may be playing, or it can be general location information, such as weather, traffic or accident rates in the area where the car is moving. The more information drivers are willing to share, the richer the experience cars that care can provide. For example, understanding a driver’s most recent posts on their social sites could certainly help us understand their emotions.

Adjusting the Response

As we learn more about a specific driver, we can build a picture of that driver’s personality. As we gain insight into personality and continue to tune it over time with more information, responses to errant driving can be steadily improved.


Ultimately the goal is to present the right remedies to help keep drivers safe. Premium vehicles already offer lane departure warning, and adaptive cruise control is moving down the cost curve. Cars are also giving more haptic warnings to drivers, and these may soon be accompanied by voice. But can voice interaction be adapted to the driver’s personality rather than remaining generic? What if it were accompanied by a digital avatar in the heads-up display? Cars can even aid in the selection of content to address a driver’s emotions, and adjust in-vehicle lighting or other aesthetics to help mitigate emotional driving.
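Choosing a remedy from an inferred emotional state, adjusted by what we know of the driver’s personality, could look like a simple preference-aware lookup. The states, remedies and personality flag below are all invented for illustration; a real system would draw on a learned personality model.

```python
# Hypothetical remedy table keyed by inferred emotional state.
REMEDIES = {
    "angry":   ["play calming playlist", "dim ambient lighting"],
    "anxious": ["suggest a short break", "soften climate control"],
    "drowsy":  ["increase cabin brightness", "suggest a coffee stop"],
}

def pick_remedy(state, personality_prefers_music=True):
    """Pick the first remedy for the inferred state, reordering for a
    (hypothetical) personality trait; fall back to a gentle prompt."""
    options = REMEDIES.get(state, ["issue a gentle voice prompt"])
    if state == "angry" and not personality_prefers_music:
        # A driver who dislikes music prompts gets lighting changes first.
        options = list(reversed(options))
    return options[0]
```

The point of the sketch is the shape of the decision: the same flagged state can map to different responses for different personalities.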

Interactions could get even more sophisticated as AI systems learn more about the driver. Cars could ask whether to call a friend and know whom it’s best to call. They could also modify the driver’s planned destination by suggesting a stop at a nearby friend’s home, or make ride-share reservations to defuse an emotional situation.

Cars that care remedy infographic.

Cars that care can not only help keep drivers safe but also deepen their overall experience in their vehicles. The combination of IoT systems understanding the car and how it is being driven, coupled with cognitive systems like Watson understanding the people in the car, opens endless possibilities.

Your car can keep you safer if you let it get to know you!

I talked in chapter 1 about the problem, and in chapter 2 about the diagnosis.
