
Putting “human” into the driving experience with AI assistants: part 2


Part 2: How should automakers think about engagement?

Welcome to part two of the series. Today, I want to continue the conversation and focus on “engagement.” The short answer is that “engagement” means “personalized”; there won’t be one without the other. The trick for AI assistants is to interact with users effectively without being annoying or boring. With that goal in mind, several aspects are critical.

First, there’s no way to personalize an assistant without knowing the driver. At every stage, the interaction has to be about helping customers with their needs; from that, loyalty and trust will follow. Given that trust, a lot can be gleaned from calendars, routes and social profiles to better understand driver preferences. Further, AI assistants need to learn from their own data: engagement models cannot be static, because the manner of interaction improves through the history of its usage.

Four types of engagement

We categorize engagement into four types along a progressive spectrum of increasing personalization (see the sketch after this list). They are:

  • Responsive: activated through command and control from users. This is the most basic initial engagement, with which most digital assistants are equipped.
  • Proactive: anticipates driver needs from usage patterns and preferences. The vehicle engages directly through voice or screen display and can respond automatically.
  • Instinctive: recognizes driver needs from usage patterns and real-time events, enhancing the experience with the vehicle by adjusting its responses and automated actions.
  • Explorative: engages to explore. The vehicle recognizes usage patterns and context to probe through natural conversation, helping users determine what they want.
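
To make the spectrum concrete, here is a minimal sketch of how an assistant might pick the richest engagement mode that the available data (and the driver’s consent) supports. All names here (EngagementMode, DriverProfile, choose_mode) and the thresholds are hypothetical illustrations, not part of Watson Assistant’s API:

```python
from dataclasses import dataclass
from enum import IntEnum


class EngagementMode(IntEnum):
    """Progressively more personalized ways an assistant can engage."""
    RESPONSIVE = 1   # waits for explicit command and control
    PROACTIVE = 2    # anticipates needs from usage patterns
    INSTINCTIVE = 3  # also reacts to real-time events
    EXPLORATIVE = 4  # probes via conversation to discover intent


@dataclass
class DriverProfile:
    trips_observed: int           # how much usage history exists
    has_live_context: bool        # real-time events available (traffic, telemetry)
    opted_into_suggestions: bool  # driver consented to unprompted engagement


def choose_mode(profile: DriverProfile, intent_is_clear: bool = True) -> EngagementMode:
    """Pick the richest mode the data and consent support."""
    if not profile.opted_into_suggestions or profile.trips_observed < 10:
        return EngagementMode.RESPONSIVE
    if not profile.has_live_context:
        return EngagementMode.PROACTIVE
    # With history plus live context, act instinctively when intent is
    # clear, and explore through conversation when it isn't.
    return EngagementMode.INSTINCTIVE if intent_is_clear else EngagementMode.EXPLORATIVE
```

The point of the sketch is that the mode is a ceiling, not a mandate: the assistant can always fall back to responsive behavior when it has too little data or confidence.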

Next, as we move beyond the responsive model, greater care must be taken to keep interactions at an appropriate level, so they don’t become overwhelming and truly serve the needs of the driver and occupants.

Finally, conversation with a vehicle must be natural and hold the user’s attention. IBM Research is developing a Natural Language Framework on which Watson Assistant’s capability will be based. The conversational thread across a multi-part exchange must be kept contiguous: each question or comment to the assistant cannot stand on its own in isolation. It has to work just as it would if you were speaking to another person. Historically, this hasn’t been the case, and it’s one of the most profound points of frustration for users, one that causes them to disengage.
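
To illustrate what “contiguous” means in practice, here is a toy sketch of carrying context across turns, so a follow-up like “how far is it?” resolves against the gas station mentioned a moment earlier. The classes and the naive pronoun substitution are assumptions for illustration only; IBM’s framework is not shown here, and a production NLU stack would use real coreference resolution:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Turn:
    utterance: str
    entity: Optional[str] = None  # what this turn was about, if anything


@dataclass
class ConversationContext:
    turns: List[Turn] = field(default_factory=list)

    def last_entity(self) -> Optional[str]:
        """Most recently mentioned entity, for resolving 'it'."""
        for turn in reversed(self.turns):
            if turn.entity:
                return turn.entity
        return None

    def interpret(self, utterance: str) -> str:
        """Resolve the utterance against history instead of treating it
        as an isolated, standalone question."""
        entity = self.last_entity()
        tokens = [t.strip("?.,!") for t in utterance.lower().split()]
        if entity and "it" in tokens:
            # Naive substitution, purely for illustration.
            utterance = utterance.replace(" it", f" {entity}")
        self.turns.append(Turn(utterance))
        return utterance


ctx = ConversationContext()
ctx.turns.append(Turn("find a gas station", entity="the gas station"))
print(ctx.interpret("how far is it?"))  # -> "how far is the gas station?"
```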

Multi-modality extends beyond just voice

Voice interactions are just a start. Drivers and the vehicles themselves provide far more data that can enrich the overall experience and keep people safer. Looked at holistically, this information opens new avenues for engagement.

Let’s start with the data given off by drivers. Tone of voice and other audible utterances, hand gestures and facial expressions all give a sense of what is preoccupying a driver. Every day, people get into cars while they’re emotional and drive off. Over 94 percent of accidents can be traced back to some manner of driver error, and drivers are ten times more likely to get into an accident when they’re emotional. Given that, shouldn’t your car know how you’re feeling?

Increasingly, cars are putting a camera on the driver to gauge facial expressions. Is the person happy, upset, looking at their phone or falling asleep? The response to each of these circumstances must be highly personalized and relevant, because the wrong response can exacerbate the problem. For example, giving a “happy” response to a person who is upset may make them even more upset.
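
A minimal sketch of that idea, with made-up state and style labels rather than any product’s taxonomy, might map each detected driver state to a response style and default to neutral when the camera’s classification is uncertain:

```python
# Hypothetical mapping from detected driver state to response style,
# so an upset driver never gets a chirpy "happy" reply.
RESPONSE_STYLE = {
    "happy": "upbeat",       # mirror the driver's mood
    "upset": "calm",         # de-escalate; never respond cheerfully
    "distracted": "brief",   # short prompts that pull attention back to the road
    "drowsy": "alerting",    # energetic tone plus a rest-stop suggestion
}


def style_for(driver_state: str) -> str:
    # Fall back to neutral when the classifier's output is unknown.
    return RESPONSE_STYLE.get(driver_state, "neutral")
```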

Why “how” you say it matters, too

Microphones in cars can also detect tone of voice. It’s not hard to tell how someone is feeling by listening to them speak. Again, the right response is necessary when assembling information across multiple modes.

Furthermore, the car itself provides relevant information that helps formulate the right response. Cars know the route, how much traffic they’re in and how they’re being driven. Swerving, sudden acceleration and hard braking all help determine the level of potential danger that needs a response.
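
Here is a minimal sketch of fusing those vehicle signals into a danger level that gates how, or whether, the assistant engages. The weights and thresholds are made-up illustrations, not calibrated values:

```python
from dataclasses import dataclass


@dataclass
class Telemetry:
    swerve_events: int     # lane-position corrections in the last few minutes
    hard_brakes: int       # decelerations beyond a threshold
    sudden_accels: int     # accelerations beyond a threshold
    traffic_density: float # 0.0 (empty road) to 1.0 (gridlock)


def danger_level(t: Telemetry) -> str:
    """Illustrative weighted score; real systems would calibrate this."""
    score = (2.0 * t.swerve_events
             + 1.5 * t.hard_brakes
             + 1.0 * t.sudden_accels
             + 3.0 * t.traffic_density)
    if score >= 6.0:
        return "high"    # suppress chatter; offer safety help only
    if score >= 3.0:
        return "medium"  # keep interactions short and hands-free
    return "low"         # normal engagement is fine
```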

An AI assistant can help bring the features of a connected car to life for its driver. A connected vehicle that is highly personalized through driver preferences, voice, multiple modes and vehicle IoT systems can elevate the in-vehicle experience for your customers. To deliver a truly differentiated vehicle, manufacturers must understand drivers, occupants and the surrounding environment.

Join us at TU West Coast to learn more

Please join me at #TUWestCoast on Oct. 4. I’ll be on a panel talking about this very subject, “Voice & AI: Offering the Personalized User Experience.” You can also join my colleague, Rajiv Phougat, for his panel on Oct. 3: “AI in the Automotive Value Chain, Inside and Outside the Car.”

You can also learn how to use AI and IoT to give your brand a voice.


Offering Manager, IBM Watson Assistant for Connected Vehicles
