
Putting “human” into the driving experience with AI assistants: part 2


Part 2: How should automakers think about engagement?

Welcome to part two of the series. Today, I want to continue the conversation and focus on “engagement.” The quick answer is that engagement requires personalization; you won’t get one without the other. The trick for AI assistants is to interact with users effectively without being annoying or boring. With that goal in mind, several aspects are critical.

First, there’s no way to personalize an assistant without knowing the driver. At every stage, the interaction has to be about helping customers with their needs; from that, loyalty and trust will follow. Given that trust, a lot can be gleaned from calendars, routes and social profiles to better understand driver preferences. Further, AI assistants need to learn from their own data: engagement models cannot be static, because the manner of interaction improves as usage history accumulates.

Four types of engagement

We categorize engagement into four types along a progressive spectrum of increasing personalization (a brief code sketch follows the list). They are:

  • Responsive: activated through command and control from users. This is the most basic form of engagement, and the one most digital assistants ship with today.
  • Proactive: anticipates driver needs from usage patterns and preferences. The vehicle engages directly through voice or the screen and can respond automatically.
  • Instinctive: recognizes driver needs from usage patterns and real-time events, enhancing the in-vehicle experience by adjusting its responses and automated actions.
  • Explorative: engages in open-ended exploration. The vehicle recognizes usage patterns and context, probing through natural conversation to help users determine what they want.
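
To make the spectrum concrete, here is a minimal sketch of the taxonomy in Python. The four mode names come straight from the list above; the selection signals (usage history, real-time events, ambiguous intent) are illustrative assumptions, not part of any shipping product:

```python
from enum import Enum, auto

class EngagementMode(Enum):
    """The four engagement types, ordered by increasing personalization."""
    RESPONSIVE = auto()    # acts only on explicit user commands
    PROACTIVE = auto()     # anticipates needs from usage patterns
    INSTINCTIVE = auto()   # also adapts to real-time events
    EXPLORATIVE = auto()   # probes via conversation to discover needs

def choose_mode(has_usage_history: bool,
                has_realtime_events: bool,
                intent_is_ambiguous: bool) -> EngagementMode:
    """Pick the most personalized mode the available signals support."""
    if intent_is_ambiguous and has_usage_history:
        return EngagementMode.EXPLORATIVE
    if has_realtime_events and has_usage_history:
        return EngagementMode.INSTINCTIVE
    if has_usage_history:
        return EngagementMode.PROACTIVE
    return EngagementMode.RESPONSIVE
```

The idea is to select the most personalized mode the available signals can support, falling back to responsive command-and-control when the assistant still knows little about the driver.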

Next, as we move beyond the responsive model, greater care must be taken to keep interactions at an appropriate level, so that they don’t become overwhelming and truly serve the needs of the driver and occupants.

Finally, conversation with a vehicle must be natural and hold the user’s attention. IBM Research is developing a Natural Language Framework on which Watson Assistant’s capability will be based. The conversational thread across a multi-part exchange must be kept contiguous: each question or comment to the assistant cannot stand on its own in isolation. It has to work just as it would if you were speaking to another person. Historically, this hasn’t been the case, and it’s one of the most profound sources of user frustration, one that causes people to disengage.
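
As a toy illustration of that contiguity (a hypothetical sketch, not the Natural Language Framework or Watson Assistant’s actual API), the assistant can carry state from turn to turn so a follow-up question resolves against what was said before:

```python
# Toy sketch of conversational continuity: store the topic of the
# last turn so a follow-up like "Is it open?" can be resolved
# instead of standing in isolation. All names are hypothetical.

class ConversationContext:
    def __init__(self):
        self.last_topic = None

    def interpret(self, utterance: str) -> str:
        text = utterance.lower()
        if "charging station" in text:
            self.last_topic = "charging station"
            return "search_poi('charging station')"
        if self.last_topic and text.startswith(("is it", "how far")):
            # Resolve the pronoun/ellipsis against the stored topic.
            return f"follow_up('{self.last_topic}', {utterance!r})"
        return "ask_for_clarification()"

ctx = ConversationContext()
print(ctx.interpret("Find me a charging station"))  # search_poi(...)
print(ctx.interpret("Is it open right now?"))       # follow_up(...)
```

Without the stored topic, “Is it open right now?” would be unanswerable on its own; with it, the exchange works the way it would between two people.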

Multi-modality extends beyond just voice

Voice interactions are just a start. Drivers themselves, and the capabilities of the car, provide far more data that can enrich the overall experience and keep people safer. Looked at holistically, this information opens new avenues for engagement.

Let’s start with the data given off by drivers. Tone of voice, hand gestures, facial expressions and other audible utterances all give a sense of what is preoccupying a driver. Every day, people get into cars while they’re emotional and drive off. Over 94 percent of accidents can be traced back to some manner of driver error, and drivers are ten times more likely to get into an accident when they’re emotional. Given that, shouldn’t your car know how you’re feeling?

Increasingly, cars are putting a camera on the driver to gauge facial expressions. Is the person happy, upset, looking at a phone or falling asleep? The response to each of these circumstances must be highly personalized and relevant, because the wrong response can exacerbate the problem. For example, offering a cheery response to a person who is upset may make them even more upset.
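
As a simple sketch of that idea, a small table could map the inferred driver state to a response style. The state labels and styles here are hypothetical, and a real system would rely on a trained classifier rather than fixed strings:

```python
# Illustrative sketch: match the assistant's response style to the
# driver state inferred from the cabin camera. State labels and
# styles are hypothetical assumptions for illustration.

RESPONSE_STYLE = {
    "happy":      "upbeat",         # mirror the positive mood
    "upset":      "calm_neutral",   # a cheery reply could make it worse
    "distracted": "brief_alert",    # short prompt to restore attention
    "drowsy":     "urgent_safety",  # suggest a break, raise alertness
}

def respond(driver_state: str, message: str) -> str:
    # Default to a calm, neutral delivery when the state is unknown.
    style = RESPONSE_STYLE.get(driver_state, "calm_neutral")
    return f"[{style}] {message}"

print(respond("upset", "Traffic ahead adds 10 minutes to your route."))
# -> [calm_neutral] Traffic ahead adds 10 minutes to your route.
```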

Why “how” you say it matters, too

Microphones in cars can also detect tone of voice. It’s not hard to tell how someone is feeling by listening to them speak. Again, the right response comes from assembling information across multiple modes.

Furthermore, the car itself provides relevant information that helps formulate the right response. The car knows the route, how much traffic it’s in and how it’s being driven. Swerving, sudden acceleration and hard braking all help determine the level of potential danger that requires a response.
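
Here is a minimal sketch of how those signals might be fused into a coarse danger level. The specific inputs, weights and thresholds are assumptions chosen for illustration, not calibrated values:

```python
# Sketch of multimodal fusion: combine a driver-state cue (voice
# stress) with vehicle telemetry to grade the level of potential
# danger. Weights and thresholds are illustrative assumptions.

def danger_level(voice_stress: float, swerving: bool,
                 hard_braking: bool, heavy_traffic: bool) -> str:
    score = voice_stress                  # 0.0 (calm) .. 1.0 (agitated)
    score += 0.4 if swerving else 0.0
    score += 0.3 if hard_braking else 0.0
    score += 0.2 if heavy_traffic else 0.0
    if score >= 1.0:
        return "high"      # intervene: safety prompt, mute distractions
    if score >= 0.5:
        return "elevated"  # soften tone, defer non-essential messages
    return "normal"

print(danger_level(voice_stress=0.7, swerving=True,
                   hard_braking=False, heavy_traffic=True))  # high
```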

An AI assistant can help bring the features of a connected car to life for its driver. A comprehensive connected vehicle, highly personalized through driver preferences, voice, multiple modes and vehicle IoT systems, can elevate the in-vehicle experience for your customers. To deliver a truly differentiated vehicle, manufacturers must understand drivers, occupants and the surrounding environment.

Join us at TU West Coast to learn more

Please join me at #TUWestCoast on Oct. 4. I’ll be on a panel on this very subject, “Voice & AI: Offering the Personalized User Experience.” You can also join my colleague Rajiv Phougat for his panel on Oct. 3, “AI in the Automotive Value Chain, Inside and Outside the Car.”

You can also learn how to use AI and IoT to give your brand a voice.
