
Build your own wonder

Wonder-ing out of the uncanny valley

The “uncanny valley” is a phrase used to describe machines or systems that appear close to human, yet elicit a sense of revulsion rather than wonder. “Creepy” is another word that gets used. It is an old idea (first used in the 1970s by a Japanese roboticist studying our emotional responses to machines), which my colleague Eva has illustrated below:
[Illustration: the uncanny valley]
Usually it is applied to human-looking robots, but the latest chatbots (take a look at the Watson personal assistant as an example) can evoke the same feeling of “uncanniness” when they lurk behind a call center or interact with voice. They are almost human, but not quite.
 
For some, the uncanny valley also represents a violation of human norms, a kind of “de-humanization” that represents AI at its worst. Companies want brand love rather than brand hate, so understanding how to avoid the uncanny valley is very important when using AI in consumer journeys (e.g. on carelines).

Bring on the human

Steve Ingram, a Senior Consultant in IBM's Change & Talent practice, has some interesting ideas on this theme of de-humanization (see his diagram to the right) and how machines can help:
What started you thinking about the relationship between AI and society?
 There were several things, but mainly just how frustrated I’ve become with the lonely experience of internet shopping and the dissatisfaction I have felt when ending some phone calls with a helpline.  
 
You talk about “re-humanizing” society.  What does that mean?
 I visit some small independent shops. The owner serves me based on our relationship and also their knowledge of products and the other customers’ experiences. I leave the shop feeling ‘loved’ and satisfied. 
 
When I call a helpline or visit high-street shops, my experiences are very hit-and-miss, and I often come away feeling rather disappointed and insignificant. By a “re-humanized” society I mean regaining that personalized experience through the use of AI, so consumers once again feel that one person is looking after their needs. Examples are:
 
·     Providing an opportunity to speak with the same ‘voice’ every time you call a helpline
·     Ensuring that voice is able to recall your previous conversations and your mood during those conversations, and
·     Being able to harness the experience of interacting with thousands of other consumers to better serve your individual needs
 
Can you give some examples of how AI can re-humanize work and society?
 There are many examples of how machines are starting to re-humanize society. In particular, IBM’s Watson is already demonstrating how a robot can have coherent conversations with a human to serve customers. 
 
What should people do now?
 Embrace this technology and enjoy the benefits it provides.
 
Why am I telling you this? For me, contemporary AI solutions need to interact with people well enough that people feel empathy.  In a world where machines augment us (rather than replace us), people need to want to work with the robot (hardware or not).  The software needs to have some degree of personality and provide perceptual cues that we can respond to naturally.

 

How do you make AI wonderful?

Let’s imagine you are setting out to make an AI application today – hopefully one that doesn’t fall into the uncanny valley (well, not too far anyway).  What are the building blocks?
 

Natural language

First, the system must understand or analyze natural language, and be able to generate it. After all, most systems need to interact with human beings. A question-and-answer dialogue is likely to become one of the most common user interfaces, as you can see from the rise of chatbots. You might be surprised to hear that the first serious attempt at human-machine conversation was in 1964, when Joseph Weizenbaum at MIT created ELIZA, a fairly convincing early chatbot.
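If you have never seen pattern-based dialogue in action, here is a minimal sketch in the spirit of ELIZA (nothing like the original program, and not tied to any product API): match a few keywords with regular expressions, slot the user's words into a canned reply, and fall back to a generic prompt when nothing matches.

```python
import random
import re

# A toy, ELIZA-style responder: keyword patterns mapped to canned replies.
# This is an illustrative sketch only, not the 1964 program.
RULES = [
    (r"\bI need (.+)", ["Why do you need {0}?", "Would {0} really help you?"]),
    (r"\bI feel (.+)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"\bbecause (.+)", ["Is that the real reason?"]),
]
FALLBACKS = ["Please tell me more.", "I see. Go on.", "How does that make you feel?"]

def reply(utterance: str) -> str:
    for pattern, responses in RULES:
        match = re.search(pattern, utterance, re.IGNORECASE)
        if match:
            return random.choice(responses).format(match.group(1).rstrip(".!?"))
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    print(reply("I need a holiday"))     # e.g. "Why do you need a holiday?"
    print(reply("The weather is nice"))  # falls back to a generic prompt
```

Even a toy like this shows why chatbots feel almost human: a handful of patterns and a little pronoun-juggling go a surprisingly long way.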

Machine learning

Machine learning, of course, is a key factor: in all its forms it has delivered much of the progress we associate with AI to date. I think it will take another 3 to 5 years before we really understand the true business impact of the tools we have today, but I expect these techniques to appear everywhere in business in the next 20 years. Part of my confidence comes from the availability of powerful open source libraries such as TensorFlow and Apache ML, and the popularity of programming languages such as Python, R and Scala.
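To show how low the barrier to entry has become, here is a minimal TensorFlow (Keras) sketch that trains a tiny classifier; the synthetic data and the layer sizes are arbitrary choices for the example, not a recommendation.

```python
import numpy as np
import tensorflow as tf

# Synthetic data: 200 samples with 4 features, binary labels (illustrative only).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4)).astype("float32")
y = (X.sum(axis=1) > 0).astype("float32")

# A tiny feed-forward classifier built with the Keras API.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=16, verbose=0)

print(model.evaluate(X, y, verbose=0))  # [loss, accuracy] on the training data
```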
 

Pattern recognition

Pattern recognition is another key building block: it lets us find examples of normality and exception in datasets, and without it self-driving cars, automatic language translation and many other common applications would not be possible. Many business systems are process- and rule-based, and pattern recognition techniques have the potential to reduce the workload of keeping those systems up to date.
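As a small illustration of finding “normality and exception” in a dataset, the sketch below uses an Isolation Forest from scikit-learn to flag outliers in some made-up two-dimensional data; the library choice and the numbers are mine, purely for illustration.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Mostly "normal" points clustered around the origin, plus a few obvious outliers.
rng = np.random.default_rng(0)
normal = rng.normal(loc=0.0, scale=1.0, size=(100, 2))
outliers = np.array([[8.0, 8.0], [-9.0, 7.5], [10.0, -8.0]])
X = np.vstack([normal, outliers])

# Fit the model and label every point: +1 means "looks normal", -1 means "exception".
detector = IsolationForest(contamination=0.03, random_state=0)
labels = detector.fit_predict(X)

print("Points flagged as exceptions:", X[labels == -1])
```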
 

Knowledge representation

Another building block is the ability to represent human semantics, such as abstract concepts, tacit knowledge and logic, in ways that machines can process. Here computer scientists talk about ontologies, which is just a fancy way of describing information domains and the relationships between them.
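To make “ontology” less abstract, here is a minimal sketch using the open source rdflib library; the vehicle classes and the example.org namespace are invented for illustration.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# A made-up namespace for this illustration.
EX = Namespace("http://example.org/mobility#")

g = Graph()
g.bind("ex", EX)

# A tiny ontology: a class, a subclass relationship, and one individual.
g.add((EX.Vehicle, RDF.type, RDFS.Class))
g.add((EX.SelfDrivingMinibus, RDFS.subClassOf, EX.Vehicle))
g.add((EX.Olli, RDF.type, EX.SelfDrivingMinibus))
g.add((EX.Olli, RDFS.label, Literal("Olli, a self-driving minibus")))

# Serialize the graph so humans (and other machines) can read it.
print(g.serialize(format="turtle"))
```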
 

All that stuff we know is important but never get round to

My final building block for AI is the set of capabilities associated with scheduling and planning decisions. Think about asking Siri or Alexa to schedule an appointment, or a drone to work out the best way to fertilize a field, or any practical business application of artificial intelligence, and you are likely to find one or both.
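And to give a flavour of the scheduling side, here is a plain-Python sketch of the classic greedy algorithm for choosing the largest set of non-overlapping appointments; the appointments themselves are invented for the example.

```python
# Greedy interval scheduling: sort candidate appointments by finish time and
# keep each one that does not clash with the last accepted appointment.
# The appointments below are invented purely for illustration.
appointments = [
    ("team stand-up", 9, 10),
    ("customer call", 9, 12),
    ("field survey drone flight", 10, 11),
    ("lunch with supplier", 11, 13),
    ("code review", 12, 14),
]

def schedule(tasks):
    accepted, last_end = [], float("-inf")
    for name, start, end in sorted(tasks, key=lambda t: t[2]):
        if start >= last_end:  # no overlap with what is already booked
            accepted.append(name)
            last_end = end
    return accepted

print(schedule(appointments))
# ['team stand-up', 'field survey drone flight', 'lunch with supplier']
```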

Tool toys

Let me share a practical example of these building blocks in action today. This is Olli, a self-driving minibus with a human touch developed by IBM and Local Motors. Notice how Olli looks like an overgrown children's toy? That is a well-trodden strategy for avoiding the uncanny valley: make it cute!
How does this use the different building blocks? Well, you can talk to Olli in multiple languages and Olli will hold a natural conversation with you. Olli uses machine learning to make route and destination predictions based on what you say…where you want to go and when you want to get there. To navigate, Olli needs to be able to interpret lidar and map data as well as a variety of environmental sensors and data feeds about traffic. It’s always looking for patterns and outliers.
Olli understands human concepts well enough to extract entities such as destinations from streams of conversation, and then respond with questions based on its constantly growing knowledge base. And finally, Olli is able to develop independent strategies for previously un-encountered situations on the road using a variety of planning and scheduling algorithms.

TJ Bot – another Tool Toy

Your turn

Ready to create a human “machine” yourself? How about a cute robot that you can talk to? Once you have built (or bought) your robot, you can add Watson services that will make her chatty. This demonstrates all the building blocks except for planning and scheduling (anyone up for a challenge?).

Final words

Augmentation of human expertise is currently one of the most promising areas of AI, and an example of how machines can re-humanize rather than just automate and eliminate. Smart advisors for doctors and scientists. Childcare and nutrition advisors for parents. Collaborative robots working with skilled workers. Computational creativity in the kitchen. I could go on. It isn't all doom and gloom in my opinion, but in my next post I'll explore the broader implications for work and society and speculate a little more on what comes next.

Playlist for this blog

The wonder-ful ArchAndroid by Janelle Monae, of course!
 

