May 13, 2017 | Written by: Trevor Davis
Categorized: Industry Insights
Wonders indeed. Last year a man’s life was saved in Missouri when his Tesla drove him to the hospital on Autopilot. It seems that every day we read headlines like that one: you could call it a time of wonders, and all of them seem to be the result of artificial intelligence, or AI.
Pick up any newspaper or magazine and you can read about no end of miraculous applications of artificial intelligence techniques: I particularly like the example of the man who built his own robot and decided to marry it!
Man and Wife
There are also many freely available software libraries and frameworks with strange names such as TensorFlow and Apache Mahout.
But how far has artificial intelligence really progressed? Is it just the latest over-hyped subject from IT vendors? And what do we mean by artificial intelligence and by terms like machine learning and deep learning? Are they the same thing?
What I’d like to do in this post and subsequent ones is give you an understanding of how we got to where we are today, what can really be achieved, and how I expect the field to mature in the future, particularly in the context of fast-moving consumer goods (FMCG) companies.
Lessons from history
Let me start with some history. The British computer scientist Alan Turing is seen by many as the father of the concept of artificial intelligence: below is a picture of his 1950 publication in which he asked the question “Can machines think?”
In 1956 a workshop at Dartmouth College in America brought together some of the great thinkers in the field, who predicted that computers would be able to think like human beings within a few decades. This promise attracted millions of dollars of funding, and the years that followed are known to computer scientists as the first summer of artificial intelligence. As we all know, summer is inevitably followed by winter, and this has been the pattern of artificial intelligence development up until today: big promises, big investments, insufficient progress and an end to research. That is what happened in 1973, when most major government projects in the US and the UK came to a halt.
There was then a gap of about seven years until a visionary project, known as the Fifth Generation project, started in Japan. Although this also followed the boom-and-bust trend, it did deliver some powerful advances, such as expert systems. In fact I was involved in this research in the mid-1980s, trying to capture all the materials science knowledge of my boss, who was due to retire from ALCAN.
From the end of the 1980s until the early 1990s there was another winter, and this was followed by successes that took advantage of major improvements in computing performance, such as the NVIDIA graphics processing unit (GPU) in the late 1990s. Perhaps the best known of these was Deep Blue, the IBM chess-playing computer that beat the reigning world chess champion Garry Kasparov 20 years ago this month. The big successes in this period came from focusing on more specific problems such as speech recognition, domain-specific knowledge graphs (such as chess) and search (leading to Google).
The successes in this period were sufficient to ensure that funding, both public and private, continued. By 2011 the IBM Watson computer was beating human beings on the TV quiz show Jeopardy!
The Avatar for Watson
In 2012 Google demonstrated self-driving cars, and anyone with an iPhone will be immediately familiar with intelligent agents like Siri.
But something more fundamental happened in this period: the rise of big data and new approaches inspired by how the brain works. This combination gave birth to a new family of techniques called deep learning that lie behind many of the “wonders” we see today.
Is (another) Winter coming?
Yet for all of this progress, the development of artificial intelligence since the 1950s has been evolutionary rather than revolutionary, and we are still some years away from the HAL 9000 computer or Her from the film of the same name.
Machines like HAL (do read this letter from 2001 director Stanley Kubrick about IBM’s involvement in the film) or Her are referred to as super-intelligent, or as strong artificial intelligence. Many experts suggest that we are probably 60 years away from being able to create systems like that, but there are many wonders available to us in the meantime, so long as we can avoid another AI winter.
More to come next time!