Machines that think!
AI was on the cover of Time Magazine with an image showing Rodin’s statue ‘The Thinker’ embodied as circuitry in a microprocessor. It was the late 1980s and AI was an exciting field, promising to solve a whole new range of problems. I had just joined the workforce with a major in AI, and within a couple of years I was helping customers test out expert systems and was part of the team representing IBM at the International Joint Conference on AI.
There were big times ahead for AI but then, things went quiet. Field trials worked but it was hard to move past the pilot stage, mainly due to the programming effort. You needed specialized skills, and too much time and effort was needed to maintain code and develop new capabilities. AI fizzled and disappeared from sight. I’d missed the banner on the Time Magazine cover: “Computers of the Future”. I was disappointed and went into mainstream IT.
Fast forward a few years and IBM Deep Blue won chess matches against world champion Garry Kasparov. With the help of IBM Research, I set up a First of a Kind project with a mobile service provider: we used natural language speech to let users access their calendar from a feature phone. For those who have only ever owned a smartphone and never bought a newspaper, a feature phone is the type of mobile phone that Nokia made popular when 2.5G and 3G were coming out – basically you could make phone calls and send texts, play Snake and use a basic browser – so to be able to say “show me my calendar for tomorrow” and have it appear on the screen was groundbreaking. It worked well, field tests confirmed there was a major lift in productivity for the mobile worker, and the client loved it.
Then, as we moved from First of a Kind and pilot towards production, we ran into the same problem as a decade earlier: you needed specialized skills and too much time and effort was needed to maintain code and develop new capabilities. Fizzle – again. I was disappointed again and went back to mainstream IT. Again.
In the background though, IBM Research kept plugging away at AI. Another few years passed, and IBM’s Watson won Jeopardy!. This time around, the Research team had advanced the science of machine learning and were using it to crack the code on what had blocked us in the past: rather than program AI, you could train AI.
With machine learning, rather than code rules or use a special programming language, you collect data sets such as manuals and videos and feed them to the AI platform Watson. Watson reads and understands documents and ‘sees’ and understands images and videos. Watson then identifies correlations and anomalies in streams of client data, helping identify issues and predict outages. Having read the trouble tickets and manuals, Watson can also help the field worker fix the problem more quickly. Operators and field workers can use AI to work at the level of the client’s best operator and field worker.
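To make the anomaly-detection idea concrete, here is a minimal sketch in Python of one simple approach: flagging readings in a data stream that deviate sharply from a rolling baseline. This is an illustration only, not how Watson actually works, and the function name and thresholds are my own invention.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=20, threshold=3.0):
    """Flag readings far outside a rolling baseline.
    Illustrative sketch only -- not Watson's actual method."""
    recent = deque(maxlen=window)   # the most recent `window` readings
    anomalies = []
    for i, value in enumerate(stream):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            # Flag values more than `threshold` standard deviations out
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append((i, value))
        recent.append(value)
    return anomalies

# A gently varying signal with one spike: the spike gets flagged.
readings = [10.0 + 0.1 * (i % 5) for i in range(50)]
readings[30] = 50.0
print(detect_anomalies(readings))  # the spike at index 30
```

A trained system replaces the hand-set threshold with patterns learned from the client’s own historical data, which is what removes the need for specialist programmers to encode rules.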
This is all done using the client’s own data, with training done by semi-technical business people rather than specialist programmers, making it quicker and easier to deliver value. Watson provides a conversational interface, enabling the knowledge embedded in the client’s documentation to be easily interrogated.
It’s taken time to get here but it’s an exciting time. I have clients taking live feeds of event data from their networks, marrying it with weather and location data and using AI to predict when and where an outage is likely to occur and what equipment was needed the last time a similar fault occurred. I have clients using AI to monitor live video surveillance streams and identify anomalies such as when someone has left a package behind.
Best of all, we can do these things quickly – often in just a matter of days or hours to train the system using the client’s own data.
There might be a lot of marketplace buzz around AI, but I can tell you from first-hand experience that it takes a long time to build up capability in AI. IBM has been in this field a long time, and the differentiation today is speed to value. One client recently told me that had they only issued an RFP for AI, they would have thought seven out of seven vendors could give them an AI platform. The client ran a Proof of Concept at the same time as the RFP and gave all seven vendors sample data. It turned out only two vendors found insights in their data, and only one of the two had the ability to make those insights actionable by also giving them a short pathway to production.
I’m pleased that this time around, AI is finally here, not just entering the mainstream but delivering value quickly. Time Magazine had the right cover story.