For centuries, electricity was thought to be the domain of sorcerers: magicians who left audiences puzzled about where it came from and how it was generated. Benjamin Franklin and his contemporaries were well aware of the phenomenon, and Franklin proved the connection between electricity and lightning in 1752, yet he had difficulty envisioning a practical use for it. In fact, his most prized invention had more to do with avoiding electricity: the lightning rod. All innovations go through a similar evolution: dismissal, avoidance, fear, and, perhaps finally, acceptance.

Today, too many people view artificial intelligence (AI) as another magical technology being put to work with little understanding of how it works. They view AI as something special, reserved for the experts who have mastered it and dazzled us with it. In this environment, AI has taken on an air of mysticism: full of grand promises and out of the reach of mere mortals.

The truth, of course, is that there is no magic to AI. The term “artificial intelligence” was coined in 1956, and since then the technology has progressed, disappointed, and re-emerged. As with electricity, the path to AI breakthroughs will come through mass experimentation. While many of those experiments will fail, the successful ones will have substantial impact.

That’s where we find ourselves today. As others, like Andrew Ng, have suggested, AI is the new electricity. In addition to becoming ubiquitous and increasingly accessible, AI is enhancing and altering the way business is conducted around the world. It is enabling increasingly accurate predictions and automating business processes and decision-making. The impact is vast, ranging from better customer experiences to intelligent products and more efficient services. In the end, the result will be economic impact for companies, countries, and society.

To be sure, organizations that drive mass experimentation in AI will win the next decade of market opportunity. To break down and help demystify AI, one needs to consider two of its key elements: the componentry and the process. In other words, what’s behind it and how it can be adopted.

The Componentry

Much as electricity was harnessed through basic components such as resistors, capacitors, and diodes, AI is being driven by modern software componentry:

  1. A unified, modern data fabric. AI feeds on data, and therefore data must be prepared for AI. A data fabric acts as a logical representation of all data assets on any cloud. It pre-organizes and labels data across the enterprise and provides seamless access to all of that data through virtualization, from behind the firewall to the edge (see the first sketch after this list).
  2. A development environment and engine. A place to build, train, and run AI models, enabling end-to-end deep learning, from input to output. Machine learning models help find patterns and structures in data that are inferred rather than explicit (second sketch below). This is when it starts to feel like magic.
  3. Human features. A mechanism to bring models to life by connecting models and applications to human features like voice, language, vision, and reasoning (third sketch below).
  4. AI management and exploitation. This enables you to insert AI into any application or business process while tracking versions, changes, bias, and variance, and understanding how to improve impact. It is where your models live in production, enabling lifecycle management of all your AI. Lastly, it offers proof and explainability for the decisions AI makes (fourth sketch below).
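To make these components less abstract, four short Python sketches follow. Each is illustrative only: the libraries, names, and data are assumptions chosen for the example, not part of any particular product. First, component 1. The essence of a data fabric is presenting many physical data stores as one logical, queryable layer. This toy sketch fakes that idea with pandas, exposing two “silos” through a single merged view so that consumers never need to know where each column physically lives.

    import pandas as pd

    # Two physical "silos": in reality these might be a warehouse on one
    # cloud and a CRM database behind the firewall.
    sales_silo = pd.DataFrame({"customer_id": [1, 2], "revenue": [120.0, 75.5]})
    crm_silo = pd.DataFrame({"customer_id": [1, 2], "region": ["EMEA", "APAC"]})

    # The "fabric": one logical, labeled view over both silos. Consumers
    # query the view; the physical location of each column is hidden.
    fabric_view = sales_silo.merge(crm_silo, on="customer_id")

    print(fabric_view[fabric_view["region"] == "EMEA"])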
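Second, component 2: a minimal sketch of a model inferring patterns from examples rather than following hand-written rules. It uses scikit-learn and synthetic data purely for illustration.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for enterprise data: 1,000 records, 20 features.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    # No rules are written by hand; the model infers structure from examples.
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")

The point of the held-out test set is that the inferred patterns are judged on data the model never saw, which is what separates learning from memorizing.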
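Third, component 3: one way to connect an application to a “language” capability is to call a pretrained model. This sketch assumes the open-source Hugging Face transformers library, chosen only as a convenient illustration; the example texts are invented.

    from transformers import pipeline

    # Downloads a default pretrained sentiment model on first run.
    classify = pipeline("sentiment-analysis")

    for text in ["The new billing portal is fantastic.",
                 "I waited 40 minutes and nobody answered."]:
        result = classify(text)[0]  # e.g., {'label': 'NEGATIVE', 'score': 0.99}
        print(f"{result['label']:>8} ({result['score']:.2f}): {text}")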
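Finally, component 4: a hypothetical, deliberately minimal model registry. It records the version, training-data provenance, and metrics of each deployed model so that changes and bias can be audited later. Every name below is invented; a real deployment would use a purpose-built lifecycle tool.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class ModelRecord:
        name: str
        version: str
        training_data: str   # provenance of the training set
        metrics: dict        # e.g., accuracy plus a fairness/bias check
        registered_at: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat())

    # Keyed by (name, version), so superseded versions stay auditable,
    # which is what makes explaining a past decision possible.
    registry = {}

    def register(record: ModelRecord) -> None:
        registry[(record.name, record.version)] = record

    register(ModelRecord(
        name="churn-predictor",            # hypothetical model
        version="2.1.0",
        training_data="crm_snapshot_q1",   # hypothetical dataset label
        metrics={"accuracy": 0.87, "false_positive_rate_gap": 0.02},
    ))
    print(registry[("churn-predictor", "2.1.0")])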

The Process

With these components in hand, more organizations are unlocking the value of data. But to fully leverage AI, we must also understand how to adopt and implement the technology. For those planning the move, consider these fundamental steps first:

  1. Identify the Right Business Opportunities for AI. The potential areas for adoption are vast: customer service, employee and company productivity, manufacturing defects, supply chain spending, and many more. Anything that can be easily described can be programmed, and once it’s programmed, AI can make it better. The opportunities are endless.
  2. Prepare the Organization for AI. Organizations will require greater capacity and expertise in data science. Many of today’s repetitive and manual tasks will be automated, which will change the roles of many employees. It’s rare that an entire role can be done by AI, but it’s equally rare that no part of a role could be enhanced by it. All technology is useless without the talent to put it to use, so build a team of experts who will inspire and train others.
  3. Select Technology & Partners. While it’s unlikely that the CEO will personally select the technology, the point here is cultural: an organization should adopt many technologies, comparing, contrasting, and learning through that process. It should also choose a handful of partners that have both the skills and the technology to deliver AI.
  4. Accept Failures. If you try 100 AI projects, 50 will probably fail. But the 50 that work will more than compensate for the failures. The culture you create must be ready and willing to accept failures, learn from them, and move on to the next. Fail fast, as they say.

AI is becoming as fundamental as electricity, the internet, and mobile were when they entered the mainstream. Not having an AI strategy in 2019 will be like not having a mobile strategy in 2010, or an internet strategy in 2000.

Let’s hope that when you look back at this moment in history, you can do so fondly, as someone who embraced data as the new resource and AI as the utility to harness it.

______________________________________________

A version of this story first appeared on InformationWeek.

 
