December 5, 2019 By Rodrigo Ceron 3 min read

It’s not unusual today to see people talking about artificial intelligence (AI). It’s in the media, popular culture, advertising and more. When I was a kid in the 1980s, AI was depicted in Hollywood movies, but its real-world use was unimaginable given the state of technology at that time. While we don’t have robots or androids that can think like a person or are likely to take over the world, AI is a reality now. To understand what we mean when we talk about AI today, we have to go through a quick (I promise) introduction to some important terms.

AI is…

Simply put, AI is anything capable of mimicking human behavior. From the simplest application — say, a talking doll or an automated telemarketing call — to more robust algorithms like the deep neural networks in IBM Watson, they’re all trying to mimic human behavior.

Today, AI is a term applied broadly in the technology world to describe solutions that can learn on their own. These algorithms can comb through vast amounts of data and find trends in it, trends that unveil insights that would be extremely hard for a human to discover. However, AI algorithms can’t think like you and me. They are trained to perform very specialized tasks, whereas the human brain is a remarkably general thinking system.

Fig 1: Specialization of AI algorithms

Machine learning

Now we know that anything capable of mimicking human behavior can be called AI. If we narrow down to the algorithms that can “think” and provide an answer or decision, we’re talking about a subset of AI called “machine learning.” Machine learning algorithms apply statistical methods to identify patterns in past human behavior and make decisions. They’re good at prediction: whether someone will default on the loan they’re requesting, what you’ll buy next online (and which products to offer as a bundle), or whether a transaction is fraudulent. They get better at these predictions every time they acquire new data. However, even as their predictions improve, they only explore data through programmed feature extraction; that is, they look at data only in the ways we programmed them to. They don’t adapt on their own to look at data in a different way.
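To make “programmed feature extraction” concrete, here’s a minimal sketch in plain Python. Everything in it is a hypothetical toy of my own (the loan figures, the `feature` function, the threshold search): the debt-to-income ratio is a feature a programmer wrote by hand, and the algorithm only learns where to draw the line on that one ratio. It never invents a different way to look at the data.

```python
# Hypothetical toy example: predicting loan default from one
# hand-programmed feature (debt-to-income ratio).
past_loans = [
    # (monthly_debt, monthly_income, defaulted)
    (500, 4000, False),
    (900, 3000, False),
    (2200, 3100, True),
    (1800, 2000, True),
    (400, 5000, False),
    (2500, 2600, True),
]

def feature(debt, income):
    # Programmed feature extraction: the model only ever "sees"
    # the data through this ratio, exactly as we wrote it.
    return debt / income

def accuracy(threshold):
    # Fraction of past loans this threshold classifies correctly.
    hits = sum((feature(d, i) > threshold) == y for d, i, y in past_loans)
    return hits / len(past_loans)

# "Training": pick the threshold that best separates past defaults.
candidates = sorted(feature(d, i) for d, i, _ in past_loans)
best = max(candidates, key=accuracy)

def predict_default(debt, income):
    return feature(debt, income) > best
```

Feeding in more past loans can sharpen the threshold, which is the sense in which such a model “gets better with new data,” but the ratio itself never changes unless a human reprograms it.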

Deep learning

Going a step narrower, we can look at the class of algorithms that can learn on their own — the “deep learning” algorithms. Deep learning essentially means that, when exposed to different situations or patterns of data, these algorithms adapt. That’s right, they can adapt on their own, uncovering features in data that we never specifically programmed them to find, and therefore we say they learn on their own. This behavior is what people are often describing when they talk about AI these days.
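As a toy illustration of features being uncovered rather than programmed, the sketch below (plain Python, my own hypothetical setup: sigmoid activations, squared-error loss, hyperparameters picked for the toy) trains a tiny neural network on the XOR pattern. No single hand-written weighting of the raw inputs separates XOR’s classes, so the hidden layer has to form useful intermediate features on its own during training.

```python
import math
import random

random.seed(42)

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR: output is 1 only when exactly one input is 1. We never tell
# the network which input combinations matter.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

H = 4  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    # Hidden activations are the learned features of the input.
    h = [sig(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(H)]
    return h, sig(sum(w2[j] * h[j] for j in range(H)) + b2)

def loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in data)

loss_before = loss()
lr = 1.0
for _ in range(5000):
    for x, y in data:
        h, o = forward(x)
        d_o = (o - y) * o * (1 - o)  # squared-error gradient at the output
        for j in range(H):
            d_h = d_o * w2[j] * h[j] * (1 - h[j])
            w2[j] -= lr * d_o * h[j]
            w1[j][0] -= lr * d_h * x[0]
            w1[j][1] -= lr * d_h * x[1]
            b1[j] -= lr * d_h
        b2 -= lr * d_o
loss_after = loss()
```

After training, the error is lower than where it started, and the hidden weights `w1` encode input combinations the network discovered by itself; inspecting them is the toy version of “uncovering features in data that we never specifically programmed.”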

Is deep learning a new capability?

Deep learning algorithms are not new. They use techniques developed decades ago. I’m a computer engineer, and I recall having programmed deep learning algorithms in one of my university classes. Back then, my AI programs had to run for days to give me an answer, and most of the time the answer wasn’t very accurate. There are a few reasons why:

  • Deep learning algorithms are based on neural networks, which require a lot of processing power to get trained — processing power that didn’t exist back when I was in school.
  • Deep learning algorithms require lots of data to get trained, and I didn’t have that much data back then.

So, even though the concepts have been around, it wasn’t until recently that we could really put deep learning to good use.

What has changed since then? We now have the computing power to process neural networks much faster, and we have tons of data to use as training data to feed these neural networks.

Figure 2 depicts a little bit of history of the excitement around AI.

Fig 2: The excitement around AI began a long time ago

Hopefully now you have a clear understanding of some of the key terms circulating in discussions of AI and a good sense of how AI, machine learning and deep learning relate and differ. In my next post, I’ll do a deep dive into a framework you can follow for your AI efforts — called the data, training and inferencing (DTI) AI model. So please stay tuned.

Meanwhile, if you have questions about AI on IBM Power Systems, or if you’re looking to consult with experienced technical professionals on an AI solution for your business, contact IBM Systems Lab Services.
