Avoiding the "Megapixel Method" When It Comes to AI


Key considerations to help make the best decision about the AI for you.

A number of years ago, I was in the market for a good digital camera. As you’d expect, the internet provided an overwhelming amount of information on the topic. One thing always stuck in my mind, though: the industry and consumers had adopted the number of megapixels as the Key Performance Indicator (KPI) of picture quality without a really good reason to do so. Megapixel count is just one of many factors that determine picture quality.

Even if you look at digital cameras in phones today, the number of megapixels is still prominently displayed. If you buy a camera solely on megapixels, you may not be very happy to learn that cameras with fewer megapixels can often take better pictures.

From an artificial intelligence (AI) perspective, I am jealous. Rightly or wrongly, when it comes to cameras, there is an agreed-upon benchmark that consumers look for and understand. Unfortunately, the same is not true for AI. There is no single benchmark for a customer to assess, for example, whether one vendor’s AI is better than a competitor’s AI.

However, there are certainly key considerations to help you make the best decision.

Your needs

Let’s assume that the multiple AI products you are evaluating offer similar value. They all claim to offer anomaly detection on log and metric data, event correlation, and the ability to analyse and use unstructured data.

Data requirements

Does a potential vendor’s AI work on all of your data? Even the proprietary data that is relevant to your business? If not, why not? Perhaps it is tied to a specific technology because it relies on hard-coded rules, hard-coded content or pre-trained models. The ideal solution would work with the data you have today and the data you may have in the years to come.

Does the AI work immediately? The ideal solution would work with whatever data is available, whether that covers hours, days or years, and automatically choose the AI techniques best suited to it. It might apply one set of techniques to sparse or new data and graduate to more sophisticated deep-learning approaches once a large amount of data has been accrued, but, fundamentally, it should provide value regardless.
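As a rough illustration of that graduated approach, here is a minimal sketch in Python. Everything in it is an illustrative assumption, not any vendor's actual method: the 30-point cutoff, the crude median rule for sparse data and the z-score rule for richer data are all placeholders for whatever techniques a real system would select.

```python
from statistics import mean, stdev

def detect_anomalies(values, z_threshold=3.0):
    """Pick an anomaly-detection technique based on how much data exists.

    Illustrative sketch only: the 30-point cutoff and both rules are
    assumptions. A richer system might graduate to deep-learning models
    at much larger data volumes.
    """
    if len(values) < 30:
        # Sparse or new data: a crude rule that needs no distributional
        # assumptions -- flag values far above the median.
        ordered = sorted(values)
        median = ordered[len(ordered) // 2]
        return [i for i, v in enumerate(values) if median > 0 and v > 10 * median]
    # Enough history: flag points more than z_threshold standard
    # deviations from the mean.
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > z_threshold]
```

The point is not the specific rules, but that the system chooses them from the data itself and still provides some value with only a handful of samples.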

In IBM Cloud Pak® for Watson AIOps, our AI works with all of your data — both IBM and non-IBM. We use a variety of techniques to ensure that we can give the best insights with the data you have available and the sophistication to continue to yield better results with more data.

Tuning and configuration

How does the AI train? How is it tuned? When you purchase the product, do you have to pay for an additional service to make it work? Do you incur more charges if the vendor has to come back in six months and optimise it again?

In my opinion, if a human has to tune the AI after it is deployed, then it is not really AI. The AI should know whether it is doing a good job, based on feedback or on internal assessment, and adjust accordingly. If you are asked to supply the parameters that configure the AI, that is also suspect. The ideal parameters are best determined by analysing the data, and the AI should do this automatically. In fact, the only way I could accurately specify those parameters myself would be to run one or more experiments, which is something the AI should do itself.
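One way such self-tuning can work is to treat operator feedback as labelled data and let the system run its own experiments over candidate parameters. A minimal sketch, assuming a hypothetical feedback log of (anomaly score, was it a real incident) pairs; the function name and the F1 objective are my assumptions, not a specific product's API:

```python
def self_tune_threshold(feedback):
    """Choose an alerting threshold from feedback instead of asking a human.

    `feedback` is a hypothetical list of (anomaly_score, was_real_incident)
    pairs collected while the system runs. The system performs its own
    'experiments': it evaluates every candidate threshold and keeps the
    one with the best F1 score.
    """
    best_threshold, best_f1 = None, -1.0
    for t in sorted({score for score, _ in feedback}):
        tp = sum(1 for s, real in feedback if s >= t and real)
        fp = sum(1 for s, real in feedback if s >= t and not real)
        fn = sum(1 for s, real in feedback if s < t and real)
        if tp == 0:
            continue  # This threshold catches no real incidents.
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        f1 = 2 * precision * recall / (precision + recall)
        if f1 > best_f1:
            best_threshold, best_f1 = t, f1
    return best_threshold
```

Re-running this as new feedback arrives is exactly the kind of ongoing experiment a human would otherwise be paid to perform.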

In IBM Cloud Pak for Watson AIOps, our AI is self-tuning and self-configuring. There will not be an ongoing investment required to ensure the AI continues to perform at its best.

Scale

How much data can the AI analyse? Can it do it efficiently? In modern environments where everything is in the cloud, customers need to be able to deploy the analytics on the servers they have available, which will typically have limited CPU and memory. Gone are the days when we could ask a customer for a large monolithic server; such machines are scarce in the cloud and in virtualised environments. The AI must be able to distribute across a number of small servers and scale to the customer's needs.
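One common way to achieve that kind of distribution is to hash-partition the monitored entities across workers, so that each small server owns a disjoint slice of the state. A minimal sketch (the function and the partitioning scheme are illustrative assumptions, not a description of any specific product):

```python
import hashlib

def assign_partitions(entity_ids, n_workers):
    """Hash-partition monitored entities across workers.

    Illustrative sketch: each metric stream is owned by exactly one
    worker, so per-entity state never needs to be shared and the
    analytics can run on many small servers instead of one large one.
    """
    assignment = {}
    for entity in entity_ids:
        digest = hashlib.sha256(entity.encode("utf-8")).hexdigest()
        # Stable assignment: the same entity always maps to the same worker.
        assignment[entity] = int(digest, 16) % n_workers
    return assignment
```

Because the mapping is deterministic, adding capacity is a matter of re-running the assignment with a larger `n_workers` rather than buying a bigger machine.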

At IBM, we have some of the largest customers in the world, and one of our challenges is to ensure the AI can scale to their needs and run on the hardware they have available. Having AI that is optimised, distributable and scalable is a minimum requirement for delivery.

Implementing AI that’s fit for purpose

There may not be a unified benchmark for AI across the industry and its customers, but there are core considerations and key questions you can ask to understand whether the AI is real and fit for purpose.

Learn more about IBM Cloud Pak® for Watson AIOps.
