Setting an AI strategy to unlock the value of your data
It’s been said that data is the most valuable resource on the planet. Yet most companies aren’t getting the maximum value out of their data. The top three needs in the marketplace today are defining a data strategy, filling the skills shortage, and operationalizing and industrializing AI. In fact, while AI is helping companies gain competitive advantage in a growing range of industries, Forrester notes that 51 percent find optimizing, sustaining and expanding AI capabilities challenging.
Why is that? The fact is that, in a business context, AI is fairly new. It can also be a bit intimidating. So I’d like to share some quick thoughts on how to build an AI strategy and how to measure its success.
Formulating your AI strategy
Ultimately, of course, the success of any AI project has to be measured in dollars and cents. As you formulate an AI strategy, it’s important to identify and prioritize the use cases that have the greatest potential to generate value, whether through cost savings or net new revenue. That said, while it’s appealing to chase a big bang project that promises great benefits, it may not be prudent to go for the biggest bang first. A better approach is to focus on smaller projects that can deliver the most value in the shortest amount of time. Big bang use cases should be broken down into their component parts, for two reasons: first, you need to create value quickly; second, big projects do not fit an Agile methodology.
Leveraging Agile methodologies in the data science process is the way to go. With Agile, you can start delivering value quickly, in two-week sprints. If you have some grand overall objective, you can break it down into smaller, lower risk components that have the potential to be duplicated or repurposed in other areas of the business. It’s also important to use good coding practices so that, even if your project is somewhat experimental, the AI can go directly into production, without the code having to be rewritten at the end.
If you love your data scientists, set them free
Companies today are collecting enormous volumes of data. But they are struggling to find the skills needed to operationalize AI. There is a severe shortage of data scientists. Fortunately, however, there are tools that can help you dramatically increase the productivity of your data scientists.
For example, we offer a solution called AutoAI with IBM Watson Studio, which automates many of the manual steps in the AI lifecycle management process—steps that can consume as much as 80 percent of a data scientist’s time. This frees them up for the value-added work of optimizing and customizing AI to meet the needs of the business. When you feed data into AutoAI, it identifies the features that are most important. It performs feature engineering and model selection, as well as hyperparameter tuning and optimization. AutoAI can be integrated with Watson OpenScale, which can detect and correct both bias and drift in accuracy in AI models. Both offerings are part of IBM Cloud Pak for Data.
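To make the idea concrete, here is a minimal sketch of what automated model selection and hyperparameter tuning look like in code. This is not the AutoAI API; it uses scikit-learn as a stand-in to illustrate the kind of search AutoAI automates across feature selection, candidate models, and hyperparameters.

```python
# Illustrative sketch (NOT the AutoAI API): automating feature selection,
# model selection and hyperparameter tuning with scikit-learn as an analogy.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic data standing in for a real business data set.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif)),  # automated feature selection
    ("model", LogisticRegression(max_iter=1000)),
])

# One search covers candidate models AND their hyperparameters.
param_grid = [
    {"select__k": [5, 10],
     "model": [LogisticRegression(max_iter=1000)],
     "model__C": [0.1, 1.0]},
    {"select__k": [5, 10],
     "model": [RandomForestClassifier(random_state=0)],
     "model__n_estimators": [50, 100]},
]

search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X_train, y_train)
print(search.best_params_)          # the winning model and settings
print(search.best_estimator_.score(X_test, y_test))
```

Even this toy grid evaluates dozens of pipeline configurations; doing that by hand is exactly the repetitive work that consumes so much of a data scientist’s time.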
Automating manual steps helps data scientists do more satisfying work and have a bigger impact, and that helps organizations attract and retain the best data science talent.
Another benefit of AutoAI is that it generates Python code that can be reused with some fine-tuning. This enables data scientists to better understand the model and enhance it. Having the Python code available also allows extraction of important parts of the model, such as engineered features. These can be offered in a feature store, such as the one in Watson Knowledge Catalog, a part of IBM Cloud Pak for Data, and then put to valuable further use.
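The source doesn’t show the exact format of AutoAI’s generated code, so here is a hypothetical illustration of the underlying idea using a plain scikit-learn pipeline: once a model exists as code, its feature-engineering step can be pulled out on its own and reused elsewhere, which is the pattern a feature store builds on.

```python
# Hypothetical illustration (not Watson Knowledge Catalog's actual API):
# extracting the engineered-feature step from a fitted pipeline so it can
# be registered in a feature store and reused by other models or teams.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X[:, 0] * X[:, 1] + rng.normal(scale=0.1, size=200)

pipe = Pipeline([
    ("engineer", PolynomialFeatures(degree=2, include_bias=False)),
    ("scale", StandardScaler()),
    ("model", Ridge()),
]).fit(X, y)

# Pull out just the fitted feature-engineering step...
feature_step = pipe.named_steps["engineer"]

# ...and reuse it to transform fresh data for an entirely different purpose.
X_new = rng.normal(size=(5, 3))
engineered = feature_step.transform(X_new)
print(engineered.shape)  # (5, 9): 3 raw features expanded to 9 engineered ones
```

Because the transformation lives in code rather than inside an opaque model artifact, it can be versioned, cataloged, and shared, which is what makes engineered features reusable assets rather than one-off work.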
AI for your AI
A great example of how automation helps enhance the productivity of data scientists was offered by Wunderman Thompson at this year’s IBM Data and AI Forum. Their story is one of the most impactful use cases I’ve heard. They are a creative agency with about 200 offices worldwide, and they had a serious problem with their data sets, which had about 17,000 features. Every time the company wanted to produce a custom solution for a customer, a data scientist had to determine which of the 17,000 features were right for that particular use case and then engineer them to meet the customer’s needs. It was an intractable problem they’d been wrestling with for eight years. Then they contacted IBM and the IBM Data Science Elite team and, by using AutoAI, were able to solve the problem and get the AI into production in only two months. The models that were produced outperformed previous models by 200 percent.
Learn more about how they did this by watching my webcast conversation with Wunderman Thompson’s President and Chief Product Officer Michael Murray. And get more tips on scaling AI for growth and innovation.
Get the business team onboard
Finally, as you plan a project, make sure that your business users are involved and will be ready and willing to use the AI once it’s deployed. After all, the best AI in the world has no value if it doesn’t get used. As you gain experience and start delivering results, you can build support internally, and generate the confidence and enthusiasm you need to tackle more and bigger AI projects in the future.
Share the excitement about AI
The potential of AI is very exciting for those of us who live and breathe data science. But let’s face it, change is hard. Watching competitors pass you by, though, is even harder. So take a look at our client success stories and see how much value AI and data science are bringing to organizations like yours. I think you’ll get excited too.
Please visit the IBM Cloud Pak for Data web page to learn more about IBM AI capabilities.
Join my webcast conversation with Wunderman Thompson’s President and Chief Product Officer Michael Murray to learn tips on how to scale AI for growth and innovation throughout your organization.