Reclaim the power of AI!
seb_
For deep learning, the two things I heard a lot that week were:
Those data scientists, much-vaunted luminaries of insight, spend weeks and months creating a model with just the right features, so that deep learning can give you the answers to your most pressing business questions. They crawl through your plethora of data records and work together with your subject matter experts to find out which data features contribute to the intended results. They are the few enlightened ones able to control the oracle of AI.
Well, that feels a lot like the dawn of computing in the middle of the 20th century, doesn't it? Fueled by the success of the first real computers during and after World War II, the first companies started to utilize those big electronic brains. And in the beginning, only a handful of experts were able to control them. I don't have to tell you that it didn't stay that way.
Now you just work with it. For me and many others, a computer is just a daily work tool. The internet is just a tool. We use them to get, analyze, transform and distribute the most important material today: data. While digitalization is a long-term trend and the digital transformation of basically all industries is still ongoing, it's already old hat for everybody working with data. Accountants, analysts, controllers, business developers, product managers, planners, dispatchers, brokers, project managers - all those typical "spreadsheet acrobats". They all try to get insights out of data, and they all were impacted in some way by developments like analytics, big data, data warehouses, you name it.
The commodification of AI
And now they depend on data scientists? That might be true for this very moment in IT history, but this situation will not last long - it has already started to change. Being able to construct purpose-built multi-layered neural networks might be a requirement now, but it won't be in the future. Writing code to implement them might be required in many cases today, but again: this won't last. IBM researchers are already working on methods to directly convert theoretical deep learning models described in scientific papers into live models ready to be trained on user data. But there are applications that are much more down-to-earth. Two of them are:
H2O.ai Driverless AI
IBM offers H2O.ai's Driverless AI solution on top of PowerAI, our machine learning platform. It's called "driverless" because it essentially replaces the data scientist in the process and returns control to the roles described above - the ones who really own the data. With Driverless AI, they have a tool in their hands that enables them to utilize the possibilities of AI themselves and relieves them of tedious, time-consuming feature engineering. Feature engineering is done to find out which parts of the data the model should use to get the best results. Now imagine a tool that resolves that by massively parallel trial and error - an intelligently optimized brute-force approach to feature engineering, presented through a UI that keeps the coding away from you and shifts your focus back to the data. I tried it in an STU session and was quite impressed.
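To make the "brute-force trial and error" idea concrete, here is a minimal, purely illustrative sketch in Python. It is not the Driverless AI API - all names are hypothetical - and it uses a toy scoring rule instead of training a real model per candidate. The point is just the search pattern: try many feature subsets, score each one, keep the best.

```python
# Illustrative sketch of brute-force feature selection, the simplest
# form of the "trial and error" feature engineering described above.
# A real AutoML tool would train and validate a model for each
# candidate subset (in parallel); here a toy scorer stands in for that.
from itertools import combinations

def score(features, rows, labels):
    # Toy scorer: how often the sign of the sum of the selected
    # features matches the binary label.
    correct = 0
    for row, y in zip(rows, labels):
        pred = 1 if sum(row[i] for i in features) > 0 else 0
        correct += (pred == y)
    return correct / len(rows)

def brute_force_select(rows, labels, n_features):
    # Exhaustively evaluate every non-empty subset of feature indices
    # and return the best-scoring one.
    best_subset, best_score = None, -1.0
    for k in range(1, n_features + 1):
        for subset in combinations(range(n_features), k):
            s = score(subset, rows, labels)
            if s > best_score:
                best_subset, best_score = subset, s
    return best_subset, best_score

# Tiny synthetic data: feature 0 carries the signal, feature 1 is noise.
rows = [(1.0, -2.0), (2.0, 1.0), (-1.5, 2.0), (-0.5, -1.0)]
labels = [1, 1, 0, 0]
print(brute_force_select(rows, labels, 2))  # → ((0,), 1.0)
```

Exhaustive search like this explodes combinatorially, which is exactly why tools in this space combine massive parallelism with smarter search strategies rather than enumerating every subset.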