What will IT look like a decade from now?
We’ve seen enormous growth in big data and analytics over the past decade, and this area of technology will continue to transform how business is done. Beyond that, machine learning breakthroughs will bring new ways of analyzing and using that data.
To start us off thinking about the future of IT, let’s divide IT into three segments based on how problems are solved: by manually coding a model, by collecting and comparing lots of examples, or by automatic modelling (AI and deep learning).
Let’s call these segments:
1. Procedural IT
Procedural data processing is what we’ve traditionally been doing in IT: solving a problem by modelling its parameters, collecting the relevant data and calculating a single result. We call this structured processing or “if…then” logic. Today it still makes up roughly 90 percent of all IT tasks, but users increasingly feel that for many real-world problems, an “educated guess” would be more appropriate than a single fixed answer.
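To make the contrast concrete, here is a minimal sketch of procedural “if…then” logic. The scenario and threshold values are hypothetical, chosen only for illustration: every rule is hand-coded by a programmer, and the output is one deterministic result.

```python
def shipping_cost(order_total: float, express: bool) -> float:
    """Return a single deterministic result from explicit, hand-coded rules."""
    if order_total >= 50.0:
        return 0.0   # free shipping above the threshold
    elif express:
        return 15.0  # flat express fee
    else:
        return 5.0   # standard fee
```

The same inputs always produce the same output; there is no notion of confidence or “educated guess,” which is exactly the limitation the other two segments address.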
2. Statistical IT
Statistical IT leverages “big data”— tons of accumulated examples that can be referred to in real time. This is a growth area, as huge quantities of data are being collected every day by smartphones and Internet of Things technologies. Examples for statistical IT include online shopping recommendations (people who viewed this item also bought…), consumer profiling or fraud detection. Data processing frameworks like Apache Spark and Hadoop can help organizations process these massive samples in real time and get insights and recommendations.
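A “people who viewed this item also bought…” recommendation can be sketched as simple co-occurrence counting over accumulated examples. The baskets below are hypothetical toy data; production systems like Spark run the same idea over billions of records.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories: each inner list is one customer's basket.
baskets = [
    ["laptop", "mouse", "usb-hub"],
    ["laptop", "mouse"],
    ["laptop", "keyboard"],
    ["mouse", "usb-hub"],
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(set(basket)), 2):
        pair_counts[(a, b)] += 1

def also_bought(item: str, top_n: int = 2):
    """Items most frequently bought together with `item`."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [other for other, _ in scores.most_common(top_n)]
```

Note that the answer is statistical, not rule-based: the recommendation emerges from the data itself, and it changes as new baskets accumulate.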
3. Machine learning IT
Machine learning IT will entirely outgrow the other segments over the next 10 years. Reasons include advances in automatic learning technologies, but also a shortage of “legacy” programmers among the next generation of IT professionals relative to the magnitude of problems to be solved. These problems include autonomous driving, human interaction, serious gaming, multi-language dialogue and many more. IBM Watson playing Jeopardy! was an early example of such a learning system: the answers were all in the data, but the problem itself was impossible to model in its entirety. The machine had to develop and weigh its own strategies for winning.
In the six years since IBM Watson’s Jeopardy! appearance, machine learning systems have learned to play complex board games like chess better than any hand-programmed algorithm. Before long, the same will be true for many other coded tasks.
Breakthroughs in machine learning
Artificial intelligence has been in the realm of science fiction for many decades. But in 2006, a quiet breakthrough ignited a revolution: unsupervised feature learning. It’s what toddlers’ brains do all the time: they learn even without a teacher. For machines, this is new.
With machine learning, IT professionals are no longer programming the machine; instead, they are training it to find effective strategies for solving the problem. We’ve found a way to teach neural networks to recognize the picture elements that make up a face, in the same way they learn to recognize the sub-problems that combine into a winning strategy.
Besides recognizing people or gender, these same algorithms can learn to read medical X-ray images or perform quality inspection for electronic circuit board manufacturing. The real revolution here is the simplicity and accessibility of these technologies for every programmer. Upload a set of “good” examples and “bad” examples, and voilà: your AI can classify future images into these categories. Try it for yourself in IBM Watson Developer Cloud.
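The “good examples vs. bad examples” idea can be sketched with a tiny nearest-centroid classifier. The two-dimensional feature vectors below are hypothetical stand-ins for image features (a real service would extract thousands of features from raw pixels), but the principle is the same: the model is learned from labeled examples rather than hand-coded rules.

```python
import math

# Hypothetical labeled training data: feature vectors for each class.
good = [[0.9, 0.8], [0.8, 0.9], [0.95, 0.85]]   # "good" examples
bad  = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.15]]   # "bad" examples

def centroid(points):
    """Average each feature across a set of training examples."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def classify(x, good_c, bad_c):
    """Label a new example by whichever class centroid is closer."""
    return "good" if math.dist(x, good_c) <= math.dist(x, bad_c) else "bad"

good_c, bad_c = centroid(good), centroid(bad)
```

After “training” (computing the centroids), new, unseen examples are classified by proximity; no programmer ever writes an explicit rule for what makes an image “good.”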
Machine learning isn’t the only thing looming in the future of IT. Stay tuned for the second part of this series, where I’ll discuss what neural processing and deep learning could mean for the future of business.
If your company is looking for insights on how to become a cognitive business, contact IBM Systems Lab Services today. We have leaders with proven experience delivering cognitive solutions to leading organizations around the world.