Over the last few years, more emphasis than ever has been placed on ensuring the next generation grows up learning digital skills like coding and software development. Nothing encapsulates this idea better than this 2014 Mother Jones article, which poses the question: Is Coding the New Literacy?
The answer, judging by the number of coding boot camps and early-education technology programs, would seem to be yes. If there's one thing technology has shown us repeatedly over the years, though, it's that our predictions about the future of tech become obsolete almost the moment we make them. That means we may be spending precious education resources and time preparing the next generation for the challenges of today, not those of tomorrow.
Here's why I believe this to be the case.
Fighting the Last War
It's well established that the pace of technology innovation and adoption tends to be faster than we expect. That's why history is littered with so many predictions by tech luminaries that have proven laughably inaccurate. For example, DEC chairman and founder Ken Olsen once observed that "There is no reason anyone would want a computer in their home." And Robert Metcalfe, the inventor of Ethernet, predicted in 1995 that "… the Internet will soon go spectacularly supernova and in 1996 catastrophically collapse."
Needless to say, neither of those things happened, but they do reflect a mindset technologists share even today. It's less a matter of poor predictive ability and more that it's hard to see past today's realities when projecting trends into the future. That's the first reason that treating coding like the must-have skill of the 21st century is likely a poor prediction: it reflects the needs of today, not of tomorrow. The right time to start teaching programming in early education would have been years ago, as the Turtle LOGO initiative attempted in the early 1980s.
Technology Already Evolving
The second reason that treating coding skills as a must for today's students is misguided is that the technology industry is already evolving away from human-powered coding. Advances in machine learning are already producing computers that can handle basic coding tasks with no human intervention. A prime early example is DeepCoder, an AI system that uses existing code snippets found online to create whole new programs that satisfy the requirements it's given.
DeepCoder, and solutions like it, rely on the fact that today's programmers don't write much truly unique code. They search the internet for help from other programmers, or for existing solutions to the functions they're trying to build into their work. The same concept animates platforms like IBM's Watson Studio, which lets people construct machine learning models without writing a single line of code. If today's AI solutions are already capable of such feats, what are the odds that today's students will ever need to write code in their future careers?
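To make the snippet-composition idea concrete, here's a minimal, hypothetical sketch in Python. It is not DeepCoder's actual algorithm; it only illustrates the core trick of assembling a new program from a library of existing building blocks until it satisfies a set of input-output examples. The snippet library and function names are invented for illustration.

```python
from itertools import product

# A tiny, invented "snippet library": reusable building blocks a synthesizer
# could draw on instead of writing code from scratch.
SNIPPETS = {
    "sort": sorted,
    "reverse": lambda xs: list(reversed(xs)),
    "drop_negatives": lambda xs: [x for x in xs if x >= 0],
    "double": lambda xs: [2 * x for x in xs],
}

def synthesize(examples, max_depth=3):
    """Brute-force search for a pipeline of snippets that matches every
    input-output example, trying shorter programs first."""
    for depth in range(1, max_depth + 1):
        for names in product(SNIPPETS, repeat=depth):
            def program(xs, steps=names):
                for step in steps:
                    xs = SNIPPETS[step](xs)
                return xs
            if all(program(inp) == out for inp, out in examples):
                return names  # the "program" is just a sequence of snippets
    return None

# The spec is nothing but examples: "keep the non-negative numbers,
# sorted from largest to smallest".
examples = [
    ([3, -1, 2], [3, 2]),
    ([-5, 0, 7, 1], [7, 1, 0]),
]
print(synthesize(examples))  # e.g. ('sort', 'reverse', 'drop_negatives')
```

Even this brute-force toy "writes" a three-step program no human typed, which is the point: the building blocks already exist, and the machine does the assembling.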
A Better Approach
None of this is to say that we shouldn't be making technology education a staple in our schools. Instead, we should be focusing our attention on where technology is going, rather than on where it is right now. For example, as AI solutions start to displace coders today, it's easy to predict that AI development will be a bedrock of the tech industry for the next generation.
We should be leaning into that reality by teaching today's students how to apply computational thinking to problem-solving. It's a more generalized subject that can shape how today's students think in ways that will complement tomorrow's AI solutions. That would be a more productive and forward-looking use of resources than spending them on teaching today's skills to tomorrow's workforce. Of course, we can also choose to keep guessing at what today's kids will need to know in the future, but if our past predictions are any indicator, we'd be better off throwing darts to build new curricula.