What do oil platforms, automobiles, and smart phones have in common? It may not be obvious, but they are in fact all important examples of edge computing.
The concept of edge computing has evolved significantly over the last half century. In the past, an oil platform in the North Sea offered an excellent example of the “edge” of an organization’s IT infrastructure or internal network. The North Sea, which lies between the UK and northern Europe, boasts 184 offshore rigs–the highest number of any region in the world. Because network bandwidth may be minimal and connectivity intermittent at best in the remote, stormy North Sea, some compute resources sit on the oil platforms themselves, far from company headquarters. But building a full data center in the North Sea isn’t necessary; it would just waste money and resources.
Traditionally, branch offices, factories, remote operational locations, research stations, and similar environments were all common examples of edge locations. But the advent of new technologies and architectures such as smart devices and the Internet of Things (IoT) is ushering in a whole new paradigm of edge computing. Now cars, for example, have essentially become edge locations. And one of the most common objects on our planet these days–the smart phone–is an edge device.
Interestingly, artificial intelligence (AI) is poised to drive an explosion of edge computing demand. Imagine an edge-located car that is an AI-powered autonomous driving (AD) vehicle. Along with all the telemetry cars already produce and broadcast to home base, an AI/AD system is expected to generate data volumes reaching terabytes per vehicle per day and hundreds of exabytes across entire AD initiatives.
Such data streams will overwhelm IoT backbones. Add to that flood the increased traffic and heightened smart device user expectations that will come with the rollout of new 5G networks. Bandwidth may increase substantially with 5G, but demand and usage are set to explode.
Edge computing offers a powerful strategy for alleviating the future network congestion driven by new technologies such as AI, the IoT, and 5G. What if edge devices didn’t call home? Or at least called the neighbors first? What if a new breed of edge installations evolved, with intermediate compute resources that intercept much of the raw data streaming in from AD vehicles, smart phones, and rich-media augmented reality games and entertainment clients?
Perhaps not so different from North Sea oil platforms or remote office locations, this new breed of IT infrastructure edge installations would provide the initial data processing resources much closer to the individual AD vehicle, smart phone, and gaming console. Local computing resources could help manage automobile traffic flows or provide augmented reality data for smart phones, for example, without the need to “phone home” to a central location.
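As a minimal sketch of that idea, an intermediate edge node might buffer raw device telemetry locally and forward only compact summaries upstream. The class and field names here, and the five-reading summary window, are illustrative assumptions, not any vendor’s API:

```python
import statistics
from dataclasses import dataclass, field

@dataclass
class EdgeGateway:
    """Hypothetical intermediate edge node: buffers raw telemetry
    locally and forwards only compact summaries toward HQ."""
    window_size: int = 5                                  # readings per summary (assumption)
    buffer: list = field(default_factory=list)            # raw readings, kept local
    uplink: list = field(default_factory=list)            # stands in for the WAN link to HQ

    def ingest(self, reading: float) -> None:
        # Raw readings stay on the edge node; nothing crosses the WAN yet.
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            self.flush()

    def flush(self) -> None:
        # Only a small summary travels upstream, not the raw stream.
        summary = {
            "count": len(self.buffer),
            "mean": statistics.fmean(self.buffer),
            "max": max(self.buffer),
        }
        self.uplink.append(summary)
        self.buffer.clear()

# A vehicle emitting ten speed readings yields just two upstream messages.
gw = EdgeGateway()
for speed in [62.0, 63.5, 61.8, 64.2, 65.0, 66.1, 64.9, 63.0, 62.5, 61.0]:
    gw.ingest(speed)
print(len(gw.uplink))  # 2 summaries instead of 10 raw readings
```

The point of the sketch is the traffic shape, not the arithmetic: the central site still receives the aggregates it needs for analytics, while the chatty raw stream never leaves the neighborhood.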
How would we benefit? Clearly, network traffic and potential congestion and contention would be reduced. Data needed for analytics-driven business decision-making back at corporate HQ would still be available, just not on as short a fuse. Just as importantly, local processing would slash the response times of nearby remote client applications, leading to better end-user experiences. In fact, the business success of the 5G rollout, for example, may hinge largely on the quality of end-user experiences. When movies download faster, when AD vehicles inspire confidence, when game avatars move fluidly, when shopping apps respond instantly, end users are much happier–and buy more.
But the 21st-century versions of edge computing won’t happen by magic, though the results may seem almost magical. Edge compute solutions will still struggle against the same sorts of challenges as those in the North Sea. Cost is always a big concern, and cost is always tied to efficiency, which is often linked to scalability and flexibility. Performance can’t be sacrificed. And who will be out there on the edge, managing remote installations?