February 19, 2020 By Jeffrey Palmer 3 min read

Learn how autonomous management will revolutionize your edge computing approach.

If you’re a CIO in retail, manufacturing, distribution, banking, or just about any other industry, you’re building a strategy that will empower your lines of business to increase revenue and reduce costs by adopting a massively decentralized computing architecture, better known as edge computing.

However, you’re concerned that traditional management approaches aren’t designed to manage and secure a topology with tens of thousands of edge servers and hundreds of thousands of edge devices in a cost-effective manner.  

How to handle the scale, variability, and rate of change of 5G and enterprise edge applications

As computing expands, the scale, variability, and rate of change of edge environments present a challenge that traditional management software was not designed for. It was originally built for centralized topologies with many servers (often standardized) in a few data centers or public cloud regions, with infrequent changes to the environment.

Now with edge, you have tens of thousands of servers and hundreds of thousands of edge devices that are far more heterogeneous, deployed across thousands of remote locations, with new locations, devices, and servers added continuously. It’s nearly impossible for an administrator to keep track of the topology and its relevant differences, which is critical when deploying new applications to the edge.

“Developing a successful edge strategy requires taking into consideration the complexity of management endpoints and recognizing that scale and variability are dramatically different from traditional on-premises or public cloud deployments.”

IDC White Paper, sponsored by IBM, “The Importance of Effective Operations in Unlocking Edge IT Value,” January 2020

Move to autonomous management

The massive scale, variability, and rate of change of edge environments require a new approach: autonomous management. Management actions such as deployment negotiation, agreement, execution, and ongoing validation of workloads must be offloaded from the human administrator onto autonomous management software.

This is achieved through asynchronous communication between software agents on edge endpoints and a management hub, which together constitute the autonomous management software. Actions are carried out according to the administrator’s intent but without the administrator’s intervention, even during the onboarding of new endpoints.
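The agent/hub negotiation pattern can be sketched as a pair of asynchronous message exchanges. This is a toy illustration only: the message shapes, node names, and in-process queues are assumptions made for clarity (a real system would use a network protocol between distributed endpoints).

```python
# Toy sketch of asynchronous agent/hub negotiation.
# Message fields ("proposal", "agreement", node/policy names) are
# illustrative assumptions, not an actual product API.
import queue

hub_inbox = queue.Queue()
agent_inbox = queue.Queue()

# The edge agent proposes: "my endpoint matches this deployment policy"
hub_inbox.put({"type": "proposal", "node": "edge-042", "policy": "app-v1"})

# The management hub processes the proposal asynchronously and
# replies with an agreement
msg = hub_inbox.get()
agent_inbox.put({"type": "agreement", "node": msg["node"], "policy": msg["policy"]})

# The agent receives the agreement and can start the workload
reply = agent_inbox.get()
print(reply["type"])  # agreement
```

The point of the queues is that neither side blocks waiting on the other; each acts on messages when it gets to them, which is what lets the pattern scale to many endpoints.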


Administrators express an intent and autonomous management software executes in line with the intent

Here’s an example: “Deploy this application on any edge server with OpenShift 4.2, 8 cores free, 2GB of memory, 1TB of storage, not located in Toronto, Canada, not running an application owned by dept ABCD.”
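An intent like this is essentially a set of constraints evaluated against each endpoint’s properties. A minimal sketch of that evaluation follows; the `Node` structure and its field names are hypothetical, invented here for illustration rather than taken from any product API.

```python
# Hypothetical sketch of matching an endpoint against the example
# intent above. The Node fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Node:
    platform: str
    free_cores: int
    free_memory_gb: int
    free_storage_tb: int
    city: str
    app_owners: set

def matches_intent(node: Node) -> bool:
    """True if the node satisfies every constraint in the example intent."""
    return (
        node.platform == "OpenShift 4.2"
        and node.free_cores >= 8
        and node.free_memory_gb >= 2
        and node.free_storage_tb >= 1
        and node.city != "Toronto"
        and "ABCD" not in node.app_owners
    )

node = Node("OpenShift 4.2", 16, 32, 2, "Austin", {"EFGH"})
print(matches_intent(node))  # True
```

A node in Toronto, or one running a dept ABCD application, would fail the check and simply never receive the deployment.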

Edge software agents autonomously decide if the edge endpoints they represent meet the intent set by the administrator, and if so, automatically initiate the installation of the application. The software agents continuously check for agreement validity over time. For instance, if the version of Red Hat OpenShift is later upgraded to 4.3, the agent can notify the administrator that the agreement is no longer valid, or it can automatically shut down the application, all without administrator initiation or intervention.  
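The ongoing validation described above can be sketched as a re-check of the original agreement against the endpoint’s current state, with a hook for remediation. The node shape, intent fields, and callback are assumptions made for illustration, not an actual agent interface.

```python
# Illustrative sketch of continuous agreement validation by an edge
# agent. Field names and the on_invalid hook are assumptions.
def validate_agreement(node: dict, intent: dict, on_invalid) -> bool:
    """Return True while the node still satisfies the intent; otherwise
    call on_invalid (e.g., notify the admin or stop the workload)."""
    if node["platform"] != intent["platform"]:
        on_invalid(node, f"platform drifted to {node['platform']}")
        return False
    return True

events = []
node = {"platform": "OpenShift 4.3"}    # endpoint upgraded after deployment
intent = {"platform": "OpenShift 4.2"}  # the original agreement
ok = validate_agreement(node, intent, lambda n, why: events.append(why))
print(ok, events)  # False ['platform drifted to OpenShift 4.3']
```

In this sketch the OpenShift 4.3 upgrade invalidates the agreement, and the agent's `on_invalid` hook fires without any human initiating the check.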

With autonomous management, a single administrator can manage tens of thousands of endpoints without human initiation or intervention. 

Your house is becoming autonomous. Isn’t it time your management software followed suit?

Take your next steps in edge computing

This is truly an amazing time. The convergence of 5G, edge computing, and AI will spark a level of innovation that hasn’t been seen before. IBM believes enterprises can use edge IT to enable faster insights and actions, maintain continuous operations, and provide new customer experiences.  
