December 3, 2019 By Ashok Iyengar 6 min read

How do DevOps and Edge computing interact?

Several people have pointed out, rightfully so, that edge computing blogs mainly deal with the operational side of things, such as device registration and operations, network management, etc.

This fourth blog in the series on edge computing touches on DevOps in an attempt to answer questions like the following:

  • Is there any coding to be done when it comes to edge computing solutions?
  • What do edge applications look like?
  • Is there a programming model for edge applications?

If the distributed architecture of edge computing means running applications and services on edge devices, there has to be a methodology to code, test, deploy, and run those apps.

Please make sure to check out all the installments in this series of blog posts on edge computing.

Non-functional requirements (NFRs)

The non-functional requirements (NFRs) for DevOps or DevSecOps are well known—continuous integration, continuous delivery, continuous testing, and continuous deployment, all in a secure framework.

There are a few nuances when it comes to developing and deploying edge applications and solutions: scaling, types of devices, application footprint, operating speed, and disconnected operation.

While there is a need for providing a framework to securely deliver consistent and reliable software to the devices quickly, something unique to edge is the scaling factor. Imagine the second largest bank in the United States—which has 16,220 ATMs across the country—wanting to apply an update to its ATMs.

There are a plethora of edge devices. They come in different shapes, sizes, makes, and models from hundreds of manufacturers. Coding to the lowest common denominator makes things interesting. Some are audio devices, some are visual devices, while others have audio-visual capabilities. But the mantra developers have to keep in mind is “write once, deploy everywhere.”

Then, there is the size of edge applications. We have all heard of cloud-native development when dealing with the cloud. With edge, we are looking at edge-native applications, which tend to have a smaller footprint. And, more importantly, they have to operate at very high speeds, especially when it comes to inferencing at the edge. Note that there are hardware accelerators that significantly improve the performance of inference applications on the edge.

Finally, the edge, unlike the cloud, can be unstable and even disconnected by design. There can be many points of failure in an edge solution. “Keep it simple” is not a cliché, but a rule of edge-native applications since they have to be ready to scale back to the cloud at any point. Any and all data at the edge should be considered ephemeral.

Tools, toolchains, and frameworks

Providing tools and a framework for securely delivering consistent and reliable software as fast as possible to all connected devices will be key. The previous blog in this series, “Architecting at the Edge,” showed the IBM Edge Computing Reference Architecture:

In it, you will see that Linux and Docker containers are the most common technologies for edge-based applications, along with Kubernetes for container orchestration.

One could use any toolchain that deploys to Kubernetes or Docker (examples can be found here). A Git repository, an Eclipse or Web IDE, and a delivery pipeline with Jenkins and Terraform would be the main components in the toolchain.
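To make the pipeline concrete, here is a minimal sketch of the stages such a delivery pipeline would run. The image name, manifest file, and `RUN_PIPELINE` guard are illustrative assumptions; in practice these steps would execute inside Jenkins, with Terraform provisioning the target infrastructure:

```shell
#!/bin/sh
# Sketch of the delivery-pipeline stages for the toolchain above.
# Image name, registry, and manifest file are illustrative.
set -e
IMAGE="registry.example.com/edge/helloworld:1.0.0"

stage() { echo "stage: $1"; }

stage "build"
if [ "${RUN_PIPELINE:-false}" = "true" ]; then
  docker build -t "$IMAGE" .       # build the container image
fi

stage "push"
if [ "${RUN_PIPELINE:-false}" = "true" ]; then
  docker push "$IMAGE"             # publish to the registry
fi

stage "deploy"
if [ "${RUN_PIPELINE:-false}" = "true" ]; then
  kubectl apply -f deployment.yaml # roll out to the Kubernetes cluster
fi
```

Setting `RUN_PIPELINE=true` would execute the real commands; by default the script only prints the stage names, which is useful when wiring up the toolchain.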

There are two very distinct endpoints in an edge computing solution—edge servers and edge devices. We envision two toolchains. The first toolchain would be used to deploy the edge server infrastructure:

A sample DevOps toolchain for deploying Edge server infrastructure.

The second toolchain would have at least two pipelines: one to deploy applications to the edge servers (which are Kubernetes-based container platforms) and the other to deploy to edge devices based on ARM architecture (running Docker-based applications):
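Because the two pipelines target different CPU architectures, a single multi-architecture image build can feed both of them. The sketch below assumes Docker buildx is available; the repository name and platform list are illustrative:

```shell
#!/bin/sh
# Sketch: build one image for both pipeline targets -- x86-based edge
# servers and ARM-based edge devices. Assumes Docker buildx.
IMAGE="myrepo/ibm.helloworld:1.0.0"
PLATFORMS="linux/amd64,linux/arm/v7,linux/arm64"

if [ "${RUN_BUILD:-false}" = "true" ]; then
  # Build and push a manifest covering all listed platforms
  docker buildx build --platform "$PLATFORMS" -t "$IMAGE" --push .
else
  echo "dry run: docker buildx build --platform $PLATFORMS -t $IMAGE --push ."
fi
```

Each pipeline then pulls the same tag, and the container runtime on the edge server or device selects the matching architecture automatically.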

A sample DevOps toolchain for deploying Edge applications.

Edge applications

Let’s look at developing our first edge app—you guessed it, a Hello World service, running on a device like a Raspberry Pi. One needs a Docker Hub ID and access to GitHub. The following are the high-level deployment steps:

  • Set up IBM edge infrastructure
    • Install Edge Exchange and Agreement Bot (agbot)
  • Develop Edge application
    • Build, test, push, and publish the service to the IBM Edge Exchange
  • Deploy edge application
    • Register edge node to run this pattern
    • Install the agent on edge device and configure it to point to the Edge Exchange
    • Device will make an agreement with one of the Edge Exchange agbots
    • Run the pattern and observe the output

Given that Hello World is a simple application, it would only be deployed to the edge devices. If it were an artificial intelligence (AI) application, the full-blown version of it would be deployed on the edge server while a leaner version would get deployed on the edge devices.

The code for the simple Hello World service is shown below. It outputs a line that says “Hello World” every three seconds:

#!/bin/sh
# Very simple sample edge service.
while true; do
  echo "$HZN_DEVICE_ID says: Hello World!!"
  sleep 3
done

The detailed steps and code for the HelloWorld service can be found on GitHub.
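Before it can be published to the Exchange, the shell script above is packaged as a container image. A minimal Dockerfile might look like the following; the base image and file names are illustrative assumptions, not the exact ones from the GitHub repository:

```shell
#!/bin/sh
# Sketch: package the Hello World script as a container image.
# Base image and file names are illustrative.
cat > Dockerfile <<'EOF'
FROM alpine:3.18
COPY service.sh /service.sh
RUN chmod +x /service.sh
CMD ["/service.sh"]
EOF
echo "Dockerfile written"
```

The resulting image is what gets pushed to Docker Hub and referenced in the service definition published to the Edge Exchange.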

The link to instructions on how to deploy workloads to the edge can be found in the references section.

Edge policies

Policies are rules or constraints that give edge node owners, service code developers, and deployment owners much finer control over where edge services are placed. Policies can restrict where a service may run—for example, by requiring a particular hardware setup such as CPU/GPU capabilities, a minimum amount of memory, or specific sensors, actuators, or other peripheral devices.
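A node policy expressing constraints like these might look as follows. The property names and schema here are illustrative; consult the Open Horizon documentation for the exact policy format and `hzn policy` flags:

```shell
#!/bin/sh
# Sketch of a node policy capturing hardware properties and constraints.
# Property names and schema are illustrative.
cat > node.policy.json <<'EOF'
{
  "properties": [
    { "name": "deviceType", "value": "raspberry-pi" },
    { "name": "hasCamera", "value": true }
  ],
  "constraints": [
    "memoryMB >= 512"
  ]
}
EOF

if command -v hzn >/dev/null 2>&1; then
  # Apply the policy to this edge node
  hzn policy update -f node.policy.json
else
  echo "hzn CLI not installed; policy file written for illustration"
fi
```

At deployment time, the agbot matches service and deployment policies against node properties like these to decide which nodes receive the workload.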

With the Exchange created, the Service coded, the Node registered, and Policies specified, the device—in this case the Raspberry Pi—should be continuously transmitting the Hello World message as shown below:

Aug 15 18:21:21 raspberrypi workload-58c94e71fece2d994e187d07b6bd179cc798fa0b79ddfe8c017c6fbc4cd9a47f_ibm.helloworld[452]: mynode says: Hello World!!
Aug 15 18:21:24 raspberrypi workload-58c94e71fece2d994e187d07b6bd179cc798fa0b79ddfe8c017c6fbc4cd9a47f_ibm.helloworld[452]: mynode says: Hello World!!
Aug 15 18:21:27 raspberrypi workload-58c94e71fece2d994e187d07b6bd179cc798fa0b79ddfe8c017c6fbc4cd9a47f_ibm.helloworld[452]: mynode says: Hello World!!

The IBM Cloud architecture center offers up many hybrid cloud and multicloud reference architectures, including the newly published edge computing reference architecture.

For more information on edge computing, see the other parts of this series and a few other important references:

Thanks to David Booz and Steven Cotugno for reviewing the article.
