November 11, 2016
By Nick Maynard and Ilene Seelemann
3 min read

Building a DevOps pipeline for an API Connect and Microservices architecture

There are many lessons to learn from a DevOps implementation in a greenfield development project in which 20 new cloud-native microservices were built and deployed to Bluemix. Our client’s business partners created end-user applications on top of these services, driving mindshare and loyalty to our client’s brand.

In this post, we’ll talk about how we used API Connect to socialize, document, and govern access to these microservices for the business partners who consumed them. We’ll examine how those partners used the APIs to build high-value web and mobile applications that leveraged the data and function the microservices provided. We’ll also show how we evolved our development practice using Agile and DevOps techniques to streamline the deployment of these microservices, dramatically shortening the time to production for updates while reducing risk.

Understanding the challenges of managing the solution

To manage the solution, we had to address several challenges:

  1. Management of build artifacts

  2. Testing microservice versions for promotion

  3. Configuring microservice environments

  4. Deploying artifacts to live environments while maximizing uptime

  5. Packaging APIs into meaningful, controllable sets

  6. Managing versioning and microservice compatibility (interoperability)

  7. Aligning the presented API layer with the microservice deployments

  8. Monitoring state

The surface area of the solution included approximately 20 microservices deployed across the four environments, the databases in each environment, and API Connect with its configuration of APIs and products. While 20 microservices may seem relatively small, across the environments the microservices and APIs added up to more than 100 moving parts that had to be managed consistently, in a repeatable and predictable manner. Given the project’s rapid pace of change, this couldn’t be done efficiently without automation, even with a dedicated, skilled operations team.

Topology of microservices and API Connect on Bluemix

Our architecture consisted of three main layers:

  1. API Connect providing governance and a consistent access point for all microservices

  2. A set of microservices providing domain-specific features for clients and interfacing with third-party services

  3. Data storage in dashDB and IBM Cloudant


This architecture was completely replicated in four environments:

  1. Development: Tracking the latest “development” code versions

  2. QA: The latest code release for integration test runs

  3. Pre-production: For partner integration and testing

  4. Production

We modeled these environments as separate Bluemix spaces within a single Bluemix organization. We provided project members with per-space permissions appropriate to their roles, allowing isolation of concerns and confidentiality of production information.

We provisioned a single API Connect instance in a space isolated from the environment-specific spaces. This allowed us to handle all API configuration, management, deployment, and monitoring from this central instance of API Connect. The API products defined in API Connect were parameterized so that they proxied to the microservices in each of the environments. As each microservice moved through the development life cycle, the APIs that exposed it stayed aligned with it, pointing to the correct version in each environment.
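To illustrate this parameterization, the sketch below shows roughly what such an API definition could look like in API Connect v5, with an invoke policy whose target URL resolves a property per catalog. The API name, paths, property names, hosts, and catalog names are hypothetical examples, not our actual configuration.

```yaml
# Sketch of a parameterized API Connect (v5) API definition.
# All names, hosts, and catalog identifiers below are hypothetical examples.
swagger: '2.0'
info:
  title: Orders API
  version: 1.0.0
basePath: /orders
paths:
  /{orderId}:
    get:
      parameters:
        - name: orderId
          in: path
          required: true
          type: string
      responses:
        '200':
          description: Order details
x-ibm-configuration:
  assembly:
    execute:
      # The invoke policy proxies the request to the backing microservice.
      # $(runtime-host) is resolved from the properties defined below, so one
      # API definition can target a different host in each catalog.
      - invoke:
          title: proxy-to-microservice
          target-url: 'https://$(runtime-host)/api/orders'
  properties:
    runtime-host:
      value: orders-service-dev.mybluemix.net   # default: development space
  catalogs:
    qa:
      properties:
        runtime-host: orders-service-qa.mybluemix.net
    production:
      properties:
        runtime-host: orders-service.mybluemix.net
```

With an approach like this, promoting an API is largely a matter of publishing the same definition to a different catalog: only the resolved property changes, which helps keep the presented API layer aligned with the microservice deployments (challenge 7 above).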

How we structured our DevOps toolchain

Our toolchain consisted of the following technologies:

  • Flowdock for team communication and for build and deploy notifications

  • Rally for tracking all work items for the team

  • GitHub, with a separate repository for every microservice

  • Jenkins, where commits to a repository kicked off a build that created both microservice artifacts (WAR files, Bluemix manifests) and API Connect configuration artifacts (YAML files)

  • UrbanCode Deploy (UCD) for deploying microservices and APIs to Bluemix, which provided the runtime environment

  • Runscope for health monitoring of the APIs, microservices, and some downstream components

  • API Connect analytics for valuable insight into the stability and performance of the microservices, which highlighted opportunities for improvements around response time and latency
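To make the build artifacts a little more concrete, here is a minimal sketch of what a Bluemix (Cloud Foundry) manifest for one of the WAR-based microservices could look like. The application name, buildpack, sizing, and service bindings are illustrative assumptions rather than our actual files.

```yaml
# Sketch of a Bluemix (Cloud Foundry) manifest for a WAR-based microservice.
# Names, sizing, buildpack, and service bindings are illustrative assumptions.
applications:
  - name: orders-service-qa           # one deployment per environment/space
    path: target/orders-service.war   # WAR produced by the Jenkins build
    buildpack: liberty-for-java       # Java buildpack on Bluemix
    memory: 512M
    instances: 2
    services:
      - orders-cloudant-qa            # environment-specific data service binding
    env:
      LOG_LEVEL: info
```

In a setup like this, the same WAR can be promoted from space to space, with only the manifest values and bound services changing per environment.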

In future articles, we will focus on the details of the Code, Deliver, and Run steps in our toolchain and show how they addressed the challenges listed.

This is a complex system. To help you better understand it, we’ve split up what we’ve learned. Stay tuned for the next in a series of articles covering the following topics:

  • Versioning and Swagger for documentation and code skeleton generation

  • Deep dive into API DevOps

  • Deep dive into microservice DevOps

Learn more about microservices on Bluemix.

Sign up for Bluemix. It’s free!
