Event-Driven Architecture

Learn how event-driven architecture works and how it contributes to improved flexibility, reliability, capability, and performance of cloud native applications.

What is event-driven architecture?

Event-driven architecture is an integration model built around the publication, capture, processing, and storage (or persistence) of events. Specifically, when an application or service performs an action or undergoes a change that another application or service might want to know about, it publishes an event—a record of that action or change—that another application or service can consume and process to perform one or more actions in turn.

Event-driven architecture enables a loose coupling between connected applications and services—they can communicate with each other by publishing and consuming events without knowing anything about each other except the event format. This model offers significant advantages over a request/response architecture (or integration model), in which one application or service must request specific information from another specific application or service that is expecting the specific request.

Event-driven architecture maximizes the potential of cloud native applications and enables powerful application technologies, such as real-time analytics and decision support.

How does event-driven architecture work?

In an event-driven architecture, applications act as event producers or event consumers (and often as both).

An event producer transmits an event—in the form of a message—to a broker or some other form of event router, where the event’s chronological order is maintained relative to other events. An event consumer ingests the message—in real time (as it occurs) or at any other time it chooses—and processes the message to trigger another action, workflow, or event of its own.

In a simple example, a banking service might transmit a ‘deposit’ event, which another service at the bank would consume and respond to by writing a deposit to the customer’s statement. But event-driven integrations can also trigger real-time responses based on complex analysis of huge volumes of data, such as when the ‘event’ of a customer clicking a product on an e-commerce site generates instant product recommendations based on other customers’ purchases.
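
The deposit flow above can be sketched in a few lines of Python. This is an illustrative, in-process sketch only, not IBM-specific code: a standard-library queue stands in for the event router, and the event fields (`type`, `account`, `amount`) and function names are hypothetical.

```python
import queue

# A FIFO queue stands in for the broker/event router; it preserves
# the chronological order of published events.
event_router = queue.Queue()

def publish_deposit(account_id, amount):
    """Producer: the banking service records an action as an event."""
    event_router.put({"type": "deposit", "account": account_id, "amount": amount})

# Consumer: the statement service knows only the event format,
# not which producer published it.
statements = {}

def process_next_event():
    event = event_router.get()
    if event["type"] == "deposit":
        statements.setdefault(event["account"], []).append(event["amount"])
    return event

publish_deposit("acct-42", 100.0)
publish_deposit("acct-42", 25.0)
process_next_event()
process_next_event()
# statements["acct-42"] now holds both deposits, in publication order
```

Note that the producer and consumer never call each other directly; they agree only on the shape of the event record, which is the loose coupling described above.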

Event-driven architecture messaging models

There are two basic models for transmitting events in an event-driven architecture.

Event messaging or publish/subscribe

In the event messaging or publish/subscribe model, event consumers subscribe to a class or classes of messages published by event producers. When an event producer publishes an event, the message is sent directly to all subscribers who want to consume it.

Typically, a broker handles the transmission of event messages between publishers and subscribers. The broker receives each event message, translates it if necessary, maintains its order relative to other messages, makes it available to subscribers for consumption, and then deletes it once it has been consumed (so that it cannot be consumed again).
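
A minimal in-memory sketch of this model might look like the following. It is a simplification, not a real broker: the topic names and callbacks are hypothetical, and there is no translation, durability, or networking—just the fan-out to subscribers and the delete-once-consumed behavior described above.

```python
from collections import defaultdict, deque

class Broker:
    """Toy publish/subscribe broker: fans each event out to all
    subscribers of its topic, then drops the event once delivered."""
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks
        self.pending = defaultdict(deque)      # topic -> ordered pending events

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        self.pending[topic].append(event)      # order is maintained per topic
        self._deliver(topic)

    def _deliver(self, topic):
        while self.pending[topic]:
            event = self.pending[topic].popleft()  # deleted once consumed
            for callback in self.subscribers[topic]:
                callback(event)

received = []
broker = Broker()
broker.subscribe("orders", lambda e: received.append(("audit", e)))
broker.subscribe("orders", lambda e: received.append(("billing", e)))
broker.publish("orders", {"id": 1})
# both subscribers receive the event; the broker no longer holds it
```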

Event streaming

In the event streaming model, event producers publish streams of events to a broker. Event consumers subscribe to the streams, but instead of receiving and consuming every event as it is published, consumers can step into each stream at any point and consume only the events they want to consume. The key difference here is that the events are retained by the broker even after the consumers have received them.
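
The retention difference can be sketched as follows. This is a toy, Kafka-style illustration under simplifying assumptions (a single in-memory log, integer offsets, no partitions): events stay in the log after being read, and each consumer chooses where to step into the stream.

```python
class EventStream:
    """Sketch of a retained event log: events persist after consumption,
    and each consumer tracks its own read position (offset)."""
    def __init__(self):
        self.log = []  # events are appended and retained, never deleted here

    def append(self, event):
        self.log.append(event)

    def read_from(self, offset):
        """A consumer can step into the stream at any point."""
        return self.log[offset:]

stream = EventStream()
for n in range(5):
    stream.append({"seq": n})

late_consumer = stream.read_from(3)   # joins late, sees only events 3 and 4
replayer = stream.read_from(0)        # replays the full retained history
```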

A data streaming platform, such as Apache Kafka, manages the logging and transmission of tremendous volumes of events at very high throughput (trillions of event records per day, in real time, without performance lag). A streaming platform offers certain characteristics a message broker does not:

  • Event persistence: Because consumers may consume events at any time after they are published, event streaming records are persistent—they are maintained for a configurable amount of time, anywhere from fractions of a second to forever. This enables event stream applications to process historical data, as well as real-time data.
  • Complex event processing: Like event messaging, event streaming can be used for simple event processing, in which each published event triggers transmission and processing by one or more specific consumers. But, it can also be used for complex event processing, in which event consumers process entire series of events and perform actions based on the result.
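
The contrast between simple and complex event processing can be sketched with two hypothetical consumers: one reacts to each event individually, while the other acts on the result of an entire series (here, an invented fraud check over a window of withdrawal events; the field names and threshold are assumptions for illustration).

```python
def simple_consumer(event):
    # Simple event processing: each published event triggers one action.
    return f"logged {event['type']} of {event['amount']}"

def complex_consumer(events, threshold=500):
    # Complex event processing: aggregate a whole series of events
    # and act on the combined result.
    total = sum(e["amount"] for e in events if e["type"] == "withdrawal")
    return "flag for review" if total > threshold else "ok"

window = [
    {"type": "withdrawal", "amount": 200},
    {"type": "deposit", "amount": 50},
    {"type": "withdrawal", "amount": 400},
]
verdict = complex_consumer(window)  # 600 withdrawn exceeds the 500 threshold
```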

Benefits of event-driven architecture

Compared to the request/response application architecture, event-driven architecture offers several advantages and opportunities for developers and organizations:

  • Powerful real-time response and analytics: Event streaming enables applications that respond to changing business situations as they happen and make predictions and decisions based on all available current and historical data in real time. This has benefits in any number of areas—from processing streams of data generated by myriad IoT devices, to predicting and squashing security threats on the fly, to automating supply chains for optimal efficiency.
  • Fault tolerance, scalability, simplified maintenance, versatility, and other benefits of loose coupling: Applications and components in an event-driven architecture aren’t dependent on each other’s availability; they can be independently updated, tested, and deployed without interruption of service, and when one component goes down, a backup can be brought online. Event persistence enables ‘replaying’ of past events, which can help recover data or functionality after an event consumer outage. Components can be scaled easily and independently of each other across the network, and developers can revise or enrich applications and systems by adding and removing event producers and consumers.
  • Asynchronous messaging: Event-driven architecture enables components to communicate asynchronously—producers publish event messages, on their own schedule, without waiting for consumers to receive them (or even knowing if consumers received them). In addition to simplifying integration, this improves the application experience for users. A user completing a task in one component can move on to the next task without waiting, regardless of any downstream integrations between that component and others in the system.
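
The asynchronous behavior in the last point can be sketched with a standard-library queue and a worker thread. This is a minimal illustration, not a production pattern: the producer publishes and moves on immediately, never waiting for the (deliberately slow) consumer.

```python
import queue
import threading
import time

events = queue.Queue()
processed = []

def consumer():
    while True:
        event = events.get()
        if event is None:          # sentinel: shut the worker down
            break
        time.sleep(0.01)           # simulate slow downstream work
        processed.append(event)
        events.task_done()

worker = threading.Thread(target=consumer)
worker.start()

# The producer publishes on its own schedule and returns immediately;
# it never waits for (or learns about) the consumer's progress.
for n in range(3):
    events.put({"seq": n})
# ... the producer is free to do other work here ...

events.join()        # only this demo waits, so we can observe the result
events.put(None)
worker.join()
```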


Event-driven architecture and microservices

In microservices—a foundational cloud native application architecture—applications are assembled from loosely coupled, independently deployable services. The main benefits of microservices are essentially the benefits of loose coupling—ease of maintenance, flexibility of deployment, independent scalability, and fault tolerance.

Not surprisingly, event-driven architecture is widely considered best practice for microservices implementations. Microservices can communicate with each other using REST APIs. But REST, a request/response integration model, undermines many of the benefits of the loosely coupled microservices architecture by forcing a synchronous, tightly coupled integration between the microservices.

Event-driven architecture and IBM

Deploying an event-driven architecture is crucial for organizations looking to automate and optimize their application workflows and supporting business processes. It can also be an important part of successful application modernization as the demand for better customer experiences and more applications impacts business and IT operations.

When it comes to meeting such demands, a move toward greater automation helps. Ideally, such a move would start with small, measurably successful projects, which you can then scale and optimize for other processes and in other parts of your organization.

Working with IBM, you’ll have access to AI-powered automation capabilities, including prebuilt workflows, to help accelerate innovation by making every process more intelligent.

Take the next step:

  • Learn about IBM Cloud Pak for Data, a fully integrated, multicloud data and AI platform that includes IBM® Db2® Event Store, a memory-optimized, AI-enabled database designed to rapidly ingest and analyze streamed data for event-driven applications.
  • Check out IBM Streams, another streaming analytics tool that can efficiently process and analyze huge volumes and varieties of data streams from diverse sources with low latency. It’s available on premises or on IBM Cloud, and is included with IBM Cloud Pak for Data.
  • Take our integration maturity assessment to evaluate your integration maturity level across critical dimensions and discover the actions you can take to get to the next level.
  • Download our agile integration guide, which explores the merits of a container-based, decentralized, microservices-aligned approach for integrating solutions. 
  • Read about the five “must-haves” for automation success (link resides outside ibm.com) in this HFS Research report.

Get started with an IBM Cloud account today.

Related solutions

AI-Powered Automation

From your business workflows to your IT operations, we’ve got you covered with AI-powered automation. Discover how leading companies are transforming.

Modernize applications for interoperability

Build, modernize and manage applications securely across any cloud with confidence.

IBM Cloud Pak for Integration

Better application speed and quality — that's the brilliance of AI-powered integration.

IBM Cloud Pak for Data

IBM Cloud Pak for Data is an open, extensible data platform that provides a data fabric to make all data available for AI and analytics, on any cloud.

IBM Streams

IBM Streams is an advanced analytic platform used to ingest, analyze and correlate information from different data sources in real time.