Learn how event-driven architecture works and how it contributes to improved flexibility, reliability, capability, and performance of cloud native applications.
What is event-driven architecture?
Event-driven architecture is an integration model built around the publication, capture, processing, and storage (or persistence) of events. Specifically, when an application or service performs an action or undergoes a change that another application or service might want to know about, it publishes an event—a record of that action or change—that another application or service can consume and process to perform one or more actions in turn.
Event-driven architecture enables a loose coupling between connected applications and services—they can communicate with each other by publishing and consuming events without knowing anything about each other except the event format. This model offers significant advantages over a request/response architecture (or integration model), in which one application or service must request specific information from another specific application or service that is expecting the specific request.
Event-driven architecture maximizes the potential of cloud native applications and enables powerful application technologies, such as real-time analytics and decision support.
How does event-driven architecture work?
In an event-driven architecture, applications act as event producers or event consumers (and often as both).
An event producer transmits an event—in the form of a message—to a broker or some other form of event router, where the event’s chronological order is maintained relative to other events. An event consumer ingests the message—in real time (as it occurs) or at any other time it wants—and processes the message to trigger another action, workflow, or event of its own.
In a simple example, a banking service might transmit a ‘deposit’ event, which another service at the bank would consume and respond to by writing a deposit to the customer’s statement. But event-driven integrations can also trigger real-time responses based on complex analysis of huge volumes of data, such as when the ‘event’ of a customer clicking a product on an e-commerce site generates instant product recommendations based on other customers’ purchases.
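The banking example above can be sketched as a minimal in-memory broker. This is an illustrative sketch, not a production event router; the `Broker` class, event shape, and handler names are all assumptions made for the example.

```python
from collections import defaultdict

class Broker:
    """Minimal in-memory event router: keeps events in arrival order
    and forwards each one to every registered consumer."""
    def __init__(self):
        self.subscribers = defaultdict(list)
        self.log = []  # chronological order is preserved here

    def subscribe(self, event_type, handler):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        event = {"type": event_type, "payload": payload}
        self.log.append(event)
        for handler in self.subscribers[event_type]:
            handler(event)

# Consumer: a statement service reacts to 'deposit' events.
statement = []
broker = Broker()
broker.subscribe("deposit", lambda e: statement.append(
    f"Deposit of ${e['payload']['amount']} to account {e['payload']['account']}"))

# Producer: the banking service publishes the event and moves on.
broker.publish("deposit", {"account": "12345", "amount": 250})
print(statement[0])  # Deposit of $250 to account 12345
```

Note that the producer never calls the statement service directly; the only thing the two share is the event format.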
Event-driven architecture messaging models
There are two basic models for transmitting events in an event-driven architecture.
Event messaging or publish/subscribe
In the event messaging or publish/subscribe model, event consumers subscribe to a class or classes of messages published by event producers. When an event producer publishes an event, the message is sent directly to all subscribers who want to consume it.
Typically, a broker handles the transmission of event messages between publishers and subscribers. The broker receives each event message, translates it if necessary, maintains its order relative to other messages, makes it available to subscribers for consumption, and then deletes it once it has been consumed (so that it cannot be consumed again).
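The delete-on-consume behavior that distinguishes this model can be illustrated with Python's standard `queue.Queue`, where retrieving a message also removes it from the broker. The topic name and event fields here are hypothetical.

```python
import queue

# One queue per topic; get() both delivers and removes a message,
# mirroring a broker that deletes events once they are consumed.
topic = queue.Queue()

topic.put({"type": "order_placed", "order_id": 42})

event = topic.get()       # subscriber consumes the event...
print(event["order_id"])  # 42
print(topic.empty())      # True: the broker no longer holds it
```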
Event streaming
In the event streaming model, event producers publish streams of events to a broker. Event consumers subscribe to the streams, but instead of receiving and consuming every event as it is published, consumers can step into each stream at any point and consume only the events they want to consume. The key difference here is that the events are retained by the broker even after the consumers have received them.
A data streaming platform, such as Apache Kafka, manages the logging and transmission of tremendous volumes of events at very high throughput—on the order of trillions of event records per day, in real time, without performance lag. A streaming platform offers certain characteristics a message broker does not:
- Event persistence: Because consumers may consume events at any time after they are published, event streaming records are persistent—they are maintained for a configurable amount of time, anywhere from fractions of a second to forever. This enables event stream applications to process historical data, as well as real-time data.
- Complex event processing: Like event messaging, event streaming can be used for simple event processing, in which each published event triggers transmission and processing by one or more specific consumers. But, it can also be used for complex event processing, in which event consumers process entire series of events and perform actions based on the result.
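Both characteristics above can be sketched with an append-only log that consumers read by offset. This is a toy model, not the Apache Kafka API; the `EventStream` class, offsets, and the 400-unit threshold are assumptions made for illustration.

```python
class EventStream:
    """Append-only log: events are retained after delivery, so each
    consumer tracks its own offset and may replay from any point."""
    def __init__(self):
        self.log = []

    def append(self, event):
        self.log.append(event)

    def read(self, offset):
        """Return events from `offset` onward; nothing is deleted."""
        return self.log[offset:]

stream = EventStream()
for amount in [100, 250, 75]:
    stream.append({"type": "deposit", "amount": amount})

# Event persistence: a consumer that joins late can still replay
# from offset 0, while another steps in mid-stream at offset 2.
assert len(stream.read(0)) == 3
assert stream.read(2) == [{"type": "deposit", "amount": 75}]

# Complex event processing: act on an entire series of events,
# e.g. flag when total deposits in the stream exceed a threshold.
total = sum(e["amount"] for e in stream.read(0))
print("flag" if total > 400 else "ok")  # total is 425 -> "flag"
```

In a real streaming platform, retention is bounded by a configurable policy rather than held in memory forever, but the offset-based read model is the same idea.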
Benefits of event-driven architecture
Compared to the request/response application architecture, event-driven architecture offers several advantages and opportunities for developers and organizations:
- Powerful real-time response and analytics: Event streaming enables applications that respond to changing business situations as they happen and make predictions and decisions based on all available current and historical data in real time. This has benefits in any number of areas—from processing streams of data generated by myriad IoT devices, to predicting and squashing security threats on the fly, to automating supply chains for optimal efficiency.
- Fault tolerance, scalability, simplified maintenance, versatility, and other benefits of loose coupling: Applications and components in an event-driven architecture aren’t dependent on each other’s availability; they can be independently updated, tested, and deployed without interruption of service, and when one component goes down a backup can be brought online. Event persistence enables ‘replaying’ of past events, which can help recover data or functionality after an event consumer outage. Components can be scaled easily and independently of each other across the network, and developers can revise or enrich applications and systems by adding and removing event producers and consumers.
- Asynchronous messaging: Event-driven architecture enables components to communicate asynchronously—producers publish event messages, on their own schedule, without waiting for consumers to receive them (or even knowing if consumers received them). In addition to simplifying integration, this improves the application experience for users. A user completing a task in one component can move on to the next task without waiting, regardless of any downstream integrations between that component and others in the system.
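The asynchronous point above can be sketched with a producer thread handing events to a consumer through a queue: the producer publishes and immediately moves on, never blocking on the consumer. The event fields and the `None` shutdown signal are conventions assumed for this demo.

```python
import queue
import threading

events = queue.Queue()
processed = []

def consumer():
    # The consumer processes events on its own schedule.
    while True:
        event = events.get()
        if event is None:  # shutdown signal for this demo
            break
        processed.append(event)
        events.task_done()

t = threading.Thread(target=consumer)
t.start()

# Producer publishes and moves on immediately -- it never waits
# for the consumer to receive or acknowledge the event.
events.put({"type": "task_completed", "task": "upload"})
# ... the producing component is free to do other work here ...

events.put(None)  # stop the consumer so the demo can exit
t.join()
print(processed)  # [{'type': 'task_completed', 'task': 'upload'}]
```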
Event-driven architecture and microservices
In microservices—a foundational cloud native application architecture—applications are assembled from loosely coupled, independently deployable services. The main benefits of microservices are essentially the benefits of loose coupling—ease of maintenance, flexibility of deployment, independent scalability, and fault tolerance.
Not surprisingly, event-driven architecture is widely considered best practice for microservices implementations. Microservices can communicate with each other using REST APIs. But REST, a request/response integration model, undermines many of the benefits of the loosely coupled microservices architecture by forcing a synchronous, tightly coupled integration between the microservices.
Event-driven architecture and IBM
Deploying an event-driven architecture is crucial for organizations looking to automate and optimize their application workflows and supporting business processes. However, knowing where to start can sometimes be a challenge.
IBM Event Streams is an event-streaming platform, built on open-source Apache Kafka, that is designed to simplify the automation of mission-critical workloads. Using IBM Event Streams, organizations can quickly deploy enterprise-grade event-streaming technology. There are multiple deployment options to choose from, whether as certified container software running on the OpenShift Container Platform or a fully managed Apache Kafka-as-a-service solution.
IBM Db2 Event Store is a memory-optimized, AI-enabled database designed to rapidly ingest and analyze streamed data for event-driven applications. With IBM Watson Studio built in, you can build machine learning into your analytics, enabling you to act more quickly on data from streaming sources. Available in both a developer and enterprise edition, Db2 Event Store is also included as a data source service in IBM Cloud Pak for Data, a fully integrated, multicloud data and AI platform.
Another streaming analytics tool is IBM Streams, which can efficiently process and analyze huge volumes and varieties of data streams from diverse sources with low latency. It is available on premises and on IBM Cloud, and it is included with IBM Cloud Pak for Data.
Sign up for an IBMid and create your IBM Cloud account.