Enhanced Access Control on Apache Kafka Using Event Streams for IBM Cloud

IBM Event Streams on IBM Cloud is now Identity and Access Management enabled

But what does that mean?

Identity and Access Management (IAM) is the mechanism IBM Cloud uses to control access by Users or Applications to Resources on the cloud. The Enterprise plan of IBM Event Streams on IBM Cloud now takes advantage of IAM integration to provide flexible and fine-grained access control to your Kafka resources.

Identity and Access Management (IAM)

Let’s first do a quick overview of IAM concepts.

IAM boils down to three layers:

  1. Users and Applications: Subjects that require access to Resources.
  2. Resources: The things that Subjects want to access.
  3. Access Policies: Sometimes referred to as Roles, these are the rules that define how much access the Subjects have to those Resources.

Let’s look at each in turn.

Users and Applications

IAM uses a User ID to represent a user. For an Application, IAM uses a concept called a Service ID to allow Applications to connect to Resources.

One of the cool things about a Service ID is that it allows an Application outside of IBM Cloud to access your IBM Cloud services.

The other thing to note is the concept of an Access Group. An Access Group is a way to group User IDs and Service IDs together and grant them group-wide access rights. For example, you might want all users or Service IDs in the HR department to be in one HR Access Group.
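As a concrete sketch, the IAM Access Groups API adds members to a group with a single request (`PUT /v2/groups/{access_group_id}/members`). The IDs below are placeholders, not real accounts:

```python
def add_members_body(user_iam_ids, service_iam_ids):
    """Build the JSON body that adds Users and Service IDs to an Access Group."""
    members = [{"iam_id": uid, "type": "user"} for uid in user_iam_ids]
    members += [{"iam_id": sid, "type": "service"} for sid in service_iam_ids]
    return {"members": members}

# One HR User and one Service ID for an HR application, grouped together
body = add_members_body(
    user_iam_ids=["IBMid-1234ABCD"],
    service_iam_ids=["iam-ServiceId-5678EFGH"],
)
```

Once both are in the HR Access Group, a single Access Policy on the group covers the person and the application alike.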


Resources

Resources are the artifacts that a User or Application wants to access. So, on IBM Cloud, a Resource might be an Instance of a service that your organisation has created. For example, an Instance of Event Streams.

Additionally, a Resource might be a more granular entity within a service, like a Topic in Event Streams. These are referred to as Service Resource Types.

A Resource Group is a way to group a set of Resources together. For example, your HR department might have a group of Service Instances they have created for their own use. These can all be placed into an HR Resource Group. This way you grant access to a group of Resources rather than each individual Resource.

Currently, only Service Instances can be put into a Resource Group.

Access Policies

Once you have defined the above, you can then apply Access Policies, which dictate how much access a User or Application has to a Resource.

Access Policies (also referred to as Roles) can seem a bit intimidating at first.

One important concept that simplifies things is that there are two levels of Access Policies/Roles:

  1. Platform Management Roles: An account owner assigns these Roles to Users to allow them the ability to create platform-wide Resources—for example, the ability to create an Event Streams Instance or the ability to create and add Users to Access Groups.
  2. Service Management Roles: Once a Service Instance is created, these roles define the Access rights for Resources within a service.
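The two levels show up directly in the IAM Policy Management API (`POST /v1/policies`), where the role CRN distinguishes a platform role from a service role. A minimal sketch, assuming placeholder subject and instance IDs (Event Streams appears in IAM under the service name `messagehub`):

```python
# Platform roles and service roles are distinguished by their role CRNs
PLATFORM_EDITOR = "crn:v1:bluemix:public:iam::::role:Editor"
SERVICE_WRITER = "crn:v1:bluemix:public:iam::::serviceRole:Writer"

def policy(subject_iam_id, role_crn, resource_attributes):
    """Build an IAM access-policy body for one subject, one role, one resource."""
    return {
        "type": "access",
        "subjects": [{"attributes": [{"name": "iam_id", "value": subject_iam_id}]}],
        "roles": [{"role_id": role_crn}],
        "resources": [{"attributes": [
            {"name": k, "value": v} for k, v in resource_attributes.items()
        ]}],
    }

# Platform Management: allow creating resources account-wide
platform_policy = policy("IBMid-TIM", PLATFORM_EDITOR,
                         {"accountId": "acme-account-id"})

# Service Management: Writer on one Event Streams Service Instance
service_policy = policy("IBMid-TIM", SERVICE_WRITER,
                        {"serviceName": "messagehub",
                         "serviceInstance": "instance-guid"})
```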

Let’s look at an actual example to illustrate using Event Streams.

Example 1: Delegating ownership of an Event Streams Instance

The account owner for ACME corporation (Brenda) has an account on IBM Cloud. The IT department wants to create a solution to monitor web click traffic on the sales site. They have decided to use Event Streams and a number of other services. The account owner delegates responsibility to the IT lead (Tim) by granting him the Platform Role of Editor. 

Tim now has the permission to create Service Instances. So, he creates a number of Service Instances, including an Event Streams Instance.

Now we are in the realm of the Service Management Role. Remember that these are the roles that are relevant to the more granular resources that a service provides. There are three Service Management roles:

  1. Reader: These Users can view Service Instance Resources but cannot create, edit, or delete them.
  2. Writer: These Users can do everything a Reader can, and can also create, edit, read, and write to Service Instance Resources.
  3. Manager: These Users can perform all actions on the Service Instance and its Resources.

By default, upon creation of the Event Streams instance, Tim is granted a Manager role for that Service Instance.

This Event Streams Service Instance and its underlying Resources can now be managed solely by Tim, without bothering Brenda.

Event Streams Service-level resources

OK, got it, but what are those more granular Resources (called Resource Types) that Event Streams provides?

More importantly, what can Tim do with them?

Event Streams is built on Apache Kafka, so you won’t be surprised to hear that the Resource Types that Event Streams offers map directly to some fundamental Kafka concepts:

  • Cluster: You can control which Applications and Users can connect to the Kafka cluster.
  • Topics: You can control the ability of Users and Applications to create, delete, read, and write to a topic.
  • Consumer groups: You can control an Application’s ability to join a consumer group.
  • Producer transactions: You can control the ability to use the transactional producer capability in Kafka (i.e., single, atomic writes across multiple partitions).
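In an Access Policy, these Kafka-level Resources are scoped with `resourceType` and `resource` attributes alongside the Service Instance. A hedged sketch, assuming a placeholder instance GUID and resource names (the `resourceType` strings follow the Event Streams documentation: `cluster`, `topic`, `group`, `txnid`):

```python
def kafka_resource(instance_guid, resource_type=None, resource=None):
    """Build the resource-attributes block of an IAM policy for Event Streams."""
    attrs = [
        {"name": "serviceName", "value": "messagehub"},  # Event Streams' IAM name
        {"name": "serviceInstance", "value": instance_guid},
    ]
    if resource_type:
        attrs.append({"name": "resourceType", "value": resource_type})
    if resource:
        attrs.append({"name": "resource", "value": resource})
    return {"attributes": attrs}

cluster = kafka_resource("guid")                      # the whole cluster
topic = kafka_resource("guid", "topic", "Sales1")     # one topic
group = kafka_resource("guid", "group", "sales-app")  # one consumer group
txn = kafka_resource("guid", "txnid", "sales-txn")    # one transactional ID
```

Omitting `resourceType` entirely, as in the `cluster` example, scopes the policy to the whole Service Instance.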

So, back to that question: What can you do with the resources?

Again, it is best to use some examples.

Example 2: Isolating Event Streams resources to different departments

Tim (remember he’s the IT lead) wants to allow different departments to use Event Streams. One way to do this would be to ask Brenda to create different Event Streams Instances for different departments and then assign access rights to each department to the different instances.

That is great and ensures a good level of isolation, but Tim has the idea that one day the departments might want to cross-collaborate. For that reason, he decides on a single Event Streams Instance.

As Brenda previously granted Tim the Editor platform role, Tim can create a Service Instance of Event Streams.

He has three departments asking to use Event Streams: HR, Sales, and Marketing.

Tim creates an Access Group for each department and adds that department’s User IDs to it. He also creates Service IDs for each department and adds them to the respective Access Groups so that Applications written by that department can use the Instance.

Tim now has a choice. He can delegate all Resource management of the Event Streams Instance, delegate some of it, or keep control of the Instance himself.

Let’s show what Tim needs to grant to the access groups for the departments for each of these options:

  • If he wants to delegate full control, Tim grants Manager role on the “Cluster” Resource to the department Access Groups.
  • If he wants to restrict delegation of control to just Topics, he grants Reader role on the “Cluster” Resource and Manager role on the Resource type “Topic” to the department Access Groups.
  • If he wants to keep control himself and not delegate, he grants only Reader role on the “Cluster” Resource to the department Access Groups.
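The three options above can be sketched as lists of (role, resourceType) grants applied to each department Access Group. The role and resource names come from the article; the tuple encoding is just for illustration:

```python
# Each option is the set of grants given to every department Access Group
FULL_DELEGATION = [("Manager", "cluster")]                      # departments manage everything
TOPIC_DELEGATION = [("Reader", "cluster"), ("Manager", "topic")]  # departments manage only Topics
CENTRAL_CONTROL = [("Reader", "cluster")]                       # Tim manages all Resources

def grants_for(option):
    """Apply the chosen option to all three department Access Groups."""
    return {group: option for group in ("HR", "Sales", "Marketing")}

plan = grants_for(CENTRAL_CONTROL)  # the option Tim chooses below
```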

Tim decides on the third option of keeping control himself, which means he will manage all Resources (such as Topic creation).

Tim then creates some Topics for each department:

  • HR1, HR2, HR3
  • Sales1, Sales2, Sales3
  • Mark1, Mark2, Mark3

The Reader role only allows a User to consume, so to let each department both produce to and consume from its own Topics, Tim assigns the following:

  • Writer role to the HR Access Group for the HR “Topic” Resources only
  • Writer role to the Sales Access Group for the Sales “Topic” Resources only
  • Writer role to the Marketing Access Group for the Mark “Topic” Resources only
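Spelled out, this is one Writer grant per (Access Group, Topic) pair, each scoped to that department’s own Topics. The Topic and Access Group names come from the article; the record encoding is for illustration only:

```python
TOPICS = {
    "HR": ["HR1", "HR2", "HR3"],
    "Sales": ["Sales1", "Sales2", "Sales3"],
    "Marketing": ["Mark1", "Mark2", "Mark3"],
}

def writer_grants(topics_by_group):
    """One Writer grant per (Access Group, Topic) pair."""
    return [
        {"access_group": group, "role": "Writer",
         "resourceType": "topic", "resource": topic}
        for group, topics in topics_by_group.items()
        for topic in topics
    ]

grants = writer_grants(TOPICS)  # nine grants in total, three per department
```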

Each department now has access only to its own Topics, effectively its own slice of an Event Streams Instance.

Example 3: Sharing topics between departments

Time moves on (note: Time moves on, not Tim; he is still there), and the department lead for Marketing approaches Tim. They want to be able to consume some information that is stored in the Sales Topics. After an agreement with the Sales department, Tim is able to grant the Reader role to the Marketing Access Group for the Sales Topic “Sales1.”

This allows the Marketing group to consume events from that Topic but not produce messages to it.

Additionally, the Sales team wants to feed sales information into a Marketing Topic. So, Tim grants Writer role to the Sales Access Group for the Marketing Topic “Mark2.”

This allows sales to produce events to the Marketing Topic for consumption by the marketing team.
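On the client side, a Marketing application would then connect with its Service ID’s API key. A hedged sketch of the Kafka client configuration (the bootstrap endpoint, API key, and consumer-group name are placeholders; Event Streams uses SASL PLAIN with the literal username `token` and the API key as the password):

```python
def consumer_config(bootstrap_servers, api_key, group_id):
    """Kafka consumer settings for Event Streams with a Service ID API key."""
    return {
        "bootstrap.servers": bootstrap_servers,
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "token",  # the literal string "token", not the key
        "sasl.password": api_key,  # the Service ID's API key
        "group.id": group_id,      # the Access Group also needs rights on this consumer group
    }

conf = consumer_config("broker-0.example.eventstreams.cloud.ibm.com:9093",
                       "MARKETING_SERVICE_ID_API_KEY",
                       "marketing-sales-feed")
# e.g. confluent_kafka.Consumer(conf).subscribe(["Sales1"])
```

Note that the Reader grant on “Sales1” alone is not enough to consume: as listed earlier, joining a consumer group is itself a controlled Resource.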

So, not only is isolation achievable, but cross-collaboration can also be enabled easily.


As you can see, the combination of Event Streams Resources and the rich IAM features of IBM Cloud allows for very fine-grained and flexible access control to your Event Streams Instances and Resources.

Learn more here.

Get started on Event Streams for IBM Cloud.

Event Streams for IBM Cloud Program Director
