July 27, 2022 By Henrik Loeser 3 min read

Use context-based restrictions in IBM Cloud to define and enforce access restrictions based on the network context from which an access request originates.

Last year, IBM Cloud introduced context-based restrictions (CBRs). Context-based restrictions provide the ability to define and enforce access restrictions based on the network location (“context”) of access requests. Contexts are defined by the type of endpoint, a network zone, or a combination of the two. A rule links a context to a resource. When the rule is enabled, only requests that match the context are allowed to proceed to other authorization checks. Because CBRs complement Identity and Access Management (IAM) policies, they are an important building block towards a zero trust architecture. Even if your credentials are leaked or your IAM policies are misconfigured, CBRs still restrict access and can scope allowed requests to your compute resources or corporate networks.

In this blog post, I discuss such a scenario: I wanted only my code deployed to IBM Cloud Code Engine to access the files in my IBM Cloud Object Storage (COS) bucket, blocking traffic from on-premises and other environments. I deployed the CBR network zone (the context) and the rule using Terraform. Below, I provide more details on the test scenario and show how I deployed the CBR access control. You can find the details and related code on GitHub in my repository context-based-restrictions.

Access to your cloud storage buckets

For my test of enforcing access control with context-based restrictions, I set up a new storage bucket cbrbucket in my S3-compatible instance of Cloud Object Storage (COS). After uploading a few files, I locally ran the Python script from the mentioned GitHub repository to first list all available buckets on that COS instance, then the items in the bucket cbrbucket. It succeeded and showed 10 buckets and 2 files. 
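The script itself lives in the repository mentioned above; the following is only a minimal sketch of the same idea, assuming the `ibm-cos-sdk` package (`ibm_boto3`) and placeholder environment variable names (`COS_APIKEY`, `COS_CRN`, `COS_ENDPOINT`) for the credentials:

```python
import os

def bucket_names(list_buckets_response):
    """Extract bucket names from an S3-style list_buckets() response."""
    return [b["Name"] for b in list_buckets_response.get("Buckets", [])]

def object_keys(list_objects_response):
    """Extract object keys from an S3-style list_objects_v2() response."""
    return [o["Key"] for o in list_objects_response.get("Contents", [])]

def cos_client(api_key, instance_crn, endpoint_url):
    """Create an S3-style COS client (requires the ibm-cos-sdk package)."""
    import ibm_boto3
    from ibm_botocore.client import Config
    return ibm_boto3.client(
        "s3",
        ibm_api_key_id=api_key,
        ibm_service_instance_id=instance_crn,
        config=Config(signature_version="oauth"),
        endpoint_url=endpoint_url,
    )

if __name__ == "__main__" and "COS_APIKEY" in os.environ:
    client = cos_client(os.environ["COS_APIKEY"],
                        os.environ["COS_CRN"],
                        os.environ["COS_ENDPOINT"])
    # First list all buckets on the instance, then the items in cbrbucket
    print("buckets:", bucket_names(client.list_buckets()))
    print("cbrbucket:", object_keys(client.list_objects_v2(Bucket="cbrbucket")))
```

Run locally with valid credentials, this produces the bucket and object listing described above; run from inside Code Engine later on, the same code exercises the CBR rule.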

Then, I containerized the Python script and turned it into a Code Engine job. Once ready and configured, I submitted a Code Engine jobrun. The following screenshot shows the entries from Log Analysis. The output is similar to that from running it locally:

Output from the Python script as a log in Log Analysis.

Network zones and rules for access restrictions

With the test scripting in place, the next step is to define the context-based restriction (CBR) network zone and rule for my scenario. Because I love automation, I created a Terraform script. It reads in the metadata for my COS instance and some account information, then sets up the CBR network zone followed by the rule. The network zone specifies all traffic originating from the Code Engine service, regardless of the endpoint type. The rule is defined as enabled and ties the previously created network zone to my COS instance and the bucket cbrbucket. Thus, only traffic coming from Code Engine is allowed to access the bucket.
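The actual Terraform script is in the repository; the sketch below only shows the shape of such a configuration, using the IBM provider's `ibm_cbr_zone` and `ibm_cbr_rule` resources. The zone name, the `var.*` variables, and the bucket-scoping attributes are my reconstruction — check the repository and the provider documentation for the exact arguments:

```hcl
# Zone: all traffic originating from the Code Engine service,
# expressed as a service reference rather than IP addresses
resource "ibm_cbr_zone" "code_engine" {
  name       = "code-engine-traffic"
  account_id = var.account_id            # placeholder variable
  addresses {
    type = "serviceRef"
    ref {
      account_id   = var.account_id
      service_name = "codeengine"
    }
  }
}

# Rule: tie the zone to the COS instance, scoped to the bucket cbrbucket
resource "ibm_cbr_rule" "cos_bucket" {
  description      = "Only Code Engine may access cbrbucket"
  enforcement_mode = "enabled"
  contexts {
    attributes {
      name  = "networkZoneId"
      value = ibm_cbr_zone.code_engine.id
    }
  }
  resources {
    attributes {
      name  = "accountId"
      value = var.account_id
    }
    attributes {
      name  = "serviceName"
      value = "cloud-object-storage"
    }
    attributes {
      name  = "serviceInstance"
      value = var.cos_instance_guid      # placeholder variable
    }
    attributes {
      name  = "resourceType"
      value = "bucket"
    }
    attributes {
      name  = "resource"
      value = "cbrbucket"
    }
  }
}
```

Using a service reference instead of IP addresses means the zone keeps working even as Code Engine's underlying addresses change.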

When creating CBR network zones and rules, it is important to know that changes are eventually consistent. If you run tests immediately after deploying CBR changes, the new configuration might not yet be in effect everywhere.

After I created the network zone and rule for Code Engine and the COS bucket and they became active, I submitted a new Code Engine jobrun. The result was similar to the screenshot above — access was still possible. However, running the Python script locally resulted in an access error like the one shown below:

With context-based restrictions in place, my on-prem access is denied.

Later, I added my own IP address to the network zone and re-ran my local test. This time, it succeeded as expected. Similarly, you could add your corporate network (gateway) or other trusted networks to network zones.
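Extending a zone with a trusted IP fits into the same Terraform setup. A hedged sketch, again assuming the IBM provider's `ibm_cbr_zone` resource — the address value is a documentation-range placeholder, and ranges would use an `ipRange` type:

```hcl
resource "ibm_cbr_zone" "trusted" {
  name       = "trusted-ips"
  account_id = var.account_id       # placeholder variable
  # A single trusted client IP; replace with your own public address
  addresses {
    type  = "ipAddress"
    value = "203.0.113.17"
  }
}
```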


Context-based restrictions add an extra layer of security checks to your cloud-based solution. By defining the right network zones and implementing access rules for your deployed services, context-based restrictions support your journey towards a zero trust architecture. To get started, I recommend the following resources:

If you have feedback, suggestions, or questions about this post, please reach out to me on Twitter (@data_henrik) or LinkedIn.
