Utilizing IAM capabilities for enhanced security.
In the past, I have blogged about how to access a Db2 database on IBM Cloud from your local machine. I cataloged the cloud host and its database, then connected to it using the command line processor (CLP) by passing a username and password.
Last year, I was delighted to read that Db2 on Cloud and Db2 Warehouse on Cloud now support authentication by either IBMid, API key, or access token—it is called IAM (Identity and Access Management) support. I recently had some time to actually use it, and these are the steps I took to connect from my local machine to IBM Db2 on Cloud using either API key or token.
To connect to a Db2 database, I needed a Db2 client. Because I work more and more with cloud environments and containers, I typically do not have a Db2 server or client installed locally. Thus, for testing, I installed the IBM Data Server Runtime Client and, separately, also used the official IBM Db2 Warehouse client container. As it turned out, having both was a good choice.
Once the clients were available, I checked the CLP syntax help for CONNECT, and both environments returned the same information:
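The relevant part of the syntax help can be retrieved as shown below. The output lines are an abridged sketch, not verbatim CLP output:

```shell
# display the CLP syntax help for CONNECT (output abridged)
db2 "? connect"
# ...
#   USER authorization-name [USING password]
#   APIKEY api-key
#   ACCESSTOKEN token
# ...
```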
CLP supports APIKEY and ACCESSTOKEN, although this is not yet reflected in the documentation for all Db2 product editions.
Cataloging the database
To make the remote database known to CLP, I first needed to catalog the remote host (TCPIP NODE), then the actual database:
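The node catalog step can be sketched as follows; `<hostname>` is a placeholder for the actual Db2 on Cloud host:

```shell
# make the remote host known under the node name Db2oCfra;
# port 50001 is the SSL port, and "security ssl" is mandatory for IAM
db2 catalog tcpip node Db2oCfra remote <hostname> server 50001 security ssl
```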
The above adds a node named Db2oCfra to the local node directory. The important part is to use port (SERVER) 50001 and to explicitly specify that it uses SSL. For security reasons, IAM support is restricted to SSL connections only. If port 50000 is used or if you forget to specify "security ssl", you are going to run into all kinds of error messages. Trust me on that...
The following command adds the remote database "bludb" with the alias "fradb" to the local database directory. Note that the authentication type is set to GSSPLUGIN to allow for login via IBMid:
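The catalog step for the database itself can be sketched like this:

```shell
# catalog the remote database bludb under the local alias fradb;
# GSSPLUGIN authentication enables IBMid, API key, and token logins
db2 catalog database bludb as fradb at node Db2oCfra authentication gssplugin
```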
During my attempts to make IAM-based authentication work, I ran into a couple of problems. The command ipclean helped me reset/refresh the communication infrastructure and avoid caching issues.
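A minimal sketch of that reset, run as the instance owner:

```shell
# end the CLP back-end process, then clean up the instance's IPC resources
db2 terminate
ipclean -a
```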
Connect to Db2 using an API key
With the above setup in place, I added my IBM Cloud userid to my Db2 instance. This can be done in the browser console (which I tested) or via the REST API. Next, I generated a new API key using the IBM Cloud CLI. Then, I ran the tests to connect to the database using the API key:
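Those two steps can be sketched as follows; the key name `db2test` and `<api-key-value>` are placeholders:

```shell
# create a new platform API key and note the key value in the output
ibmcloud iam api-key-create db2test

# connect to the cataloged database with the API key instead of user/password
db2 connect to fradb apikey <api-key-value>
```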
It succeeded in the client container, but failed in my local Db2 client environment. What was different? I found that SSL support was not set up. I needed to install GSKit (through my package manager or, alternatively, by downloading it), then configure SSL support for use with CLP:
- Create a database for the encryption key:
- Add the root certificate to the key database. The certificate is part of the Db2 client install:
- Configure the instance to use the key database and stash file for SSL-based client connections:
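The three steps above can be sketched as follows. The key database path, the password, the label, and the certificate file name are placeholders; the root certificate shipped with the client may differ by client version:

```shell
# 1. create a CMS key database plus a stash file holding its password
gsk8capicmd_64 -keydb -create -db ~/clientkeydb.kdb -pw <password> -stash

# 2. add the root CA certificate that ships with the Db2 client
gsk8capicmd_64 -cert -add -db ~/clientkeydb.kdb -pw <password> \
  -label rootCA -file <path-to-root-certificate>.arm

# 3. point the client instance at the key database and stash file
db2 update dbm cfg using SSL_CLNT_KEYDB ~/clientkeydb.kdb \
  SSL_CLNT_STASH ~/clientkeydb.sth
```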
With the above configured, I was able to successfully connect from my Db2 client environment.
Connect to Db2 using a bearer access token
An access token is a bearer token (actually a JSON Web Token, JWT) which can be generated based on an API key:
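The token can be requested from the IBM Cloud IAM token endpoint; `<api-key-value>` is a placeholder for the API key generated earlier:

```shell
# exchange the API key for an IAM access token (a JWT)
curl -X POST "https://iam.cloud.ibm.com/identity/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=urn:ibm:params:oauth:grant-type:apikey&apikey=<api-key-value>"
# the JSON response carries the token in the "access_token" field
```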
The returned token can then be used to authenticate against Db2:
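A sketch of the connect call, with `<access-token-value>` standing in for the JWT from the previous step:

```shell
# connect with the bearer token instead of user/password
db2 connect to fradb accesstoken <access-token-value>
```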
The token is valid for 60 minutes after it has been issued, so keep that in mind when testing. An established connection is not impacted by an expiring token, but trying to connect with an expired token fails with a security processing error (see the collection of error messages below).
Getting a proper (security) configuration takes the right order of baby steps. With the right instructions and some experience, it is accomplished quickly and easily. Once done, it is possible to connect to Db2 using either an IBM Cloud platform API key or a generated access token, replacing the traditional username and password. This can be used to share an automatically expiring token (for a test instance) or to hand out an API key that gets revoked at a later point. IAM support not only adds more authentication options, but also new items to the security toolbox. See "Secure Credential Brokering in Analytic Jobs on Cloud Data Sources" on how they can be used with the SQL Query service.
But wait, there is more...
A collection of error messages
For the sake of documentation, and to help others searching for these errors find this blog post, here are some error messages I ran into while setting up my environments:
- SQL30081N: A communication error has been detected. Communication protocol being used: "SSL". Communication API being used: "SOCKETS". Location where the error was detected: "". Communication function detecting the error: "sqlccSSLSocketSetup". Protocol specific error code(s): "414", "*", "*". SQLSTATE=08001
- SQL10013N: The specified library "GSKit Error: 100" could not be loaded. SQLSTATE=42724
- SQL30082N: Security processing failed with reason "25" ("CONNECTION DISALLOWED"). SQLSTATE=08001
- SQL30082N: Security processing failed with reason "24" ("USERNAME AND/OR PASSWORD INVALID"). SQLSTATE=08001
- SQL30082N: Security processing failed with reason "27" ("INVALID SERVER CREDENTIAL"). SQLSTATE=08001
- SQL30081N: A communication error has been detected. Communication protocol being used: "TCP/IP". Communication API being used: "SOCKETS". Location where the error was detected: "xxx.yyy.yyy.xxx". Communication function detecting the error: "recv". Protocol specific error code(s): "104", "*", "0". SQLSTATE=08001