IBM Cloud Databases for Elasticsearch coupled with Kibana

You might already have set up an IBM Cloud Databases for Elasticsearch deployment, and you now want to use Kibana to visualize your data or run queries through its interactive UI. All you need are your database credentials and Docker to get set up and start searching.

If you use Elasticsearch, you’re more than likely aware of how powerful the database is when coupled with Kibana, the open source tool that adds visualization capabilities to your Elasticsearch database. So, if you’re already running IBM Cloud Databases for Elasticsearch, you might be considering Kibana or wondering how to get started.

We’ve got you covered. In this article, we’ll show you how to connect your Databases for Elasticsearch deployment to Kibana using Docker, which takes just a couple of minutes to get set up.

Let’s get started.

Setting things up

First, make sure you have Docker installed. You’ll want it so that you can pull the Kibana container image to connect to Databases for Elasticsearch.

Next, grab the credentials for your Databases for Elasticsearch deployment. You can do this either from the IBM Cloud console or by using the IBM Cloud CLI. In this example, we’ll get the credentials using the IBM Cloud CLI. With the CLI installed, run the following in your terminal if you’re using a federated login account; otherwise, omit the --sso flag.

ibmcloud login --sso

The next step is to get your Databases for Elasticsearch credentials using the IBM Cloud CLI’s cloud-databases plugin. If you don’t have the plugin installed, we’ll show you how; otherwise, you can skip the installation step.

To install the cloud-databases plugin from the IBM Cloud CLI, run the following in your terminal:

ibmcloud plugin install cloud-databases

Once that’s finished installing, you can access any of the cloud-databases commands using ibmcloud cdb. In order to connect Kibana to your database, you’ll need the connection string, username and password, and CA certificate of your Databases for Elasticsearch deployment.

To get the connection string of your deployment, use the CLI and the cloud-databases plugin and run:

ibmcloud cdb cxn <elasticsearch deployment name>

So, if our database is called “Databases for Elasticsearch”, we would run:

ibmcloud cdb cxn "Databases for Elasticsearch"

Make sure to use quotes around the deployment name if your deployment name has spaces in it like in the example above. If you don’t have spaces in the deployment name, you don’t need to use quotes.

After running that command, you’ll get the connection strings for the deployment. These include the username admin, the redacted password, and the host and port of the Elasticsearch deployment.

When you set up your Databases for Elasticsearch deployment, you should have changed the admin password. If you haven’t, do that now so that you can log in as that user:

ibmcloud cdb user-password "Databases for Elasticsearch" admin <new password>

You’ll also need the CA certificate to access the database. You can get that using the following command:

ibmcloud cdb cacert <elasticsearch deployment name>

For example, using the same example database name above:

ibmcloud cdb cacert "Databases for Elasticsearch"

Running this command will give you output like the following (the certificate body is redacted):

-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----

The certificate has been redacted here, but you need to copy everything from -----BEGIN CERTIFICATE----- through -----END CERTIFICATE----- into a file and save it in a directory of your choice. You’ll need to reference this file’s path when we run Kibana from Docker.
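To avoid copy-and-paste mistakes, that extraction step can be scripted. Here’s a minimal sketch that assumes you’ve saved the command’s raw output to a file (hypothetically named cli-output.txt here); the certificate body below is a fake stand-in:

```shell
# Stand-in for the saved `ibmcloud cdb cacert` output (contents are fake).
cat > cli-output.txt <<'EOF'
-----BEGIN CERTIFICATE-----
MIIC...redacted...
-----END CERTIFICATE-----
EOF

# Keep only the PEM block, from the BEGIN marker through the END marker,
# and save it as a file named "cacert".
sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' \
  cli-output.txt > cacert
```

In practice, you could also pipe the ibmcloud command’s output straight into the sed filter instead of saving it to a file first.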

Setting up Kibana

Before running a Docker container that includes Kibana, you’ll need to create a configuration file that contains some basic Kibana settings. There are numerous settings for Kibana that you can peruse and add to your configuration file if you need them.

To set up the configuration file, create a YAML file called kibana.yml. Inside this file, you’ll need the following Kibana configuration settings:

elasticsearch.ssl.certificateAuthorities: "/usr/share/kibana/config/cacert"
elasticsearch.username: "admin"
elasticsearch.password: "mypassword"
elasticsearch.url: "https://<hostname>:<port>"
server.name: "kibana"
server.host: "0.0.0.0"

The first setting, elasticsearch.ssl.certificateAuthorities, is the location of your deployment’s CA certificate inside the container. You can change this to a location of your choice, but we’ve kept it in Kibana’s config directory. Remember, this is a location in the Docker container, not on your physical system.

The next three settings are your Databases for Elasticsearch username (elasticsearch.username), password (elasticsearch.password), and the HTTPS URL containing your deployment’s hostname and port (elasticsearch.url). Finally, we have server.name, which is a machine-readable name for the Kibana instance, and server.host, which is the host of the backend server and what you’ll connect to in your web browser.

Again, the settings above are just an example to get started. See the Kibana documentation for more configuration settings you can set for your use case.
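If you’d rather generate the file than hand-edit it, here’s a minimal shell sketch. The ES_HOST and ES_PORT placeholder values are assumptions you’d replace with the host and port returned by ibmcloud cdb cxn, and "mypassword" is the example password from above:

```shell
# Hypothetical values: substitute your own deployment's host, port, and
# admin password before running.
ES_HOST="<hostname>"
ES_PORT="<port>"
ES_PASSWORD="mypassword"

# Write the basic Kibana settings into kibana.yml.
cat > kibana.yml <<EOF
elasticsearch.ssl.certificateAuthorities: "/usr/share/kibana/config/cacert"
elasticsearch.username: "admin"
elasticsearch.password: "${ES_PASSWORD}"
elasticsearch.url: "https://${ES_HOST}:${ES_PORT}"
server.name: "kibana"
server.host: "0.0.0.0"
EOF
```

This produces the same kind of file you’d write by hand; the variable indirection just makes it easier to swap in per-deployment values.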

Running the Kibana Container

Now that the kibana.yml file is set up, we’ll show you how to use Docker to attach that file and your CA certificate to the Docker container while pulling the Kibana image from the Docker image repository. The Docker image for the Kibana version we’re using is kibana-oss version 6.5.4, which is the open source version of Kibana without X-Pack.

Below is the Docker command that you’ll run in your terminal to start up the Kibana container:

docker container run -it --name kibana \
-v /path/to/kibana.yml:/usr/share/kibana/config/kibana.yml \
-v /path/to/<ca cert file name>:/usr/share/kibana/config/<ca cert file name> \
-p 5601:5601 \
docker.elastic.co/kibana/kibana-oss:6.5.4

I haven’t detached the container because I want to illustrate what the container looks like while running, but you can use the -d flag if you don’t want to see Kibana’s output in your terminal.

You’ll notice in the Docker command above that we attach two volumes with the -v flag. These are mounted into the Kibana container under the path /usr/share/kibana/config/, which is the directory Kibana checks for configuration files. The first volume points to your kibana.yml file. The file must be named kibana.yml in the container because that’s the file name the Kibana server reads its settings from. The second volume refers to the path on your system of the CA certificate you saved earlier and copies it into the container’s /usr/share/kibana/config/ directory as well. This path is also specified in the kibana.yml file as elasticsearch.ssl.certificateAuthorities, and the path in the Docker command and the one in the kibana.yml file must be identical so that Kibana knows where your CA certificate is located. The port exposed from the container is 5601, and we’ll access Kibana on that port. Finally, the Kibana image we’ll pull is docker.elastic.co/kibana/kibana-oss:6.5.4, the open source version without X-Pack.
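As an alternative to the long docker run command, the same setup can be expressed as a Compose file. This is an untested sketch that assumes kibana.yml and a certificate file named cacert sit in the same directory as the Compose file; adjust the paths for your system:

```yaml
# docker-compose.yml (sketch): roughly equivalent to the docker run
# command above.
version: "2"
services:
  kibana:
    image: docker.elastic.co/kibana/kibana-oss:6.5.4
    container_name: kibana
    ports:
      - "5601:5601"    # Kibana's web UI port
    volumes:
      - ./kibana.yml:/usr/share/kibana/config/kibana.yml
      - ./cacert:/usr/share/kibana/config/cacert
```

With that file in place, running docker-compose up from the same directory would start the same container.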

After configuring the command, run it from your terminal; Docker will download the Kibana image and start Kibana. Once Kibana has connected to your Databases for Elasticsearch deployment and is running successfully, you will see output like the following in your terminal:

log   [01:19:31.839] [info][status][plugin:kibana@6.5.4] Status changed from uninitialized to green - Ready
log   [01:19:31.925] [info][status][plugin:elasticsearch@6.5.4] Status changed from uninitialized to yellow - Waiting for Elasticsearch
log   [01:19:32.120] [info][status][plugin:timelion@6.5.4] Status changed from uninitialized to green - Ready
log   [01:19:32.134] [info][status][plugin:console@6.5.4] Status changed from uninitialized to green - Ready
log   [01:19:32.147] [info][status][plugin:metrics@6.5.4] Status changed from uninitialized to green - Ready
log   [01:19:33.132] [info][status][plugin:elasticsearch@6.5.4] Status changed from yellow to green - Ready
log   [01:19:33.378] [info][listening] Server running at http://0.0.0.0:5601

At this point, you can go to http://localhost:5601 in your browser. Remember, we set server.host in the kibana.yml file to access Kibana and exposed port 5601 from the container. Once you go to the URL, a pop-up window will prompt you for a username and password. These are credentials that have access to your Databases for Elasticsearch deployment; they don’t have to be the same username and password you provided in the kibana.yml file.

From here, you can start using Kibana with Databases for Elasticsearch.


We started out by getting the credentials for your IBM Cloud Databases for Elasticsearch deployment, then set up a Kibana configuration file with the basic options you need to connect Kibana to your Elasticsearch database. From there, we ran Docker and watched Kibana go live in your browser. In the next article, I’ll take you through the steps to get Kibana set up on IBM Cloud Kubernetes Service.
