April 18, 2019 By Phil Alger 5 min read

IBM Cloud Databases for Elasticsearch coupled with Kibana

You might already have set up an IBM Cloud Databases for Elasticsearch deployment, and you now want to use Kibana to visualize your data or run queries using the interactive UI. We’ve got you covered. All you need are your database credentials and Docker to get set up and start searching.

If you use Elasticsearch, you’re more than likely aware of how powerful the database is when coupled with Kibana, the open source tool that adds visualization capabilities to your Elasticsearch database. So, if you’re already running IBM Cloud Databases for Elasticsearch, you might be wondering how to connect Kibana and get started.

In this article, we’ll show you how to connect your Databases for Elasticsearch deployment to Kibana using Docker; it takes just a couple of minutes to set up.

Let’s get started.

Setting things up

First, make sure you have Docker installed. You’ll need it to pull the Kibana container image that you’ll connect to Databases for Elasticsearch.

Next, grab the credentials for your Databases for Elasticsearch deployment. You can either do this from the IBM Cloud console or using the IBM Cloud CLI. In this example, we’ll get the credentials using the IBM Cloud CLI. With the CLI installed, run the following in your terminal if you’re using a federated login account. Otherwise, omit the --sso flag.

ibmcloud login --sso

The next step is to get your Databases for Elasticsearch credentials. This is done using the IBM Cloud CLI’s cloud-databases plugin. If you don’t have this plugin installed, we’ll show you how to install it; otherwise, you can skip the next step.

To install the cloud-databases plugin from the IBM Cloud CLI, run the following in your terminal:

ibmcloud plugin install cloud-databases

Once that’s finished installing, you can access any of the cloud-databases commands using ibmcloud cdb. In order to connect Kibana to your database, you’ll need the connection string, username and password, and CA certificate of your Databases for Elasticsearch deployment.

To get the connection string of your deployment, use the CLI and the cloud-databases plugin and run:

ibmcloud cdb cxn <elasticsearch deployment name>

So, if our database is called “Databases for Elasticsearch”, we would run:

ibmcloud cdb cxn "Databases for Elasticsearch"

Make sure to use quotes around the deployment name if it contains spaces, as in the example above. If the name has no spaces, the quotes aren’t needed.

After running that command, you’ll get the connection strings for the deployment. These include the username admin, a redacted password, and the host and port of the Elasticsearch deployment.

When you set up your Databases for Elasticsearch deployment, you should have changed the admin password. If you haven’t, do that now so that you can use that user:

ibmcloud cdb user-password "Databases for Elasticsearch" admin <new password>

You’ll also need the CA certificate to access the database. You can get that using the following command:

ibmcloud cdb cacert <elasticsearch deployment name>

For example, using the same database name as above:

ibmcloud cdb cacert "Databases for Elasticsearch"

Running this command prints your deployment’s CA certificate. The certificate output is not shown here, but you need to copy everything from -----BEGIN CERTIFICATE----- through -----END CERTIFICATE----- into a file and save it in a directory of your choice. You’ll need to reference this file when we run Kibana from Docker.
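As a sketch, you can save the certificate straight to a file by redirecting the command’s output. This assumes ibmcloud cdb cacert prints only the PEM block to stdout; the deployment name and the file name cacert here are just examples:

```shell
# Save the deployment's CA certificate to a file named "cacert"
# (assumes the command prints only the PEM block to stdout)
ibmcloud cdb cacert "Databases for Elasticsearch" > cacert

# A PEM certificate file starts and ends with these markers:
head -1 cacert
tail -1 cacert
```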

Setting up Kibana

Before running a Docker container that includes Kibana, you’ll need to create a configuration file that contains some basic Kibana settings. There are numerous settings for Kibana that you can peruse and add to your configuration file if you need them.

To set up the configuration file, create a YAML file called kibana.yml. Inside this file, you’ll need the following Kibana configuration settings:

elasticsearch.ssl.certificateAuthorities: "/usr/share/kibana/config/cacert"
elasticsearch.username: "admin"
elasticsearch.password: "mypassword"
elasticsearch.url: "https://xxxx.databases.appdomain.cloud:30694"
server.name: "kibana"
server.host: "0.0.0.0"

The first setting, elasticsearch.ssl.certificateAuthorities, is the location of your deployment’s CA certificate inside the container. You can change this to a location of your choice, but we’ve kept it in Kibana’s config directory. Remember, this is a path inside the Docker container, not on your local machine.

The next three settings are your Databases for Elasticsearch username (elasticsearch.username), password (elasticsearch.password), and connection URL (elasticsearch.url), which contains the host name and port. Finally, we have server.name, which is a human-readable name for the Kibana instance, and server.host, which is the host the Kibana backend server binds to and where you’ll connect to it in your web browser.

Again, the settings above are just an example to get started. See the Kibana documentation for more configuration settings you can set for your use case.

Running the Kibana Container

Now that the kibana.yml file is set up, we’ll show you how to use Docker to attach that file and your CA certificate to the Docker container while pulling the Kibana image from the Docker image repository. The Docker image for the Kibana version we’re using is kibana-oss version 6.5.4, which is the open source version of Kibana without X-Pack.

Below is the Docker command that you’ll run in your terminal to start up the Kibana container:

docker container run -it --name kibana \
-v /path/to/kibana.yml:/usr/share/kibana/config/kibana.yml \
-v /path/to/<ca cert file name>:/usr/share/kibana/config/<ca cert file name> \
-p 5601:5601 docker.elastic.co/kibana/kibana-oss:6.5.4

I haven’t detached the container because I want to illustrate what the container looks like while running, but you can add the -d flag if you don’t want to see Kibana’s output in your terminal.
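For example, a detached run of the same container might look like this (a sketch; the volume paths and the cacert file name are placeholders from the command above):

```shell
# Run the Kibana container in the background with -d
docker container run -d --name kibana \
  -v /path/to/kibana.yml:/usr/share/kibana/config/kibana.yml \
  -v /path/to/cacert:/usr/share/kibana/config/cacert \
  -p 5601:5601 docker.elastic.co/kibana/kibana-oss:6.5.4

# Follow Kibana's log output whenever you need it
docker logs -f kibana
```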

You’ll notice in the Docker command above that we attach two volumes with the -v flag. Both are mounted into the Kibana container under the path /usr/share/kibana/config/, the directory Kibana reads configuration files from.

The first volume points to your kibana.yml file. In the container, it must be named kibana.yml, because that’s the file name the Kibana server reads its settings from. The second volume points to the CA certificate you saved earlier on your system and copies it into the container’s /usr/share/kibana/config/ directory as well. This path is also specified in kibana.yml as elasticsearch.ssl.certificateAuthorities, and the path in the Docker command must match the one in kibana.yml exactly so that Kibana knows where your CA certificate is located.

The port exposed from the container is 5601, and we’ll access Kibana on that port. Finally, the Kibana image we’ll pull is the kibana-oss version without X-Pack: docker.elastic.co/kibana/kibana-oss:6.5.4.
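If you want to double-check the mounts once the container is up, one way (assuming the container is named kibana, as above) is to list the config directory inside the running container:

```shell
# Both kibana.yml and the CA certificate should appear in this listing
docker exec kibana ls /usr/share/kibana/config
```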

After configuring the command, run it from your terminal; Docker will download the Kibana image and start Kibana. Once Kibana has connected to your Databases for Elasticsearch deployment and is running successfully, you’ll see output like the following in your terminal:

log   [01:19:31.839] [info][status][plugin:kibana@6.5.4] Status changed from uninitialized to green - Ready
log   [01:19:31.925] [info][status][plugin:elasticsearch@6.5.4] Status changed from uninitialized to yellow - Waiting for Elasticsearch
log   [01:19:32.120] [info][status][plugin:timelion@6.5.4] Status changed from uninitialized to green - Ready
log   [01:19:32.134] [info][status][plugin:console@6.5.4] Status changed from uninitialized to green - Ready
log   [01:19:32.147] [info][status][plugin:metrics@6.5.4] Status changed from uninitialized to green - Ready
log   [01:19:33.132] [info][status][plugin:elasticsearch@6.5.4] Status changed from yellow to green - Ready
log   [01:19:33.378] [info][listening] Server running at

At this point, you can open Kibana in your browser. Recall that we exposed port 5601 from the container and set the host in the kibana.yml file. Once you go to the URL, a pop-up window will prompt you for a username and password. These are credentials that have access to your Databases for Elasticsearch deployment; they don’t have to be the same username and password you provided in the kibana.yml file.
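Before opening the browser, you can optionally check from the terminal that Kibana is responding. This is a sketch that assumes Kibana is listening on localhost port 5601 (the port we published) and that you substitute credentials with access to your deployment:

```shell
# Query Kibana's status endpoint; replace <username> and <password>
# with credentials that have access to your deployment
curl -s -u "<username>:<password>" http://localhost:5601/api/status
```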

From here, you can start using Kibana with Databases for Elasticsearch.


We started out by getting the credentials for your IBM Cloud Databases for Elasticsearch deployment, then set up a Kibana configuration file with the basic options you need to connect Kibana to your Elasticsearch database. From there, we ran Docker and watched Kibana go live in your browser. In the next article, I’ll take you through the steps to get Kibana set up on IBM Cloud Kubernetes Service.

