November 22, 2021 By Frederic Lavigne 2 min read

How to use the LogDNA provider for Terraform to define custom views.

IBM Log Analysis and IBM Cloud Activity Tracker are two critical components under the Observability category in IBM Cloud.

They can be used to collect, filter, search and tail log data, define alerts and design custom views to monitor application and system logs. Both are based on the same underlying technology.

As more services and applications send logs to IBM Log Analysis and IBM Cloud Activity Tracker, it quickly becomes important to isolate logs between teams and to design custom views that make it easier for users to search and find relevant log statements.

LogDNA provider for Terraform

Of course, you can use the IBM Cloud console to configure these views. That works well for one Log Analysis instance and a few views. As your project grows and more applications are deployed (or more teams work on the project), you may want to (or, I should say, you should) automate the configuration. Fortunately, IBM Log Analysis supports the LogDNA provider for Terraform for its configuration.

As an example, let’s say I want to create a view in Log Analysis that only shows the logs produced by IBM Cloud Kubernetes Service. With the LogDNA provider for Terraform, it’s pretty straightforward:

provider "logdna" {
  servicekey = var.logging_api_key
  url        = var.logging_api_url
}

resource "logdna_view" "containers" {
 name     = "Compute ~ Containers"
 query    = "host:containers-kubernetes"
}
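The query uses the same search syntax as the Log Analysis search bar; here, host:containers-kubernetes should match log lines whose source is the Kubernetes Service.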

In this Terraform snippet:

  • logging_api_key is the service key used to access the LogDNA API (this is different from the Ingestion key). It can be obtained from the Service Credentials of the Log Analysis instance.
  • logging_api_url is the API endpoint. It is specific to the region where the Log Analysis instance was created (e.g., https://api.us-south.logging.cloud.ibm.com). A sketch of the corresponding variable declarations follows this list.
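The snippet above assumes these two values are passed in as Terraform variables. A minimal declaration could look like the following sketch (the variable names match the snippet; the descriptions and the sensitive flag are my own additions):

# Values can be supplied through terraform.tfvars or -var flags
variable "logging_api_key" {
  description = "Service key for the LogDNA API (not the Ingestion key)"
  type        = string
  sensitive   = true
}

variable "logging_api_url" {
  description = "Regional API endpoint, e.g. https://api.us-south.logging.cloud.ibm.com"
  type        = string
}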

Sample set of custom views

But that’s just one view; what about other IBM Cloud services? To help with this, I compiled a set of simple views to populate a Log Analysis or Activity Tracker instance into a Terraform module published to the Terraform Registry. The module can be found here.

The following example shows how to use the module:

module "views" {
  source          = "we-work-in-the-cloud/logging-default-views/ibm"
  logging_api_key = ibm_resource_key.logdna_ingestion_key.credentials.service_key
  logging_api_url = "https://api.${var.region}.logging.cloud.ibm.com"
}

Once applied, the set of views defined by the module is created in your Log Analysis or Activity Tracker instance. The current set of views covers Compute, Databases, Security and VPC logs.

This module aims to be a starting point and an example of what is possible with the LogDNA provider. The provider has even more options for configuring your Log Analysis instances. You may want to clone the repo and customize the views, or simply take inspiration and start capturing your existing view configuration in Terraform. By doing so, it becomes easier to replicate a configuration from one instance to the next (across different regions or accounts, for example).
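As an illustration of those options, a view can do more than match a query. The following is a minimal sketch, assuming the provider’s optional levels argument (check the documentation for the provider version you use), that restricts a view to error and warning lines from the Kubernetes hosts:

# Hypothetical view limited to error and warning lines from Kubernetes
resource "logdna_view" "container_errors" {
  name   = "Compute ~ Container errors"
  query  = "host:containers-kubernetes"
  levels = ["error", "warn"]
}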

Feedback, questions and suggestions

If you have feedback, suggestions or questions about this post, please reach out to me on Twitter (@L2FProd).
