Cluster-level logging and IBM Log Analysis with LogDNA
The collection, management, and analysis of application and system logs through a centralized logging system is a key requirement when you design a cloud application. Whether you are an experienced Kubernetes user, new to Kubernetes, or looking for a first-class logging solution for the IBM Cloud Kubernetes Service platform, you’re going to be excited about the newly announced IBM Cloud service: IBM Log Analysis with LogDNA. You can use this service to configure cluster-level logging for the Kubernetes platform, and you also benefit from advanced features such as managing and monitoring logs in the cloud. Use this service to simplify and enhance the work of developers and DevOps users in your organization.
Why adopt cluster-level logging?
From the moment you provision a Kubernetes cluster in the IBM Cloud, you want to know what is happening inside the cluster. You’ll need to access logs to troubleshoot problems and pre-empt issues, and you’ll want access to all the different types of logs, such as worker logs, pod logs, application logs, and network logs. In addition, you’ll want to monitor the different sources of log data in your Kubernetes cluster. Therefore, your ability to manage and access log records from any of these sources is critical. Your success with managing and monitoring logs depends on how you configure the logging capabilities for your Kubernetes platform.
For example, if you decide to output log records of a containerized application to stdout and stderr, you could access log data using the kubectl command. However, would you be able to access log data if the container crashes?
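For example, with the stdout and stderr approach, log access looks like the following sketch (the pod name my-app is an illustrative placeholder, not something from the service):

```shell
# Read the logs a container wrote to stdout/stderr
# (the pod name "my-app" is a placeholder).
kubectl logs my-app

# After a crash, the pod's current logs belong to the restarted container;
# --previous fetches the output of the crashed instance instead. Even this
# only works while the pod object still exists on the node.
kubectl logs my-app --previous
```

This illustrates the limitation: once the pod is deleted or evicted, both commands have nothing left to read.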
You could also write application logs to a file and then forward the logs somewhere else; but in this scenario, have you designed a log rotation strategy so that those logs do not consume all the storage on a node? Kubernetes does not handle log rotation for you.
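As a sketch of what such a strategy involves, a node-level logrotate rule might look like the following (the path, retention count, and size limit are illustrative assumptions, not part of any service):

```shell
# Hypothetical logrotate rule for an app that writes files on the node;
# Kubernetes will not rotate these files for you. Without a rule like
# this, the files grow until they exhaust the node's storage.
cat <<'EOF' | sudo tee /etc/logrotate.d/myapp
/var/log/myapp/*.log {
    daily
    rotate 7
    maxsize 100M
    compress
    missingok
    copytruncate
}
EOF
```

Here copytruncate lets the application keep writing to the same open file handle, while daily rotation with rotate 7 caps retention at roughly a week.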
In addition to where you write logs, what happens when a pod is evicted? Or if a node crashes or is deleted? What can you do to address some of these scenarios?
You can now configure cluster-level logging for your Kubernetes cluster. Specifically, you can configure a logging system with a lifecycle that is separate from the Kubernetes platform. As a result, you are able to access system and application logs at any time, even after a container crashes or a node is deleted.
What does a cluster-level logging configuration translate to in the Cloud?
You must be able to store log data, both system logs and containerized application logs, in storage that is separate from the Kubernetes system components.
You must configure every node (worker) in a cluster with a logging agent. Specifically, this agent collects and forwards logs to an external logging backend.
You must be able to centralize log data for analysis on an external logging backend.
Why IBM’s cluster-level logging solution?
IBM’s solution offers you the following capabilities:
Remove dependencies on resources. IBM’s cluster-level logging solution does not depend on fluentd or require sidecar containers. This sets it apart from other solutions—like logging using Stackdriver or logging with Elasticsearch and Kibana—that rely on fluentd.
Configure cluster-level logging easily and quickly. You can deploy a daemon set that automatically configures all nodes in the cluster with the logging agent—this agent is a pod that forwards logs to a centralized logging system. With two commands, you have your cluster configured.
Keep your pod specifications without having to make changes to them. Log data is automatically collected by the logging agent.
Collect and aggregate system and application log data from stdout, stderr, and files on selected paths into a centralized logging system, where you can manage and analyze it.
Host your log data on IBM Cloud.
Leverage the IBM Log Analysis with LogDNA service to collect and aggregate additional logs from other sources outside the Kubernetes Service.
Configuring cluster-level logging on the IBM Cloud for Kubernetes clusters
You can implement cluster-level logging for your Kubernetes clusters on IBM Cloud by following these steps:
Provision an instance of the IBM Log Analysis with LogDNA service to configure a centralized log-management system. Log data is hosted on IBM Cloud.
Provision your clusters on the IBM Cloud Kubernetes Service. Kubernetes v1.9+ clusters are supported.
Configure the LogDNA agent on every worker (node) in a cluster.
You can configure the LogDNA agent in a cluster very quickly. Simply run two commands:
With the first command, you add a secret to your cluster. This secret contains the LogDNA ingestion key. The LogDNA ingestion key is used to authenticate the logging agent with the IBM Log Analysis with LogDNA service. It is used to open a secure web socket to the ingestion server on the logging backend system.
With the second command, you configure the LogDNA agent that is responsible for collecting and forwarding your logs.
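Put together, the two commands look roughly like this (the ingestion key is a placeholder you obtain from your service instance, and the daemon set manifest name is illustrative; use the exact manifest provided in the service instructions):

```shell
# 1. Store the LogDNA ingestion key in the cluster as a secret, so the
#    agent can authenticate with the ingestion server.
#    <INGESTION_KEY> is a placeholder for the key from your instance.
kubectl create secret generic logdna-agent-key \
  --from-literal=logdna-agent-key=<INGESTION_KEY>

# 2. Deploy the LogDNA agent as a daemon set, which schedules one agent
#    pod on every worker node. The manifest name here is illustrative.
kubectl create -f logdna-agent-ds.yaml
```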
Through these commands, you install the logdna-agent into every worker in your cluster. The agent automatically collects files with the .log extension, as well as extensionless files, that are located under /var/log. By default, logs are collected from all namespaces, including kube-system.
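To make that default collection rule concrete, this small sketch classifies a few node paths the way the agent would; the file names are made up, and only the "*.log or extensionless under /var/log" rule comes from the agent's defaults:

```shell
# Classify files the way the agent's default rule under /var/log does:
# collect *.log files and extensionless files, skip everything else.
for f in /var/log/syslog /var/log/app.log /var/log/app.json; do
  case "$(basename "$f")" in
    *.log) echo "$f: collected" ;;
    *.*)   echo "$f: skipped" ;;
    *)     echo "$f: collected (extensionless)" ;;
  esac
done
# Prints:
#   /var/log/syslog: collected (extensionless)
#   /var/log/app.log: collected
#   /var/log/app.json: skipped
```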
Finally, notice that the IBM Log Analysis with LogDNA service tails existing log files for new log data and picks up new files that appear in the specified directories.
You can use the IBM Log Analysis with LogDNA service to add enhanced logging capabilities to the IBM Cloud. For example, you could use the service to configure a central-logging solution for your Kubernetes clusters on the IBM Cloud where log data is hosted on IBM Cloud.
You can also use IBM Log Analysis with LogDNA to easily collect and aggregate logs in a centralized system. In addition, you can use it to monitor and analyze logs by customizing dashboards and defining alerts.
You will find the IBM Log Analysis with LogDNA service under the Developer Tools section in the IBM Cloud catalog once the service is available. While you wait for the service to be available, take a moment to explore the IBM Cloud catalog and the IBM Cloud Kubernetes Service.
Stay tuned for follow-up blog posts providing more details on how to use the IBM Log Analysis with LogDNA service with different services and compute resources. In the meantime, check out another blog on this new service: Increase Your Observability in the Cloud: IBM Log Analysis with LogDNA.