Configuring Logstash as an event source
When Monitoring is deployed, you can forward log data to it from Logstash. The Logstash integration is only available in Monitoring, Advanced.
Before you begin
By default, the IBM Cloud Private installer deploys an Elasticsearch, Logstash and Kibana (ELK) stack to collect system logs for the IBM Cloud Private managed services, including Kubernetes and Docker. For more information, see IBM Cloud Private logging.
Note: Ensure that you meet the prerequisites for IBM Cloud Private, such as installing and configuring kubectl, the Kubernetes command line tool.
About this task
You can configure Logstash in your IBM Cloud Private environment to forward the log data it collects and stores to Monitoring, where it is processed as event information and correlated into incidents.
Procedure
- Go to Administer > Monitoring > Integrations on the IBM Cloud Pak console.
- Click Configure an integration.
- Go to the Logstash tile and click Configure.
- Enter a name for the integration and click Copy to add the generated webhook URL to the clipboard. Ensure that you save the generated webhook URL so that it is available later in the configuration process. For example, you can save it to a file.
- Click Save.
- Modify the default Logstash configuration in IBM Cloud Private to add Monitoring as a receiver. To do this, edit the Logstash pipeline ConfigMap to add the webhook URL in the output section as follows:
- Load the ConfigMap into a file using the following command:

kubectl get configmaps logstash-pipeline --namespace=kube-system -o yaml > logstash-pipeline.yaml

Note: The default Logstash deployment ConfigMap name in IBM Cloud Private is logstash-pipeline in the kube-system namespace. If your IBM Cloud Private logging uses a different Logstash deployment, modify the ConfigMap name and namespace as required for that deployment.
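For orientation, the downloaded logstash-pipeline.yaml typically has a shape like the following sketch. The k8s.conf key name and the abbreviated pipeline contents are assumptions; check your own file for the actual key names:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-pipeline
  namespace: kube-system
data:
  # Assumed pipeline key name; your deployment may differ.
  k8s.conf: |
    input { ... }
    filter { ... }
    output {
      elasticsearch {
        index => "logstash-%{+YYYY.MM.dd}"
        hosts => "elasticsearch:9200"
      }
    }
```

The output section inside this data value is where the http destination is added in the next step.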
- Edit the logstash-pipeline.yaml file and add an http section to specify Monitoring as a destination, using the generated webhook URL. Paste the webhook URL into the url field:

output {
  elasticsearch {
    index => "logstash-%{+YYYY.MM.dd}"
    hosts => "elasticsearch:9200"
  }
  http {
    url => "<Cloud_Event_Management_webhook_URL>"
    format => "json"
    http_method => "post"
    pool_max_per_route => "5"
  }
}

Note: The pool_max_per_route value is set to 5 by default. It limits the number of concurrent connections to Monitoring to avoid data overload from Logstash. You can modify this setting as required.
- Save the file, and replace the ConfigMap using the following command:
kubectl --namespace kube-system replace -f logstash-pipeline.yaml
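Before replacing the ConfigMap, you can sanity-check the edited file locally. The following sketch creates a minimal stand-in file (logstash-pipeline.sample.yaml, a hypothetical name chosen so that it does not overwrite your real edited file) and then runs the kind of checks you would run against your own logstash-pipeline.yaml:

```shell
# Create a minimal stand-in for the edited pipeline file. In practice,
# run the grep checks below against your real logstash-pipeline.yaml.
cat > logstash-pipeline.sample.yaml <<'EOF'
data:
  k8s.conf: |
    output {
      http {
        url => "<Cloud_Event_Management_webhook_URL>"
        format => "json"
        http_method => "post"
        pool_max_per_route => "5"
      }
    }
EOF

# Confirm the http output block and webhook URL made it into the file.
grep -q 'http {' logstash-pipeline.sample.yaml && echo "http block present"
grep -q 'url =>' logstash-pipeline.sample.yaml && echo "webhook url present"
```

If either check fails on your real file, re-edit it before running the kubectl replace command.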
- Check that the update is complete at https://<icp_master_ip_address>:8443/console/configuration/configmaps/kube-system/logstash-pipeline. Note: It can take up to a minute for the configuration changes to take effect.
- To start receiving log data from Logstash, ensure that Enable event management from this source is set to On in Monitoring.