Extracting logs from the EFK stack
IBM Security QRadar® Suite Software provides an action to extract and filter the logs from your cluster's EFK logging stack. The cluster logging components are based on Elasticsearch, Fluentd, and Kibana (EFK).
Before you begin
Install the command-line interface (CLI) utility cpctl from the cp-serviceability pod. For more information, see Installing the cpctl utility.
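If you want to verify that the cp-serviceability pod is available before you install the utility, a minimal check follows. The pod lookup uses the same label selector as the copy commands later in this topic; the path to the cpctl binary inside the pod is an assumption and might differ in your release, so follow the linked installation topic for the exact steps.
# Locate the cp-serviceability pod (same label selector as the copy step later in this topic)
POD=$(oc get pod --no-headers -lrun=cp-serviceability | cut -d' ' -f1)
echo $POD
# Copy the cpctl binary to your workstation and make it executable
# (the /tmp/cpctl path inside the pod is an assumption; see Installing the cpctl utility)
oc cp $POD:/tmp/cpctl ./cpctl && chmod +x ./cpctl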
About this task
The extract_logs action queries and retrieves the cluster logs from your cluster's EFK logging stack.
The following table describes the parameters that you can use with the extract_logs action.
Parameter | Default | Description |
---|---|---|
domain | "" | Domain or IP address of the Elasticsearch service. You can find it by querying the Elasticsearch service in your cluster; see the example commands after this table. |
port | 9200 | Port where the service is hosted. Usually, the port is 9200. You can find the Elasticsearch port the same way; see the example commands after this table. |
token | "" | Token that the administrator generates on the system; see the example commands after this table. |
search | "" | Value that the search must match; for example, Kubernetes. |
hours | 0 | Number of hours to search, counted back from the time when the search is run. |
days | 0 | Number of days to search, counted back from the day when the search is run. |
min_date | "" | Date and time at which the search begins. The format must match yyyy-mm-dd HH:MM:SS, and the value must be earlier than the value of the max_date parameter. |
max_date | "" | Date and time at which the search ends. The format must match yyyy-mm-dd HH:MM:SS, and the value must be later than the value of the min_date parameter. |
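The following commands are one way to find these values. The token command matches the one used in the example later in this topic; the openshift-logging namespace and the elasticsearch service name are assumptions based on a default EFK deployment and might differ in your cluster.
# Find the domain or IP address and the port of the Elasticsearch service
# (namespace and service name are assumptions)
oc get service elasticsearch -n openshift-logging
# Generate a token for the currently logged-in user
oc whoami -t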
Procedure
Results
A report is sent to your console session. The retrieved logs are stored within the cp-serviceability pod.
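Before you copy the file, you can optionally confirm that the extract completed and that the output file exists in the pod. This check assumes the /tmp/log_output.ndjson path that is shown in the example output later in this topic.
# Confirm that the extracted log file is present in the cp-serviceability pod
POD=$(oc get pod --no-headers -lrun=cp-serviceability | cut -d' ' -f1)
oc exec $POD -- ls -lh /tmp/log_output.ndjson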
Example
The following example shows the output from running the extract_logs action with the --min_date and --max_date parameters set.
cpctl run extract_logs --domain <domain_name> --token "$(oc whoami -t)" --search kubernetes.container_name:clx-console --min_date '2020-10-20\ 00:00:00' --max_date '2020-10-21\ 00:00:00'
Executing playbook extract_logs.yaml
- localhost on hosts: localhost -
Gathering Facts...
localhost ok
extract logs - filter by hours...
extract logs - filter by days...
extract logs - filter by min and max dates...
localhost done | stdout: To copy file to your local machine run:
oc cp cp-serviceability:/tmp/log_output.ndjson ./log_output.ndjson
- Play recap -
localhost : ok=2 changed=1 unreachable=0 failed=0 rescued=0 ignored=0
To copy the extracted logs to your workstation, run the following commands:
POD=$(oc get pod --no-headers -lrun=cp-serviceability | cut -d' ' -f1)
rsync --rsh='oc rsh' -av -c --inplace --partial --append --progress $POD:/tmp/log_output.ndjson ./log_output.ndjson
The copy command is also shown in the console output when the action runs.