Kubernetes logging

The best practice for Kubernetes logging is to pipe log content to a centralized location where it is easy to access, search, and manage retention. Most commercial Kubernetes platforms include a logging stack, such as ELK (Elasticsearch, Logstash, Kibana) or EFK (Elasticsearch, Fluentd, Kibana), that collects all log data written to the console by default.

Keeping log files on volumes external to the Docker containers is unreliable and hard to track, because different replicas of the same application can come up and go down on different hosts or on the same host. This often ends in a conflict (if you map to a local drive) or in a sprawl of log files, one set for each container instance. Those log files are difficult to track across host machines or on a network drive for searching and retention management.

For these reasons, it is a good idea to direct InfoSphere® MDM log files of interest to the Docker console. This enables a third-party logging service to collect the logs for centralized storage, search, analysis, and retention management.

Viewing the Docker Console log

Use one of the following methods to view the WebSphere® Application Server logs in the Kubernetes cluster environment.
Using the kubectl command
  • If the pod hosts a single container, use the following command syntax:
    kubectl logs <pod-name> -n <namespace>
  • If the pod hosts multiple containers, use the following command syntax:
    kubectl logs <pod-name> <container-name> -n <namespace>
Using Docker commands
  1. Run the docker inspect command on the worker nodes to get the location of the container's log file.
    docker inspect --format='{{.LogPath}}' <container_id>
  2. Browse to the file location and use a text editor to view the log file.

Piping logs to the Docker Console

To pipe logs to the Docker Console, append the following lines at the end of your startup scripts before creating your pods.

For the MDM operational server image
Edit the startup.sh file in the <MDM_HOME> directory and append the following lines:
PROFILE_NAME=AppSrv01
SERVER_NAME=server1
PID=$(ps -C java -o pid= | tr -d " ")
tail -F /opt/IBM/WebSphere/AppServer/profiles/$PROFILE_NAME/logs/$SERVER_NAME/SystemOut.log --pid $PID -n +0 &
tail -F /opt/IBM/WebSphere/AppServer/profiles/$PROFILE_NAME/logs/$SERVER_NAME/SystemErr.log --pid $PID -n +0 >&2 &
while [ -e "/proc/$PID" ]; do
    sleep 1
done
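The script above follows a general pattern: replay and follow the server's log files on the container's stdout and stderr, then block so the container's main process stays alive while the Java server runs. A minimal sketch of that pattern, using a hypothetical stream_log helper and a short-lived stand-in process instead of the real server (the function and file names are illustrative, not part of the product scripts):

```shell
#!/bin/sh
# Sketch of the tail-and-wait pattern: stream a log file to the console
# for as long as a watched process is alive, then return.
stream_log() {
    logfile=$1
    watched_pid=$2
    # -F follows the file across rotation, -n +0 replays it from the
    # beginning, and GNU tail's --pid makes tail exit when the watched
    # process does.
    tail -F "$logfile" --pid "$watched_pid" -n +0 &
    # Block until the watched process exits, keeping this script (the
    # container's main process) alive, as the startup scripts above do.
    while [ -e "/proc/$watched_pid" ]; do
        sleep 1
    done
}

# Demonstration: tail a temporary file while a short-lived process runs.
demo_log=$(mktemp)
demo_out=$(mktemp)
echo "hello from the log" > "$demo_log"
sleep 2 &
stream_log "$demo_log" $! > "$demo_out"
```

Note that --pid and /proc/<pid> are Linux/GNU-specific, which matches the WebSphere container images but not every base image.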
For the IBM® Stewardship Center for MDM image
Edit the Configure_ISC.sh file in the /tmp directory and append the following lines:
WAS_PROFILE_HOME=/opt/IBM/BPM/v8.6/profiles/NodeProfilePS
SERVER_NAME=SingleClusterMember1
PID=$(ps -C java -o pid= | tr -d " ")
tail -F $WAS_PROFILE_HOME/logs/$SERVER_NAME/SystemOut.log --pid $PID -n +0 &
tail -F $WAS_PROFILE_HOME/logs/$SERVER_NAME/SystemErr.log --pid $PID -n +0 >&2 &
while [ -e "/proc/$PID" ]; do
    sleep 1
done

Example: Accessing the logs

Go to the node where the container is running and run the following commands to check the logs captured from the console output.

root@node1:~# docker inspect --format='{{.LogPath}}' 423020b44c7c
Output location:
/var/lib/docker/containers/423020b44c7c788db0dc7bdd97799059bfbab5b65f8b4605cace517735623645/423020b44c7c788db0dc7bdd97799059bfbab5b65f8b4605cace517735623645-json.log
Review the log content:
cat /var/lib/docker/containers/423020b44c7c788db0dc7bdd97799059bfbab5b65f8b4605cace517735623645/423020b44c7c788db0dc7bdd97799059bfbab5b65f8b4605cace517735623645-json.log

......

{"log":"ADMU0128I: Starting tool with the AppSrv01 profile\n","stream":"stdout","time":"2018-12-19T08:20:40.726765389Z"}
{"log":"ADMU3100I: Reading configuration for server: server1\n","stream":"stdout","time":"2018-12-19T08:20:40.727703342Z"}
{"log":"ADMU3201I: Server stop request issued. Waiting for stop status.\n","stream":"stdout","time":"2018-12-   19T08:20:43.523475999Z"}
{"log":"ADMU4000I: Server server1 stop completed.\n","stream":"stdout","time":"2018-12-19T08:20:54.138659675Z"}
{"log":"\n","stream":"stdout","time":"2018-12-19T08:20:54.140771822Z"}
{"log":"ADMU0116I: Tool information is being logged in file\n","stream":"stdout","time":"2018-12-19T08:20:56.589289061Z"}
{"log":"/opt/IBM/WebSphere/AppServer/profiles/AppSrv01/logs/server1/startServer.log\n","stream":"stdout","time":"2018-12-19T08:20:56.589323787Z"}
{"log":"ADMU0128I: Starting tool with the AppSrv01 profile\n","stream":"stdout","time":"2018-12-19T08:20:57.8524332Z"}
{"log":"ADMU3100I: Reading configuration for server: server1\n","stream":"stdout","time":"2018-12-19T08:20:57.855585484Z"}
{"log":"ADMU3200I: Server launched. Waiting for initialization status.\n","stream":"stdout","time":"2018-12-19T08:21:01.155541209Z"}
{"log":"ADMU3000I: Server server1 open for e-business; process id is 739\n","stream":"stdout","time":"2018-12-19T08:21:25.505406075Z"}