Prerequisites for loading data from domain-specific log data sources

If you configure a log connection to load data from domain-specific log data sources, such as IBM MQ or WebSphere, you must meet prerequisites that differ from those for other log systems.

Domain-specific logs must be sourced from a log management system. You must create an incoming log data connection to collect this data. The type of connection to create depends on the following factors:

  • Whether you prefer to provide the data in pull or push mode
  • Which log management system stores the domain-specific log data

Modes

Consider the following trade-offs when you decide whether to provide the data in pull or push mode.

  • Pull

    • Positives:
      • If your log management system has a REST API, you only need to create the data connection; Cloud Pak for Watson AIOps does the rest.
      • You can also use the built-in Cloud Pak for Watson AIOps connectors, which cover commonly used options.
    • Negatives: you have no control over data load. When Cloud Pak for Watson AIOps initiates the pull operation, the operation might create excessive load on your log management systems.
  • Push

    • Positives: you have full control over data load.
    • Negatives: you must create code to push your log data to the Kafka topic.

Pull mode

For log data from systems other than Falcon LogScale, LogDNA, or Splunk, you can use either the ELK or Custom data connection to ingest your log data, as shown in the following table.

Pull mode connections

Domain-specific log management system         | Data connection | Link
----------------------------------------------|-----------------|-----------------------------
Falcon LogScale                               | Falcon LogScale | Falcon LogScale connections
LogDNA                                        | LogDNA          | LogDNA connections
Splunk                                        | Splunk          | Splunk connections
Log management system that uses an ELK stack  | ELK             | ELK connections
Any other log management system               | Custom          | Custom connections
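
If your log management system exposes a REST API (for example, Elasticsearch in an ELK stack), you can confirm that the API is reachable and returns JSON log entries before you create the pull-mode data connection. The following sketch assumes a hypothetical Elasticsearch endpoint, index pattern, and credentials; substitute the values from your own environment.

```python
# A minimal pull-mode readiness check, assuming an Elasticsearch (ELK stack)
# REST API. The endpoint, index pattern, and credentials are placeholders.
import requests

ES_SEARCH_URL = "https://elk.example.com:9200/app-logs-*/_search"  # hypothetical

query = {
    "size": 5,
    "sort": [{"@timestamp": {"order": "desc"}}],
    "query": {"match_all": {}},
}

# Query the five most recent log entries to verify that the API responds
# and that the entries are available as JSON documents.
response = requests.post(
    ES_SEARCH_URL,
    json=query,
    auth=("elastic", "changeme"),  # placeholder credentials
    timeout=30,
)
response.raise_for_status()

for hit in response.json()["hits"]["hits"]:
    print(hit["_source"])
```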

Push mode

In push mode, you push your log data from any log management system to Cloud Pak for Watson AIOps by using the Kafka data connection, as shown in the following table.

Push mode connections

Domain-specific log management system | Data connection | Link
--------------------------------------|-----------------|-------------------
Any system                            | Kafka           | Custom connections
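
In push mode, you write the code that sends each JSON log entry to the Kafka topic of the data connection. The following sketch uses the kafka-python client; the broker address, topic name, and log entry values are placeholders, so use the broker and topic details from your own Kafka data connection.

```python
# A minimal push-mode sketch, assuming the kafka-python client.
# The broker address and topic name are placeholders for the values
# of your Kafka data connection.
import json
from datetime import datetime, timezone

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="kafka.example.com:9092",  # placeholder broker
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)

# Each log entry is pushed as a JSON document, not as plain text, with the
# ibm_messageId and loglevel fields at the top level (module is optional).
log_entry = {
    "@timestamp": datetime.now(timezone.utc).isoformat(),
    "ibm_messageId": "AMQ9209E",  # example IBM MQ message ID
    "loglevel": "ERROR",
    "module": "amqrmppa",
    "message": "Connection to host was closed unexpectedly.",
}

producer.send("cp4waiops-logs", log_entry)  # placeholder topic name
producer.flush()
```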

Other requirements

Log entries must be in JSON format, not in plain text.

When you prepare domain-specific log data for loading, ensure that the ibm_messageId and loglevel fields are at the top level of each JSON log entry. The module field is optional. The statistical baseline log anomaly detection algorithm uses these fields to perform statistical analysis and detect log anomalies.
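
For example, if your raw entries nest the message ID and severity inside another object, lift them to the top level before you load the data. The following sketch assumes a hypothetical WebSphere-style entry with a nested details object; the raw field names are illustrative, and only ibm_messageId, loglevel, and the optional module field matter to the algorithm.

```python
# A minimal normalization sketch: the nested "details" object and its field
# names are hypothetical; only the resulting top-level ibm_messageId,
# loglevel, and optional module fields are required.
import json

def normalize(raw: dict) -> dict:
    entry = dict(raw)
    details = entry.pop("details", {})
    entry["ibm_messageId"] = details.get("messageId", entry.get("ibm_messageId"))
    entry["loglevel"] = details.get("severity", entry.get("loglevel"))
    if "module" in details:  # module is optional
        entry["module"] = details["module"]
    return entry

raw_entry = {
    "@timestamp": "2023-05-04T10:15:30Z",
    "message": "SRVE0250I: Web Module has been bound to default_host.",
    "details": {"messageId": "SRVE0250I", "severity": "INFO", "module": "webcontainer"},
}

print(json.dumps(normalize(raw_entry), indent=2))
```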