Creating a custom data source type

Create a custom data source type to include the new custom field in addition to the existing default fields.

About this task

In Operations Analytics - Log Analysis, a data source is an entity that enables Operations Analytics - Log Analysis to ingest data from a specific source. To ingest data from Netcool®/OMNIbus, Operations Analytics - Log Analysis requires a data source.

A data source type is a template for a data source; it lists the event fields to send to Operations Analytics - Log Analysis, together with the relevant control parameters for each field. You can have multiple data source types, each with a different set of event fields. The advantage of this is that you can easily change the events that a data source sends by switching to a predefined data source type.

The default data source type is called OMNIbus1100. It contains the default set of event fields that are sent to Operations Analytics - Log Analysis.

Procedure

  1. Log in to the Operations Analytics - Log Analysis server and open a terminal there.
  2. Unzip the contents of Tivoli® Netcool/OMNIbus Insight® Pack 1.3.1 to a local directory.
    This procedure assumes that the archive has been unzipped to the following location:
    /home/user/OMNIbusInsightPack_v1.3.1
  3. Go to the docs sub-directory within the location to which you unzipped the file.
    cd /home/user/OMNIbusInsightPack_v1.3.1/docs
  4. Edit the omnibus1100_template.properties file using a text editor of your choice, for example, vi:
    vi omnibus1100_template.properties
    The omnibus1100_template.properties file contains index definitions corresponding to one or more fields to be sent using the data source. For the Netcool/OMNIbus data source all of the event fields must be indexed, so the omnibus1100_template.properties file contains an index entry for each event field to be sent to Operations Analytics - Log Analysis.
  5. Add the new custom field to the end of the omnibus1100_template.properties file.
    The following code snippet shows the beginning and the end of the file.
    # Properties file controlling data source specification
    # Add new fields at the end of the file 
    # 'moduleName' specifies the name of the data source. 
    # Update the version number if you have created a data source with the same name previously and want 
    # to upgrade it to add an additional field. 
    
    <existing index definitions, one for each of the default event fields> 
    
    # ----------------------------------------------------------------------------------------------
    # Insert new fields after this point. 
    # Number each field sequentially, starting with 'field19'. 
    # See the IBM Smart Cloud Log Analytics documentation for the DSV Toolkit for an explanation 
    # of the field values. 
    # ------------------------------------------------------------------------------------------------
    #[field19_indexConfig] 
    #name: <INDEX NAME> 
    #dataType: TEXT 
    #retrievable: true 
    #retrieveByDefault: true 
    #sortable: false 
    #filterable: true
    The end of the file includes a commented-out section that you can uncomment to add the new field. In that section, replace the value of the name: attribute with the name of the field that you are adding.
    Here is what the end of the file looks like when an index has been added for the new custom field Ticket_Number:
    # ----------------------------------------------------------------------------------------------
    # Insert new fields after this point. 
    # Number each field sequentially, starting with 'field19'. 
    # See the IBM Smart Cloud Log Analytics documentation for the DSV Toolkit for an explanation 
    # of the field values. 
    # ------------------------------------------------------------------------------------------------
    [field19_indexConfig] 
    name: Ticket_Number 
    dataType: TEXT 
    retrievable: true 
    retrieveByDefault: true 
    sortable: false 
    filterable: true
    Note: The order of indexes is important; it must match the order of values specified in the Gateway for Message Bus mapping file. This mapping file will be modified later in the procedure.
    For more information about the other index attributes, see the DSV Toolkit information in the Operations Analytics - Log Analysis documentation.
  6. Optional: Change the name of the new custom data source type that you are about to create. In the [DSV] section of the omnibus1100_template.properties file, find the moduleName attribute and change its value.
    By default moduleName is set to CloneOMNIbus. You can change this to a more meaningful name; for example, customOMNIbus.
  7. Save the omnibus1100_template.properties file and exit the file editor.
  8. From within the /home/user/OMNIbusInsightPack_v1.3.1/docs directory, run the addIndex.sh script to create the new data source type.
    addIndex.sh -i
  9. Check that the data source type was created and installed onto the Operations Analytics - Log Analysis server by running the following command:
    $UNITY_HOME/utilities/pkg_mgmt.sh -list
    Where $UNITY_HOME is the Operations Analytics - Log Analysis home directory; for example, /home/scala/IBM/LogAnalysis/.
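The file edits in steps 5 and 6 can be sketched as a short shell session. This is a minimal sketch on a stand-in copy of the template, not the real file: treating field18 as the last default stanza is an assumption inferred from the template's "start with 'field19'" comment, and the [DSV] section is reduced to its moduleName line.

```shell
# Hedged sketch of steps 5 and 6 on a stand-in copy of the template.
# Assumptions: field18 is the last default stanza (inferred from the
# "start with 'field19'" comment), and the [DSV] section is shown with
# only its moduleName line.
d=$(mktemp -d)
f="$d/omnibus1100_template.properties"
printf '%s\n' '[DSV]' 'moduleName: CloneOMNIbus' '' \
              '[field18_indexConfig]' 'name: LastDefaultField' > "$f"

# Step 5: append the new Ticket_Number index stanza at the end of the file.
cat >> "$f" <<'EOF'
[field19_indexConfig]
name: Ticket_Number
dataType: TEXT
retrievable: true
retrieveByDefault: true
sortable: false
filterable: true
EOF

# Step 6 (optional): rename the data source type to customOMNIbus.
sed -i 's/^moduleName: CloneOMNIbus$/moduleName: customOMNIbus/' "$f"

# Sanity checks: stanza numbers must be sequential (the index order must
# match the Gateway for Message Bus mapping file), and the rename took.
grep -o '^\[field[0-9]*_indexConfig\]' "$f"
grep '^moduleName:' "$f"
```

A gap or duplicate in the printed stanza numbers indicates a numbering problem that should be fixed before you run addIndex.sh.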

Results

The following two artifacts are also created. Store them in a safe place and make a note of the directory where you stored them, as you might need them later:
Insight pack image archive
By default, this archive is called CloneOMNIbusInsightPack_v1.3.1.0.zip. If you followed the suggested example in this procedure, the archive is called customOMNIbusInsightPack_v1.3.1.0.zip instead. This archive contains the new custom data source type; keep a copy of it, because you need the archive if you ever want to delete the data source type from the system. The archive is located in the following directory:
/home/user/OMNIbusInsightPack_v1.3.1/dist
Template properties file
This is the omnibus1100_template.properties file that you edited during this procedure. Keep a copy of this file in case you want to modify the data source type settings at a later time.
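The backup of the two artifacts can be sketched as follows. The paths and the customOMNIbus name are taken from the examples in this procedure; this demo uses empty stand-in files under mktemp directories so that it can run anywhere.

```shell
# Hedged sketch: keep both artifacts together in one backup directory.
# The mktemp directories stand in for the real locations from this
# procedure (/home/user/OMNIbusInsightPack_v1.3.1 and your safe storage).
src=$(mktemp -d)        # stands in for /home/user/OMNIbusInsightPack_v1.3.1
backup=$(mktemp -d)     # stands in for your backup location
mkdir -p "$src/dist" "$src/docs"
touch "$src/dist/customOMNIbusInsightPack_v1.3.1.0.zip" \
      "$src/docs/omnibus1100_template.properties"

# Copy the insight pack image archive and the edited template together.
cp "$src/dist/customOMNIbusInsightPack_v1.3.1.0.zip" \
   "$src/docs/omnibus1100_template.properties" "$backup/"
ls "$backup"
```

Make a note of the backup directory: the archive is needed to delete the data source type later, and the template is needed to modify its settings.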