Creating a manual data definition

You must create manual data definitions for real-time data sources and for all data sources other than Cloud APM, Monitoring, ITM, or Cloud Event Management.

Before you begin

About this task

Any Dashboard Designer user can create manual data definitions.

After you log in to Dashboard Designer, you can create data definitions by using any of the following options on the landing page:

  • Click CREATE COMPONENTS, and in the Create Components page, click DATA DEFINITION.
  • Click the Expand icon to open the navigation pane of Dashboard Designer, and click Data Definition > Custom > Create New Data Definition.

Procedure

Complete the following steps to create manual data definitions:

  1. In the navigation pane of Dashboard Designer, click Data Definition > Custom > Create New Data Definition.

    The New Data Definition page opens.

  2. Click the Edit icon that is displayed next to the New Data Definition field, and enter a name for the manual data definition.
  3. From the Connector Type list, select a connector type.
  4. From the Connector Source Name list, select a source name, and enter information in the following fields, based on the type of connector that you select:
    • For EXCEL, JDBC, CassandraDB, MongoDB, and HiveDB connector types, in the Query field, enter a query according to your data source schema.
      Note: To enable the multiple-selection filter option for a JDBC widget or dashboard, the data definition query must retrieve distinct values only. For more information about multiple selections, see Creating a direct or independent filter.
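A minimal sketch of a distinct-values query of the kind the note describes, using an in-memory SQLite database as a stand-in for the JDBC source; the alerts table and its columns are hypothetical, not part of the product:

```python
# Hypothetical sketch: a data definition query that retrieves distinct
# values only, as required for the multiple-selection filter.
# The table and column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alerts (node TEXT, severity INTEGER)")
conn.executemany(
    "INSERT INTO alerts VALUES (?, ?)",
    [("host1", 3), ("host1", 5), ("host2", 4)],
)

# A query of this shape is suitable for the filter: each node appears once.
rows = sorted(conn.execute("SELECT DISTINCT node FROM alerts").fetchall())
print([r[0] for r in rows])  # ['host1', 'host2']
```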
    • For RESTAPI, ElasticSearch, Google Sheet, Google BigQuery, QRadar, Nagios XI, ServiceNow, CEM, SolarWinds, OMNIbus, Druid, and Prometheus connector types, from the Method list, select a method and complete any of the following steps, based on the method that you select:
      • For GET method, in the URI field, enter the uniform resource identifier (URI) for the source.
        Note:
        • For ElasticSearch, Nagios XI, ServiceNow, Google Sheet, CEM, Prometheus, and QRadar Connectors, you can select the GET method only.
        • For Prometheus, the URI must contain the parameter name and the start and end time in Coordinated Universal Time (UTC) format or Epoch format.

          For more information about URI formats, see https://prometheus.io/docs/prometheus/2.10/querying/api/.

        • For CEM, enter the URI in an encoded format.

          For example, the following URI is an encoded URI: /api/incidentquery/v1?incident_filter=priority%20%3E%3D%201

          To generate an encoded URL, see https://www.urlencoder.org/

        • For Google Sheet, enter the URI in an encoded format.

          For example, the select query select A, sum(B) group by A must be encoded as select%20A%2C%20sum(B)%20group%20by%20A

          To generate an encoded URL, see https://developers.google.com/chart/interactive/docs/querylanguage

        • For REST API, GET method only, URI encoding is applied to special characters, such as <, >, :, ., =, _, and -, so the URI can contain any of these characters. URI encoding is not supported for the + character. If the URI contains a + character, you must replace it with a space, and then reuse the URI.
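The encoded-URI examples above can be reproduced with a standard percent-encoding routine. This sketch uses Python's urllib.parse.quote; passing safe="()" keeps parentheses literal, matching the Google Sheet example:

```python
# Percent-encode query fragments as in the CEM and Google Sheet examples
# above. urllib.parse.quote encodes spaces as %20 (not +).
from urllib.parse import quote

print(quote("priority >= 1"))
# priority%20%3E%3D%201

print(quote("select A, sum(B) group by A", safe="()"))
# select%20A%2C%20sum(B)%20group%20by%20A
```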
      • For POST method, in the URI field, enter the URI for the source, and in the Request Body field, enter the post request.
        Note: For Druid Connector, you can select the POST method only.
      • Under Custom Headers, complete the following steps:
        • In the Name field, enter the request header name that is provided by the REST API service provider.
        • In the Value field, enter the request header value that is provided by the REST API service provider.
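As an illustration of the POST method and custom headers, the following sketch builds (but does not send) such a request with Python's urllib; the URL, request body, and header values are all hypothetical placeholders:

```python
# Hypothetical sketch: a POST request with a JSON request body and a
# custom header, of the kind the RESTAPI-style connectors issue.
import json
from urllib.request import Request

body = {
    "queryType": "timeseries",        # Request Body field (illustrative)
    "dataSource": "metrics",
    "granularity": "hour",
    "intervals": ["2024-01-01/2024-01-02"],
}
req = Request(
    "https://example.com/druid/v2",   # URI field (placeholder)
    data=json.dumps(body).encode("utf-8"),
    headers={                         # Custom Headers: Name / Value pairs
        "Content-Type": "application/json",
        "Authorization": "Bearer <token>",
    },
)
print(req.get_method())  # POST (urllib infers POST when data is set)
```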
    • For Real Time Connector, enter information in the following fields:
      • In the Kafka Server IP or Host Name field, enter the IP address or hostname of the Kafka server that you want to connect to.
        Note: Authentication must not be enabled on the Kafka server.
      • In the Kafka Port Number field, enter the port number for the Kafka server.
      • In the Topic Name field, enter the topic name that you want to subscribe to.
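For illustration, the Real Time Connector fields might be completed as follows. All three values are hypothetical; 9092 is the default Kafka plaintext port, consistent with the note that authentication must not be enabled:

```
Kafka Server IP or Host Name : kafka.example.com
Kafka Port Number            : 9092
Topic Name                   : dashboard-metrics
```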
    • For HBaseDB connector type, enter information in the following fields:
      • In the Table Name field, enter a database table name from which you want to retrieve data.
        Note: If you provide the table name only, then the entire data of that table is retrieved. To retrieve selective data, you must specify Start Rowkey, End Rowkey, and either Columns or Filter Expression.
      • In the Start Rowkey field, enter the row key value from which you want to retrieve data.
      • In the End Rowkey field, enter the row key value up to which you want to retrieve the data.
        Note: Data is retrieved for all the row key values that lie between the start and the end row keys, including the start row key data. The end row key data is not retrieved.
      • In the Columns field, enter column families or columns in any of the following formats, based on your requirement:
        • "<colFamily1>,<colFamily2>": To retrieve data for all the columns within the specified column families.

          Where <colFamily1> is a column family in the HBase database table.

        • "<colFamily1>:<colQualifier_A>","<colFamily2>:<colQualifier_B>": To retrieve data for specific columns within the column families.

          Where <colFamily1> is a column family and <colQualifier_A> is a column within the column family.

        For example, if an HBase database table contains five column families, <colFamily1>, <colFamily2>, and so on up to <colFamily5>, and each column family contains three columns, such as <colQualifier_A>, <colQualifier_B>, and <colQualifier_C>, then you can enter any of the following values:
        • To retrieve data for all the columns in the column families 1 and 4, enter "<colFamily1>,<colFamily4>".
        • To retrieve data for columns A and C in the column families 1 and 4, enter "<colFamily1>:<colQualifier_A>","<colFamily1>:<colQualifier_C>","<colFamily4>:<colQualifier_A>","<colFamily4>:<colQualifier_C>".
        Note: Enter either Columns or Filter Expression.
      • In the Filter Expressions field, enter any filter expressions.

        For more information, see https://www.cloudera.com/documentation/enterprise/5-5-x/topics/admin_hbase_filtering.html
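For illustration, the HBaseDB fields might be completed as follows. The table, row keys, columns, and filter are hypothetical; the filter syntax follows the HBase filter language described at the Cloudera link above. Remember to enter either Columns or Filter Expression, not both:

```
Table Name        : sensor_data
Start Rowkey      : row0001
End Rowkey        : row0100

Columns           : "cf1:temperature","cf1:humidity"

(or, instead of Columns)

Filter Expression : SingleColumnValueFilter('cf1','temperature',=,'binary:42')
```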

    Note: If you want to set a master-listener or drill-down relation between widgets or dashboards that contain a manual data definition, ensure that the parameter name and the common metric value that you plan to specify for the master widget are already specified in the saved custom data definition of the listener or drill-down widget.

    For more information, see Setting relations.

  5. Optional: If you want to filter data, you must set a default case and add cases to the query or the REST API methods.

    Complete the following steps to set default case and add cases:

    1. In the Default Case pane, in the Attribute field, enter a filter name or attribute name, and in the Value field, enter a display name.

      You can click Add Attribute and repeat this step to add multiple attributes and display values.

    2. To add a conditional filter, click Add Case, and then set the required filter conditions by entering values in the Attribute and Value fields.
      You can add and set multiple attributes and values for each case.
      Note: For OMNIbus connector, the Value field must not contain single or double quotation marks. Otherwise, the filter does not work as expected.
  6. To save the data definition, click Save.
  7. In the Save Data Definition window, complete any of the following steps:
    1. In the Name field, enter a new name for the data definition. You can use alphanumeric characters and underscores in the name.
    2. To save the data definition to an existing category, click Existing Category, select a category from the list, and click Save.
    3. To save the data definition to a new category, click New Category, and enter a name for the new category, and click Save.

    To save the manual data definition with another name, click the Save As option.

  8. To view and validate the response that is received from the manual data definition that you created, click the Preview icon.

    The response from the manual data definition is displayed in a tabular format. If the response is in an incorrect format, it is not displayed as a table.

    Note:
    • For HBase Connector, if you enter incorrect values in the Start Rowkey or End Rowkey fields, then the data is not retrieved and a message that indicates that the data cannot be found is displayed.
    • For Real Time Connector, the Preview icon is disabled; therefore, you cannot verify the response that is received from a real-time custom data definition.

Results

The newly created manual custom data definition is listed under Most Recently Created Data Definitions in the navigation pane. The All Data Definitions page displays the following audit trail details for each custom data definition:
  • Custom data definition name
  • Custom data definition category
  • Date and time when the data definition was first created, and the username of the user who first created the custom data definition.
  • Date and time when the data definition was last modified, and the username of the user who last modified the data definition. Only the latest record is displayed.

Example

The following are the values that you can use to configure a Tivoli Netcool/OMNIbus custom data definition:
  • Endpoint URL as http://<web_service_name>/<port>
  • URI as /objectserver/restapi/alerts/status?collist={Col1},{Col2},ServerName,Serial
  • In the Default Case pane, you can enter the following values in the Attribute and Value fields:
    Table 1. Attribute and value

      Attribute    Value
      ---------    ----------
      Col1         Node
      Col2         Identifier
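Based on the example values above, the {Col1} and {Col2} placeholders in the URI resolve to the Node and Identifier columns. The following sketch reproduces that substitution; the resolution mechanism shown is an assumption drawn from the example, not documented product behavior:

```python
# Resolve the {Col1} and {Col2} placeholders with the default-case values
# from Table 1 (assumed substitution, for illustration only).
uri = "/objectserver/restapi/alerts/status?collist={Col1},{Col2},ServerName,Serial"
resolved = uri.format(Col1="Node", Col2="Identifier")
print(resolved)
# /objectserver/restapi/alerts/status?collist=Node,Identifier,ServerName,Serial
```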

What to do next

You can use the saved custom data definitions in widgets or dashboards. For more information, see Creating custom widgets or Creating a dashboard.