Datalink (Classic) REST Service Connector Guide

Applies to: Datalink (Classic) 4.0 and later

Use the REST (Representational State Transfer) Service Connector to extract data from a RESTful web-services enabled application, then load the data into your Apptio instance.

  1. Configure the connector

    Configure the REST Service connector and Apptio destination settings.

    Learn more about configuring a new connector

  2. Configure connector email preferences

    Specify your notification preferences for when the REST Service connector succeeds or fails.

    Learn more about email preferences

  3. Schedule the connector

    Enable the connector to run at specific times.

    Learn more about scheduling the connector

  4. Test the connector

    Test the connector to check for errors.

    Learn more about testing the connector

  5. Save the connector configuration

    Save the configuration and check for missing information.

    Learn more about saving the connector

  6. Execute or cancel the connector run

    Run the connector, cancel a running connector and view execution history.

    Learn more about connector runs

  7. Import Azure usage-based marketplace charges

    Import a report into your Apptio instance.

    Learn more about importing reports using the REST Service connector

  8. Troubleshooting

    Fix certificate validation errors during a connector run.

    Learn more about troubleshooting

Configure the connector

  1. Open the Datalink (Classic) Agent, then click New Connector.
  2. Click the REST Service Connector.
  3. Enter a Connector Name.
  4. Optionally, select Add this connector to a connector group. For information on connector groups, see Group multiple connectors.

Define the query information

Enter the following REST query information:

  • REST URL - The URL for your REST instance. You can replace the date element of a URL with a file name pattern (see Use variables to reference dates). For example, https://company.restservice.com/{CYYY}{MM}/data.
  • Requires Authentication - Select this check box if your REST instance requires authentication, then select an Authentication Type. If you select Custom Auth, note that REST headers are name-value pairs. For example, Accept|Charset. If you select the OAuth2.0 Auth option, see Use OAuth 2.0 authentication for detailed information.
  • Has Headers - Select this check box if your REST instance includes headers, then include the header information. REST headers are name-value pairs. For example, ACCEPT|application/xml for XML responses or ACCEPT|application/json for JSON responses.
  • Has Query Arguments - Select this check box if your REST instance includes query arguments (also known as parameters), then include the query operators. For example, option=Unix. Query arguments are name-value pairs. You can scope the values for query arguments using {MM} for the month (01-12) and {CYYY} for the four-digit calendar year (see the following section, "Use variables to reference dates"). They are resolved based on the value selected for the Time Period. A sketch of the resulting request follows this list.
    NOTE: Date variables can only be used once in a query. Any subsequent occurrences are placed into the query without substitution, which will likely cause an error. If you need to make multiple references to date variables, set a SQL variable in your query similar to SET @Datestring = {CYYY}{MM}, then reference this variable in place of {CYYY}{MM} in your query.
  • Time Period - The time period used for date-based queries.
  • JSON Passthrough - Select this check box to use JSON passthrough instead of XML conversion. When this option is selected, JSON data is uploaded to Apptio as-is instead of first being converted to XML. This option requires the Apptio instance to be version 12.2.4 or later.
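
For context, the following is a minimal Python sketch (using the requests library) of the kind of request these settings describe, with the example URL, header, and query argument from this list and {CYYY}{MM} already resolved. It is illustrative only and is not part of Datalink (Classic); the connector builds and sends the request for you.

    import requests

    # Illustrative values only, taken from the examples above; substitute your own
    # REST URL, headers, and query arguments. {CYYY}{MM} is shown already resolved.
    url = "https://company.restservice.com/202401/data"
    headers = {"ACCEPT": "application/json"}   # from the Has Headers name-value pairs
    params = {"option": "Unix"}                # from the Has Query Arguments pairs

    response = requests.get(url, headers=headers, params=params, timeout=60)
    response.raise_for_status()
    payload = response.json()                  # with JSON Passthrough, this is uploaded as-is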

Use variables to reference dates

Use variables to reference date elements that change on a regular basis. The following date variables are supported (a resolution sketch follows the list):

  • {M} - One-digit month (1, 2).
  • {MM} - Two-digit month (01, 02).
  • {MMM} - Three-digit month abbreviation (Jan, Feb).
  • {CY} - Two-digit calendar year (12, 13). A "20" prefix is assumed.
  • {CYYY} - Four-digit calendar year (2012, 2013, 2014).
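
To illustrate how these variables resolve, the following Python sketch substitutes them for a chosen time period. The resolve_date_variables helper is hypothetical and is not part of Datalink (Classic); the connector performs this substitution internally based on the selected Time Period.

    from datetime import date

    def resolve_date_variables(template, period):
        # Hypothetical helper: substitute the supported date variables for a given period.
        substitutions = {
            "{M}": str(period.month),           # 1, 2
            "{MM}": f"{period.month:02d}",      # 01, 02
            "{MMM}": period.strftime("%b"),     # Jan, Feb
            "{CY}": period.strftime("%y"),      # two digits; a "20" prefix is assumed
            "{CYYY}": str(period.year),         # 2012, 2013, 2014
        }
        for variable, value in substitutions.items():
            template = template.replace(variable, value)
        return template

    # Example: the REST URL from the previous section, resolved for January 2024.
    url = resolve_date_variables("https://company.restservice.com/{CYYY}{MM}/data", date(2024, 1, 1))
    print(url)  # https://company.restservice.com/202401/data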

Use OAuth 2.0 authentication

With OAuth 2.0 authentication, Datalink (Classic) attempts to get an access token using the values of the OAuth 2.0 input fields. The access token is then encrypted, stored, and used while executing the connector.

  1. Register Datalink (Classic) as the client application with your OAuth 2.0 source to get Client_ID and Client_Secret. If prompted for a redirect URL during registration, use your Datalink (Classic) URL (for example, https://dl-products-manager.apptio.com:443).
  2. Enter the information to retrieve the OAuth 2.0 access tokens:
    • Authorization_Url - Enter the URL used for OAuth 2.0 authorization.
    • Token_Url - Enter the URL used for retrieving and refreshing access tokens.
    • Client_ID - Enter the Client_ID for the registered Datalink (Classic) application.
    • Client_Secret - Enter the Client_Secret value for the registered Datalink (Classic) application.
    • Grant_Type - Select an OAuth 2.0 grant type to retrieve access tokens:
      • token - Use the implicit grant mechanism.
      • code - Use the authorization code grant mechanism.
      • client_credentials - Use the client credentials grant mechanism.
    • Scope - Optionally, assign a scope to limit the amount of access granted to an access token. For example, an access token issued to a client app may be granted READ and WRITE access or just READ access.
    • Optionally, select Add Optional Authorization Parameters to enter additional key-value parameters to the access tokens.
  3. Click Get OAuth2.0 Tokens. You are redirected to the OAuth 2.0 authorization access token web address, granted access, and then returned to the connector configuration page. The access tokens are saved as credentials for the data source and used to retrieve the data from the specified Rest_URL. A minimal token-request sketch follows these steps.
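
For context on what happens when tokens are retrieved, the following Python sketch shows a client_credentials grant, the simplest of the three mechanisms listed above. All URLs and credential values are placeholders for your own OAuth 2.0 source, and the code and token (implicit) grants additionally involve the Authorization_Url and a browser redirect, which are not shown. The sketch is illustrative only; Datalink (Classic) performs these steps for you.

    import requests

    # Placeholder values; use the Token_Url, Client_ID, Client_Secret, and Scope
    # registered with your own OAuth 2.0 source.
    token_url = "https://auth.example.com/oauth2/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": "YOUR_CLIENT_ID",
        "client_secret": "YOUR_CLIENT_SECRET",
        "scope": "READ",               # optional; limits what the token may access
    }

    response = requests.post(token_url, data=payload, timeout=60)
    response.raise_for_status()
    access_token = response.json()["access_token"]

    # The token is then typically presented as a bearer credential when calling the Rest_URL.
    data = requests.get(
        "https://company.restservice.com/202401/data",
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=60,
    )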

TIPS

  • Datalink (Classic) will attempt to automatically refresh an expired OAuth2.0 access token.
  • To manually refresh the access token, edit the connector, click Get OAuth2.0 Tokens, then save the connector.

Configure the Apptio destination settings

Link your connector to your Apptio destination. From within the connector, click Apptio Destination on the left menu to jump to these settings. If the connector is part of a connector group, the Apptio Destination settings for Apptio Instance, Domain, Project, and Branch are disabled and instead inherited from the group settings.

Apptio Version - Select your destination Apptio version:

  • R11:
    • Host name - Enter the Apptio server URL. Select Use SSL to encrypt the data using SSL (Secure Sockets Layer).
    • Username - Enter the user name that will be used to log into the Apptio instance (for example, JoeUser@apptio.com).
    • Password - Click Edit Password to enter or edit the password associated with the username.
  • R12 with Frontdoor - Select the Apptio Instance from the list of valid Access Administration applications.

Domain - Enter the domain name of the destination Apptio instance.

Project - Enter the name of the destination project.

Table - Enter the name of the destination table. This is the name of the table only, not the URL or path information. If the table does not exist, it will be created.
NOTE : Some connectors do not display this option when the target table is predefined as part of an application.

Source System - Enter the name of the source system for the data.

Time Period - Select the time period to load. Select the current or previous month based on the date the load occurs, or select a specific month and year.
NOTE: The Offset from Current Month option loads to a period that is offset from the source month. A positive offset indicates the number of future periods, and a negative offset indicates the number of past periods. For example, with a source month of March and an offset of -1, the data is loaded into February.

Validation - Select this check box to validate the uploaded data using the existing dataset. If the columns do not match, the uploaded data will be held in a staging area but will not be added to the dataset. An email is sent to Datalink (Classic) Admin users notifying them that the upload did not pass validation. In this case, the Admin can go to the TBM Studio Data tab and manually approve the upload. It is a best practice to validate the data before you upload it. Learn more about Email Notification Workflow for Validation Failure.

Disable Logins - Select this check box to disable logins to the destination Apptio instance. This option is available only when the destination Apptio version is R11.

Advanced Options - Click to set the following advanced options:

  • Data Encoding - Select the character encoding to use with queries. This option is available only when the destination Apptio instance is R12.
  • Category - Enter a category for the table. This option is available only when the destination Apptio instance is R12.
  • Branch - If using the branching feature in TBM Studio R12, enter a branch name to upload data into that branch. For more information, see Branch projects.
    • This feature requires R12.5 or later. If using an earlier version of R12, the setting is ignored, and the data is loaded into the trunk.
    • If the branch name is misspelled or no longer exists (for example, the branch is closed, and a subsequent build completes), an error is reported.
  • Transformation - Select from the following:
    • Append - Add the data to the end of the destination table.
    • Overwrite - Replace the existing data in the table with the new data.

Configure connector email notifications

After you configure email notifications for your Datalink (Classic) Agent (see Configure agent email notifications; the link requires TBM Connect credentials), you can configure email notifications for each connector. You can specify that an email notification be sent when a connector succeeds or fails. Click Notify or Notify & Archive in the left menu to jump to the email configuration settings; otherwise, scroll down to the Notify or Notify & Archive section.

  1. Select your email preferences:
    • Send an email notification when this connector succeeds.
    • Send an email notification when this connector fails.
  2. In the To box, enter one or more email addresses, separated by a comma or semicolon. Enter a subject in the Subject box.

Schedule the connector

Schedule the connector to run at specific times.

Set up and enable the schedule

  1. From within the connector, click Schedule on the left menu to jump to these settings.
  2. Select Enable Schedule.
  3. Enter your schedule options:
    • Frequency sets the schedule options.
    • Run dates are in your selected time zone and, upon save, are displayed in the Next Run column in the connector list. If you do not select a time zone, UTC is used.

Test the connector

Click Test in the upper-right corner to test the connector. If there is an error, it is flagged, and you can jump directly to that part of the connector definition. Clicking Test saves any changes made to the connector.

Save the connector configuration

Click Save in the upper-right corner to save the connector. If any information is missing from the configuration, it is flagged for you.

Execute or cancel the connector run

  • To run the connector, open the Datalink (Classic) Agent, click Actions next to the connector, then click Run Now.
  • To cancel a currently running connector, open the Datalink (Classic) Agent, click Actions next to the connector, then click Cancel Run.

View execution history

View a report of execution results that lists the sources, destination, target period, execution start and end time, an explanation of the results, and the number of uploaded bytes for each run of the connector.

  1. Open the Datalink (Classic) Agent, click Actions next to the connector, then click View Execution History.
  2. By default, the Execution History displays run information for the previous three months. To view up to 12 months of history, change the values in the From and Until fields, then click Get Execution Records .

Import Azure usage-based marketplace charges

Enterprise Azure customers have access to the reporting API, which allows access to reports. Use the REST Service Connector (see Configure the connector) to import Azure billing reports into your Apptio instance. For example, if a Marketplace Store Charge report is not accessible using the Azure Connector, you can use the REST Service Connector to import that report into your Apptio instance.

SEE ALSO: Azure API help on microsoft.com.

Import a report into your Apptio instance

  1. Obtain an authorization key for the Azure API (see Azure Billing Enterprise APIs in the Microsoft documentation).
  2. Create a new REST Service Connector in the Datalink (Classic) default agent (see Configure the connector).
  3. In the REST URL field of the connector, enter the URL with your enrollment number.
  4. Replace the time period with {CYYY}{MM}. For example: https://consumption.azure.com/v2/enrollments/<azure enrollment number>/billingPeriods/{CYYY}{MM}/marketplacecharges.
  5. Create an Authorization Key, enter the value of your API key, then finish configuring the connector. A sketch of the resulting request follows these steps.
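
For context, the following Python sketch shows the kind of request this configuration produces for a single billing period. The enrollment number, API key, and period are placeholders, and the sketch assumes the key is sent as a bearer token in the Authorization header; check the Microsoft documentation for the exact header the Enterprise Reporting API expects. Datalink (Classic) builds and sends the request for you.

    import requests
    from datetime import date

    # Placeholder values; substitute your own enrollment number and API key.
    enrollment = "1234567"
    api_key = "<azure api key>"
    period = date(2024, 1, 1)          # corresponds to {CYYY}{MM}

    url = (f"https://consumption.azure.com/v2/enrollments/{enrollment}"
           f"/billingPeriods/{period:%Y%m}/marketplacecharges")

    response = requests.get(url, headers={"Authorization": f"Bearer {api_key}"}, timeout=120)
    response.raise_for_status()
    charges = response.json()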

Query details

The Time Period selection in the connector currently allows only the previous and current month. A new connector with query parameters can be set up to access historical Azure billing data. In this case, the configuration is similar to the previous one, except:

  • The REST URL is different.
  • The Query arguments specify a specific time range.
    NOTE: Year and month parameters like {CYYY} and {MM} cannot be used.

Troubleshooting

Troubleshoot a certificate validation error received during a connector run

Fix the validation issue if an on-premises agent connector encounters the following error:

javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

This error means the Datalink (Classic) service was unable to establish root certificate authority (CA) trust for the served certificate from the Java keystore.

  1. Download (or export to a file) the certificate being served. One way to retrieve it is shown in the sketch after these steps.
  2. Navigate to your Datalink (Classic) installation folder.
  3. Run the following command to import the certificate file to dlagent.jks:

    keytool -import -alias <pickAUniqueAliasNameForCertificate> -keystore <pathTo_dlagent.jks_fileWhereCertNeedsToBeImported> -file <certificateFileToBeImported.crt>

  4. Verify the certificate is added to the keystore by running the following command:

    keytool -list -v -keystore <pathTo_dlagent.jks_file> -alias <exampleAlias>

  5. Restart the Datalink (Classic) Agent.
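
One possible way to capture the served certificate for step 1 is the following Python sketch, which uses the standard-library ssl module to write the certificate the server presents to a PEM file. It is not part of Datalink (Classic), it retrieves only the leaf certificate (not the full chain), and the host name is a placeholder for the host in the failing REST URL.

    import ssl

    # Placeholder host and port; use the host named in the failing REST URL.
    host, port = "company.restservice.com", 443

    # Fetch the certificate presented by the server and write it out in PEM form,
    # ready to import into dlagent.jks with the keytool command above.
    pem_cert = ssl.get_server_certificate((host, port))
    with open("certificateFileToBeImported.crt", "w") as cert_file:
        cert_file.write(pem_cert)
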
Note:

When you update your on-premises agent, the new version overwrites the dlagent.jks file. To avoid this error recurring, back up the dlagent.jks file before the update and restore the backup after the installation completes.