Hive via Execution Engine for Hadoop connection

You can create a connection asset for Hive via Execution Engine for Hadoop.

Use the Hive via Execution Engine for Hadoop connection to connect to tables in a Hive warehouse on the Hadoop cluster.

Prerequisites

Supported encryption

Credentials: Platform login credentials

Create a Hive via Execution Engine for Hadoop connection to the Hive warehouse on the Hadoop cluster

  1. Click Add to project > Connection.
  2. Select Hive via Execution Engine for Hadoop.
  3. Enter a name, a description, and the connection information.
  4. Select your platform login credentials.
    Note: For other users to use the connection, they must supply their own Cloud Pak for Data credentials.
  5. In the Jar uris drop-down list, upload the HiveJDBC41.jar file if it is not already listed, and then select it.
  6. In the SSL Certificate field, enter the SSL certificate from the connection URL. If the Hive server is also SSL-enabled, add the server's certificate after it.
    Example with two certificates:
    -----BEGIN CERTIFICATE-----
    certificate from the connection URL
    -----END CERTIFICATE-----
    -----BEGIN CERTIFICATE-----
    certificate from the Hive server
    -----END CERTIFICATE-----
    
  7. Enter the URL for accessing the Hadoop Integration Service.
    Important: The Hadoop Integration Service URL must be the same as the URL in the Hadoop Registration Details. The administrator can confirm the URL from Administration > Platform configuration > System integration.
  8. Click Create.
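The two-certificate value in step 6 is a plain PEM bundle: each certificate body wrapped in BEGIN/END markers, concatenated in order. As a minimal sketch (the certificate bodies below are placeholders, not real certificates), assembling such a bundle looks like this:

```python
def build_pem_bundle(*certs: str) -> str:
    """Wrap each base64 certificate body in PEM BEGIN/END markers
    and concatenate the results, one block per certificate."""
    blocks = []
    for body in certs:
        blocks.append(
            "-----BEGIN CERTIFICATE-----\n"
            f"{body.strip()}\n"
            "-----END CERTIFICATE-----"
        )
    return "\n".join(blocks)

# Placeholder bodies: paste the certificate from the connection URL first,
# then the Hive server certificate, matching the order shown in step 6.
bundle = build_pem_bundle("MIIC-connection-url-cert", "MIIC-hive-server-cert")
print(bundle.count("-----BEGIN CERTIFICATE-----"))  # 2
```

Pasting the resulting text into the SSL Certificate field reproduces the two-certificate example shown in step 6.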

Next step: Add data assets from the connection
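Under the covers, the connection reaches HiveServer2 over JDBC with the uploaded HiveJDBC41.jar driver. As a rough sketch of what the connection URL entered in step 3 typically looks like (the host, port, and database here are hypothetical, and the exact SSL property name varies by driver):

```python
def hive_jdbc_url(host: str, port: int, database: str, ssl: bool = True) -> str:
    """Build a HiveServer2-style JDBC URL. Apache Hive drivers commonly
    use ';ssl=true'; some vendor drivers use a different property name."""
    url = f"jdbc:hive2://{host}:{port}/{database}"
    if ssl:
        url += ";ssl=true"
    return url

print(hive_jdbc_url("hadoop-master.example.com", 10000, "default"))
# jdbc:hive2://hadoop-master.example.com:10000/default;ssl=true
```

Check your driver's documentation for the authoritative list of URL properties before relying on this shape.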

Where you can use this connection

You can use a Hive via Execution Engine for Hadoop connection in the following workspaces and tools:

Analytics projects

Catalogs

Restrictions

Known issues

Troubleshooting Hadoop environments

Parent topic: Supported connections