How to use App Connect with Databricks
Databricks is a cloud-based platform for big data and AI. It builds on Apache Spark and integrates with cloud services to simplify data engineering, analytics, and machine learning.
Supported product and API versions
To find out which product and API versions this connector supports, see Detailed System Requirements on the IBM Support page.
Connecting to Databricks
Complete the connection fields that you see in the App Connect Designer page (previously the Catalog page) or flow editor. If necessary, work with your Databricks administrator to obtain these values.
- Account name is a meaningful name that helps you to identify your account.
To obtain the connection values for Databricks, see Obtaining connection values for Databricks.
To connect to a Databricks endpoint from the App Connect Designer Applications and APIs page for the first time, expand Databricks, then click Connect. For more information, see Managing accounts.
Before you use the account that is created in App Connect in a flow, rename the account to something meaningful that helps you to identify it. To rename the account on the Applications and APIs page, select the account, open its options menu (⋮), then click Rename Account.
General considerations
Before you use App Connect Designer with Databricks, take note of the following considerations:
- You can see lists of the trigger events and actions that are available on the Applications and APIs page of the App Connect Designer.
For some applications, the events and actions depend on the environment and whether the connector supports configurable events and dynamic discovery of actions. If the application supports configurable events, you see a Show more configurable events link under the events list. If the application supports dynamic discovery of actions, you see a Show more link under the actions list.
- If you are using multiple accounts for an application, the set of fields that is displayed when you select an action for that application can vary for different accounts. In the flow editor, some applications always provide a curated set of static fields for an action. Other applications use dynamic discovery to retrieve the set of fields that are configured on the instance that you are connected to. For example, if you have two accounts for two instances of an application, the first account might use settings that are ready for immediate use. However, the second account might be configured with extra custom fields.
Events and actions
Databricks events
These events are for changes in this application that trigger a flow to start completing the actions in the flow.
Show more configurable events: The events that are shown by default are preconfigured to use optimized connectivity. More events become available after you configure events that trigger a flow by polling this application for new or updated objects.
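App Connect performs this polling internally, so the following is only an illustrative sketch of the general pattern: keep a watermark of the newest update seen, and on each cycle fetch only objects updated after it. The in-memory store and field names are placeholders, not part of the connector.

```python
from datetime import datetime, timezone

# Stand-in data set; in practice this would be the connected application.
STORE = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]

def fetch_since(ts):
    """Stand-in for an application API query: objects updated after ts."""
    return [o for o in STORE if o["updated_at"] > ts]

def poll(last_seen):
    """One polling cycle: return new/updated objects and the advanced watermark."""
    hits = fetch_since(last_seen)
    if hits:
        last_seen = max(o["updated_at"] for o in hits)
    return hits, last_seen

watermark = datetime(2024, 1, 1, 12, tzinfo=timezone.utc)
hits, watermark = poll(watermark)
print([o["id"] for o in hits])  # only objects updated after the watermark
```

Each object returned by a cycle would start one run of the flow; the watermark ensures the same update does not trigger the flow twice.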
Databricks actions
Your flow completes these actions on this application.
| Object | Action | Description | Connector API documentation link |
|---|---|---|---|
| Files | Delete file | Deletes a file | https://docs.databricks.com/api/workspace/files |
| Files | Download file | Downloads a file | https://docs.databricks.com/api/workspace/files |
| Files | Upload file | Uploads a file | https://docs.databricks.com/api/workspace/files |
| Volumes | Retrieve volumes | Retrieves volumes from the schema | https://docs.databricks.com/api/workspace/volumes/list |
More items are available after you connect App Connect to Databricks.
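Under the covers, these actions map to the Databricks REST APIs linked in the table: the Files API (`/api/2.0/fs/files{path}`) and the Unity Catalog volumes list endpoint. The sketch below builds the request method and URL for each action using only the Python standard library; the workspace URL and volume path are placeholders, and authentication (a bearer token header) is omitted.

```python
from urllib.parse import quote, urlencode

def files_request(host: str, action: str, file_path: str) -> tuple[str, str]:
    """Return (HTTP method, URL) for a Databricks Files API action."""
    methods = {"upload": "PUT", "download": "GET", "delete": "DELETE"}
    # The file path, including its leading slash, is appended to the endpoint.
    return methods[action], f"{host}/api/2.0/fs/files{quote(file_path)}"

def volumes_list_request(host: str, catalog: str, schema: str) -> tuple[str, str]:
    """Return (method, URL) for listing the volumes in a schema."""
    query = urlencode({"catalog_name": catalog, "schema_name": schema})
    return "GET", f"{host}/api/2.1/unity-catalog/volumes?{query}"

host = "https://example.cloud.databricks.com"  # placeholder workspace URL
method, url = files_request(host, "upload", "/Volumes/main/default/vol1/report.csv")
print(method, url)
print(*volumes_list_request(host, "main", "default"))
```

Sending these requests with an `Authorization: Bearer <token>` header reproduces what the connector's Delete file, Download file, Upload file, and Retrieve volumes actions do.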