Updating data source or storage credentials

To update the data source credentials, use one of the following methods:

This procedure applies to both watsonx.data Developer edition and watsonx.data on Red Hat® OpenShift®.

Procedure

  1. Update the data source credentials in list view.
    1. On the Data sources tab, click the overflow menu and then click Update credentials.
    2. In the Update credentials window, enter the data source username and password.
    3. Click Update.
  2. Update the data source credentials in topology view.
    1. Hover over the data source for which you want to update the credentials.
    2. Click the Update credentials icon.
    3. In the Update credentials window, enter the data source username and password.
    4. Click Update.
    Behavior:
    • Presto: Update credentials is disabled for storage buckets and data sources that are associated with Presto. To update the credentials, disassociate the storage bucket or data source, update the credentials, and then associate it again (see the sketch after this list).
    • Spark: Update credentials is disabled for storage buckets and data sources that are associated with Spark. To update the credentials, follow the same sequence: disassociate, update the credentials, and then associate again.
    • Milvus: For external home buckets, Update credentials is enabled, but a manual restart of the Milvus engine is required for the changes to take effect.

      For IBM managed buckets, there is no option to update credentials.
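
For a storage bucket or data source that is associated with a Presto or Spark engine, the credential refresh is always a three-step sequence: disassociate, update, re-associate. The following Python sketch only illustrates that order; the base URL, token, endpoint paths, and payload fields are hypothetical placeholders, not the actual watsonx.data API. See the API reference noted below for the real requests.

```python
import requests

# Placeholders: replace with your watsonx.data host and bearer token.
BASE_URL = "https://<watsonx-data-host>/api"
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}


def update_associated_source_credentials(engine_id: str, source_id: str,
                                         username: str, password: str) -> None:
    """Illustrate the required order for a Presto- or Spark-associated source.

    The endpoint paths and payload fields are hypothetical placeholders;
    consult the watsonx.data API reference for the actual resources.
    """
    # 1. Disassociate the storage bucket or data source from the engine.
    requests.delete(
        f"{BASE_URL}/engines/{engine_id}/associations/{source_id}",
        headers=HEADERS, timeout=30,
    ).raise_for_status()

    # 2. Update the stored credentials while the source is unassociated.
    requests.patch(
        f"{BASE_URL}/data_sources/{source_id}/credentials",
        json={"username": username, "password": password},
        headers=HEADERS, timeout=30,
    ).raise_for_status()

    # 3. Re-associate the storage bucket or data source with the engine.
    requests.post(
        f"{BASE_URL}/engines/{engine_id}/associations",
        json={"source_id": source_id},
        headers=HEADERS, timeout=30,
    ).raise_for_status()
```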

    Related API: For information about the related API, see
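
As a rough programmatic counterpart to the UI steps above, updating credentials is typically a single authenticated request. The sketch below is an assumption-laden illustration: the host, token, endpoint path, and payload fields are placeholders, and the actual request format is defined in the watsonx.data API reference.

```python
import getpass
import requests

BASE_URL = "https://<watsonx-data-host>/api"   # placeholder host
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder token


def update_credentials(source_id: str, username: str, password: str) -> None:
    """Send one credential-update request; endpoint and payload are hypothetical."""
    response = requests.patch(
        f"{BASE_URL}/data_sources/{source_id}/credentials",
        json={"username": username, "password": password},
        headers=HEADERS,
        timeout=30,
    )
    response.raise_for_status()


if __name__ == "__main__":
    # Prompt for the password so that it is not hard-coded or logged.
    update_credentials(
        source_id="<data-source-id>",
        username="<data-source-username>",
        password=getpass.getpass("Data source password: "),
    )
    # For a Milvus external home bucket, manually restart the Milvus engine
    # afterward so that the new credentials take effect (see the note above).
```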