Managing Analytics Engine Powered by Apache Spark instances
On the details page of an instance, you can view information about the Spark instance, manage user access to the instance, or delete the instance. A user with the Administrator or Developer role can view instance details.
To manage a service instance for Analytics Engine Powered by Apache Spark:
- From the Navigation menu on the IBM Cloud Pak for Data web user interface, click **Services > Instances**, find the instance, and click it to view the instance details. These include:
  - The storage claim name
  - The endpoint to start and stop the Spark history server
  - The URL of the Spark history server user interface
  - The Spark kernel endpoint
  - The Spark job v4 endpoint
  - The Spark job v3 endpoint (deprecated)
  - The Spark job v2 endpoint (deprecated; will be removed in Cloud Pak for Data 4.7.0)
- If `spec.serviceConfig.sparkAdvEnabled` is enabled in the Analytics Engine custom resource (CR), you will also see:
  - The name of the deployment space
  - The deployment space ID
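As an example of how these details are used, a job submission against the Spark job v4 endpoint might look like the following sketch. The endpoint URL, file paths, and payload field names (`application_details`, `application`, `arguments`) are illustrative assumptions; use the actual endpoint shown on your instance details page and the Spark jobs API reference for your release.

```python
import json
import urllib.request

# Hypothetical values: copy the real Spark job v4 endpoint from your
# instance details page and generate your own access token.
JOBS_V4_ENDPOINT = "https://cpd.example.com/v4/analytics_engines/<instance-id>/spark_applications"
ACCESS_TOKEN = "<your-access-token>"


def build_job_payload(application, arguments=None):
    """Build a minimal Spark application payload.

    The field names used here are assumptions for illustration; check
    the Spark jobs API reference for the exact schema of your release.
    """
    return {
        "application_details": {
            "application": application,
            "arguments": arguments or [],
        }
    }


def submit_spark_job(endpoint, token, payload):
    """POST the payload to the Spark job v4 endpoint and return the response."""
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)


# Illustrative application path; replace with a file in your instance storage.
payload = build_job_payload(
    "wordcount.py",
    arguments=["people.txt"],
)
# submit_spark_job(JOBS_V4_ENDPOINT, ACCESS_TOKEN, payload)
```

Only users who were granted the Developer role on the instance can submit jobs this way; see Managing user access.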
- From the options menu on the right side of the window, you can:
  - **Manage access**: Only a user with the Administrator role can manage user access to Analytics Engine Powered by Apache Spark instances. From here, an administrator can grant users the Developer role on the instance so that they can submit Spark jobs. See Managing user access.
  - **Delete**: Only a user with the Administrator role can delete an Analytics Engine Powered by Apache Spark instance.
    Important: If `spec.serviceConfig.sparkAdvEnabled` is set to `true` in the custom resource (CR), you must delete the deployment space that is associated with the instance before you can create another instance with the same name. Note that when you delete the deployment space, you also delete all assets and jobs in that space.

    To delete a deployment space:
    1. From the Navigation menu on the Cloud Pak for Data web user interface, click **Deployments**.
    2. On the Spaces tab, search for the space named `_space`. From the Actions menu on the right, select **Delete**.

    If you can't delete the deployment space, check whether any jobs are stuck in the Starting state. See Troubleshooting for Analytics Engine Powered by Apache Spark for how to remove jobs stuck in the Starting state.
    Note: The data files in the instance user's `home` directory, which is created when the Analytics Engine Powered by Apache Spark instance is provisioned, are not deleted when the instance is deleted. You must delete this data yourself.
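To find jobs stuck in the Starting state before deleting a space, you can filter the application list returned by the Spark jobs endpoint. The `id` and `state` field names in this sketch are assumptions about the response schema; check the response of your release's jobs API.

```python
def find_stuck_jobs(applications, stuck_state="starting"):
    """Return the IDs of applications whose state matches stuck_state.

    `applications` is the parsed JSON list returned by the Spark jobs
    endpoint; the "id" and "state" field names are assumptions made
    for illustration.
    """
    return [
        app["id"]
        for app in applications
        if app.get("state", "").lower() == stuck_state
    ]


# Example with a mocked response:
sample = [
    {"id": "job-1", "state": "FINISHED"},
    {"id": "job-2", "state": "STARTING"},
]
print(find_stuck_jobs(sample))  # prints ['job-2']
```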
Generating an access token
All users must generate their own access token to use the Spark jobs API. For instructions on how to generate an access token, see Generating an authorization token or API key.
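Once generated, the token is passed as a bearer token in the Authorization header of every Spark jobs API call. As one hedged sketch, the Cloud Pak for Data platform authorization endpoint (`/icp4d-api/v1/authorize`) can be called as follows; the cluster URL is a placeholder, and you should verify the endpoint path and response fields against your release.

```python
import json
import urllib.request

CPD_URL = "https://cpd.example.com"  # placeholder cluster URL


def build_auth_request(cpd_url, username, password):
    """Build the POST request for the platform authorization API.

    The /icp4d-api/v1/authorize path and the username/password body are
    based on the Cloud Pak for Data platform API; verify them against
    the release you are running.
    """
    return urllib.request.Request(
        f"{cpd_url}/icp4d-api/v1/authorize",
        data=json.dumps({"username": username, "password": password}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def get_access_token(cpd_url, username, password):
    """Return the bearer token to pass in the Authorization header."""
    request = build_auth_request(cpd_url, username, password)
    with urllib.request.urlopen(request) as response:
        return json.load(response)["token"]


# token = get_access_token(CPD_URL, "<username>", "<password>")
```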
What to do next
Parent topic: Administering Analytics Engine Powered by Apache Spark