Canceling a running Spark application

You can cancel a running Apache Spark application, for example, if it was submitted by mistake or if it has stalled.

Canceling a Spark application using the Db2 Warehouse web console

To cancel an application:
  1. Open the Db2® Warehouse web console.
  2. Click Monitor > Workloads.
  3. Click the Spark tab. This page displays the user names of the clusters that you are authorized to monitor and the number of applications that are currently running in each cluster.
  4. Click a user name to open the Spark monitoring page for the corresponding cluster.
  5. Click the kill link for the application that you want to cancel.

Canceling a Spark application using the spark-submit.sh script

Determine the submission ID of the application by using one of the methods described in Monitoring Spark applications. Then, use the spark-submit.sh script to issue a --kill command for the application.

For example, to cancel the application with the submission ID 20160615124014699000, issue the following command:
spark-submit.sh --kill 20160615124014699000

Canceling a Spark application using the IDAX.CANCEL_APP stored procedure

Determine the submission ID of the application by using one of the methods described in Monitoring Spark applications. Then, from within a database connection, issue a CALL statement that calls the IDAX.CANCEL_APP stored procedure.
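For example, to cancel the application with the submission ID 20160615124014699000, a CALL statement along the following lines should work (a sketch that assumes the submission ID is passed as the procedure's single argument; see the IDAX.CANCEL_APP reference for the exact signature):
-- assumes the submission ID is the procedure's single argument
CALL IDAX.CANCEL_APP('20160615124014699000')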

Canceling a Spark application using the IBM Db2 Warehouse Analytics API

Use the IBM® Db2 Warehouse Analytics API to submit an HTTP POST request to the /dashdb-api/analytics/public/apps/cancel endpoint, passing the submission ID in the submissionid query parameter. For example, issue the following cURL command (replace the user ID, password, and host name with your own values):
curl -k --user "userid:password" \
  -X POST "https://hostname:8443/dashdb-api/analytics/public/apps/cancel?submissionid=20160615124014699000"