Deleting Spark application code
You can delete Apache Spark files that you no longer need.
For example:
- To delete the file subdir1/myapp.jar from your $HOME/spark/apps directory, issue the following command:
  spark-submit.sh --delete-file apps subdir1/myapp.jar
- To delete the file mylib.jar from your $HOME/spark/defaultlibs directory, issue the following command:
  spark-submit.sh --delete-file defaultlibs mylib.jar
- To delete the file genlib.jar from the /globallibs directory, issue the following command:
  spark-submit.sh --delete-file globallibs genlib.jar
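The three commands above share one shape: spark-submit.sh --delete-file, a target directory, and a file path relative to it. A minimal sketch of a wrapper, assuming spark-submit.sh is on your PATH (the delete_spark_file helper and the DRY_RUN switch are illustrative, not part of the product):

```shell
#!/bin/sh
# Hypothetical wrapper around the delete command shown above.
# DRY_RUN=1 prints the command instead of running it, so the
# sketch can be inspected without a live Db2 Warehouse host.
delete_spark_file() {
  dir="$1"   # target directory: apps, defaultlibs, or globallibs
  file="$2"  # file path relative to that directory
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "spark-submit.sh --delete-file $dir $file"
  else
    spark-submit.sh --delete-file "$dir" "$file"
  fi
}

DRY_RUN=1
delete_spark_file apps subdir1/myapp.jar
```

With DRY_RUN=1 this prints the exact command line from the first example; unset it to perform the real deletion.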
Using a REST API call
Alternatively, use the IBM® Db2® Warehouse API to submit an HTTP DELETE request that calls the /home endpoint. For example, to delete the file idax_examples.jar, issue the following cURL command (replace the user ID, password, and host name):
curl -k --user "userid:password" -X DELETE "https://hostname:8443/dashdb-api/home/spark/apps/idax_examples.jar"
If the $HOME/spark/apps folder is empty, you can delete it. For example, issue the following cURL command (replace the user ID, password, and host name):
curl -k --user "userid:password" -X DELETE "https://hostname:8443/dashdb-api/home/spark/apps"
Only the last directory (/apps) is deleted; the /spark directory is not deleted. The $HOME/spark/apps directory is re-created automatically when you deploy new Spark application code, as described in Deploying Spark application code and dependencies to Db2 Warehouse.
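The two cURL calls differ only in the path under /dashdb-api/home. A minimal sketch that builds those URLs, assuming the endpoint shown above (the home_delete_url helper and the DBHOST, DBUSER, and DBPASS placeholders are illustrative; the live calls are left commented out so the sketch can be read without a server):

```shell
#!/bin/sh
# Placeholders -- replace with your host name, user ID, and password.
DBHOST="hostname"
DBUSER="userid"
DBPASS="password"

# Build the DELETE URL for a path under $HOME on the server,
# following the /dashdb-api/home endpoint shown above.
home_delete_url() {
  echo "https://${DBHOST}:8443/dashdb-api/home/$1"
}

# Delete the file first, then the now-empty apps directory.
# -k skips certificate verification, as in the examples above.
# curl -k --user "${DBUSER}:${DBPASS}" -X DELETE "$(home_delete_url spark/apps/idax_examples.jar)"
# curl -k --user "${DBUSER}:${DBPASS}" -X DELETE "$(home_delete_url spark/apps)"

home_delete_url spark/apps/idax_examples.jar
```

Run as-is, the sketch only prints the URL for the first example; uncomment the curl lines to issue the real requests.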