Launching a Spark application

After your Apache Spark application code has been deployed to your $HOME/spark/apps directory and tested, you can launch the Spark applications that it contains, either manually or programmatically. Launching a Spark application creates a Spark cluster for the user or calling program (if one does not already exist) and runs the application in that cluster.

Note: The current version of Db2® Warehouse uses Spark Release 2.3.0. Earlier versions of Db2 Warehouse used earlier Spark releases, and an application that was written for one of those releases might not run correctly with the current release. For information about the differences between Spark releases and possible rework requirements, refer to the Spark online documentation; each release has its own documentation page with sections that describe removals, behavior changes, and deprecations relative to the previous release.
Note: To run application code written in R, Db2 Warehouse requires the RJSONIO package. If this package is not already installed in your R environment, ask your Db2 Warehouse administrator to issue the following command from within the interactive R shell:
install.packages('RJSONIO')
To launch a Spark application, use any of the methods described in Testing a Spark application.
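
For example, if your environment exposes the Db2 Warehouse analytics REST API, a programmatic launch might look like the following minimal sketch. The host, endpoint path, credentials, and the request and response field names used here are illustrative assumptions only; the launch methods and exact request format supported by your release are described in Testing a Spark application.

# Minimal sketch of a programmatic launch over HTTPS.
# The endpoint path and the payload/response field names below are
# assumptions for illustration; see Testing a Spark application for
# the launch methods that your Db2 Warehouse release supports.
import requests

host = "https://<db2-warehouse-host>:8443"                      # placeholder host
endpoint = host + "/dashdb-api/analytics/public/apps/submit"    # assumed path

payload = {
    "appResource": "myapp.py",   # file under $HOME/spark/apps (field name assumed)
    "args": [],                  # application arguments, if any
}

# Placeholder credentials; certificate handling may need adjusting
# for your environment (for example, a self-signed certificate).
response = requests.post(endpoint, json=payload, auth=("<user>", "<password>"))
response.raise_for_status()

# The response is assumed to contain the submission ID that you need
# later to locate the application's log files.
result = response.json()
submission_id = result.get("submissionId")   # field name assumed
print("Submission ID:", submission_id)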

After you launch an application, note the submission ID that is returned, because you will need it later to locate the corresponding log files.
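
A calling program can record the submission ID immediately after the launch so that the log files can be located later. In the following sketch, the registry file and the per-submission log directory under $HOME/spark/log are assumptions for illustration only; the actual log location for your release is described in the logging documentation.

# Record the submission ID returned by the launch call so that the
# corresponding log files can be found later. The registry file and the
# log directory layout shown here are assumptions for illustration only.
from pathlib import Path

def remember_submission(submission_id: str,
                        registry: Path = Path.home() / "spark" / "submissions.txt") -> None:
    """Append the submission ID to a simple registry file."""
    registry.parent.mkdir(parents=True, exist_ok=True)
    with registry.open("a") as f:
        f.write(submission_id + "\n")

def assumed_log_dir(submission_id: str) -> Path:
    """Hypothetical per-submission log directory under $HOME/spark/log."""
    return Path.home() / "spark" / "log" / submission_id

submission_id = "<submission-id-from-launch>"   # value returned by the launch call
remember_submission(submission_id)
print("Expected log directory:", assumed_log_dir(submission_id))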

If your application depends on auxiliary libraries, you must satisfy those dependencies as described in Managing dependencies.