Starting the Spark cluster

About this task

If you have completed all of the tasks to enable client authentication for Apache Spark, and you have already completed the rest of your overall configuration of z/OS® Spark, you can start your Spark cluster as usual.

However, if you have just configured client authentication as part of your initial overall configuration of z/OS Spark, skip this procedure and do not start the Spark cluster until you have completed all of the remaining configuration tasks for z/OS Spark, as directed in "What to do next."

Procedure

After you complete all of the tasks to enable client authentication and configure z/OS Spark, start the Spark cluster and run your Spark applications as usual. A worker or driver that cannot be authenticated fails to connect to the master port.
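
For reference, a typical start sequence for a Spark standalone cluster is sketched below. The host name and port are placeholders; substitute the values from your own z/OS Spark configuration. Note that the worker start script is named start-slave.sh in Spark 2.x and start-worker.sh in Spark 3.x.

   # Start the master; it listens on the configured master port (7077 by default)
   $SPARK_HOME/sbin/start-master.sh

   # Start a worker and point it at the master URL from your configuration
   $SPARK_HOME/sbin/start-slave.sh spark://<master-host>:7077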

With z/OS Spark client authentication enabled, an application that is submitted to the master port has its executors started under the user ID of that application. An application that is submitted to the REST port, which is the port for cluster deploy mode, is considered part of the Spark cluster and therefore runs both its driver and its executors under the user ID of the Spark cluster.
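
The following sketch contrasts the two submission paths. The application class and JAR name are hypothetical, and the ports shown are the Spark standalone defaults (7077 for the master port, 6066 for the REST port); use the values from your own configuration.

   # Submit to the master port: executors run under the user ID of the submitting application
   spark-submit --master spark://<master-host>:7077 \
     --class com.example.MyApp myapp.jar

   # Submit to the REST port (cluster deploy mode): the driver and executors
   # both run under the user ID of the Spark cluster
   spark-submit --master spark://<master-host>:6066 --deploy-mode cluster \
     --class com.example.MyApp myapp.jar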

What to do next

If you have not yet completed your initial configuration of z/OS Spark, continue your configuration activities with "Configuring IBM Java."