Creating the Apache Spark configuration directory

Complete this task to create a customized directory for the Apache Spark configuration files.

About this task

The default Apache Spark configuration directory is $SPARK_HOME/conf. In accordance with the Filesystem Hierarchy Standard (FHS), this task creates a new configuration directory under /etc. This task also ensures that the user ID that will run Apache Spark programs has read/write access to the new directory and sets the SPARK_CONF_DIR environment variable to point to the new directory.

Procedure

  1. Open an SSH or Telnet shell environment and create a new directory under /etc for the Apache Spark configuration files.
    For example, to create the /etc/spark/conf directory, enter the following command:
    mkdir -p /etc/spark/conf
  2. Give the user ID that runs Open Data Analytics for z/OS® read/write access to the new directory.
  3. Ensure that the SPARK_CONF_DIR environment variable points to the new directory.
    For example:
    export SPARK_CONF_DIR=/etc/spark/conf
    Note: You can set and export the SPARK_CONF_DIR environment variable, as in this example, in $HOME/.profile or /etc/profile, as determined in step 2 of Setting up a user ID for use with z/OS Spark.
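    Taken together, the steps above can be sketched as the following shell session. This is an illustrative sketch, not site-specific setup: it defaults to a scratch directory so it can be run without superuser authority, and it shows the current user as the owner. For the real procedure, use /etc/spark/conf and the user ID that runs Open Data Analytics for z/OS.

    ```shell
    # Sketch of steps 1-3. CONF_DIR defaults to a scratch path for
    # illustration; in production, set CONF_DIR=/etc/spark/conf.
    # SPARK_USER is shown as the current user; substitute the user ID
    # that runs Open Data Analytics for z/OS.
    CONF_DIR="${CONF_DIR:-/tmp/spark/conf}"
    SPARK_USER="$(id -un)"

    mkdir -p "$CONF_DIR"              # step 1: create the directory
    chown "$SPARK_USER" "$CONF_DIR"   # step 2: make the Spark user the owner...
    chmod u+rwx "$CONF_DIR"           # ...and grant that owner read/write access
    export SPARK_CONF_DIR="$CONF_DIR" # step 3: point Spark at the directory

    # To make the setting persist across logins, append the export to the
    # profile chosen in the note above, for example:
    #   echo "export SPARK_CONF_DIR=$CONF_DIR" >> "$HOME/.profile"
    ```

    You can verify the result with ls -ld "$SPARK_CONF_DIR", which should show the expected owner and a writable permission bit for that owner.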

Results

You now have a customized directory to hold the Apache Spark configuration files.

What to do next

Continue with Updating the Apache Spark configuration files.