Customizing the Apache Spark directory structure
IBM® Z Platform for Apache Spark installs Apache Spark into a z/OS® file system (zFS) or hierarchical file system (HFS) directory. This documentation refers to the installation directory as SPARK_HOME. The default installation directory is /usr/lpp/IBM/zspark/spark/sparknnn, where nnn is the current Apache Spark version (for instance, /usr/lpp/IBM/zspark/spark/spark32x for Spark 3.2.0, or /usr/lpp/IBM/zspark/spark/spark35x for Spark 3.5.0).
By default, Apache Spark runs from the installation directory, and most of its configuration files, log files, and working information are stored in the installation directory structure. On z/OS systems, however, using the installation directory for all of these purposes is not ideal, and Apache Spark is therefore installed in a read-only file system by default.

The following tasks describe how to set up customized directories for the Apache Spark configuration files, log files, and temporary work files. Although you can customize the directory structure that Apache Spark uses, the examples here follow the Filesystem Hierarchy Standard.
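The separation described above can be expressed through Apache Spark's standard environment variables. A minimal sketch, assuming example FHS-style paths (/etc/spark/conf, /var/spark/logs, /tmp/spark/work) that are illustrative choices, not product defaults:

```shell
# Illustrative settings that direct Spark's configuration, logs, and
# scratch space away from the read-only installation directory.
# All paths below are example FHS-style locations, not product defaults.
export SPARK_CONF_DIR=/etc/spark/conf     # configuration files
export SPARK_LOG_DIR=/var/spark/logs      # log files
export SPARK_LOCAL_DIRS=/tmp/spark/work   # temporary work files
```

Settings such as these are typically placed in a profile or in spark-env.sh so that every Spark process picks them up consistently.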
Plan to work with your system programmer, who has the authority to update system directories.
For example, to point your environment at a Spark 3.5.0 installation in the default directory:

export SPARK_HOME=/usr/lpp/IBM/zspark/spark/spark35x
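With SPARK_HOME set, one common pattern is to seed a writable configuration directory from the .template files shipped under $SPARK_HOME/conf. A minimal sketch, assuming the target directory already exists (created by your system programmer) and using SPARK_CONF_DIR with an illustrative default of /etc/spark/conf:

```shell
# Illustrative sketch: seed a custom, writable configuration directory
# from the template files shipped with the Spark installation.
# The /etc/spark/conf fallback is an assumed example, not a product default.
CONF_DIR=${SPARK_CONF_DIR:-/etc/spark/conf}

for t in "$SPARK_HOME"/conf/*.template; do
  [ -e "$t" ] || continue                      # no templates: nothing to do
  dest="$CONF_DIR/$(basename "$t" .template)"  # drop the .template suffix
  [ -e "$dest" ] || cp "$t" "$dest"            # never clobber existing files
done
```

Copying only when the destination file is absent keeps site-specific edits in the custom directory safe across product upgrades.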