Customizing z/OS IzODA Livy
Before you can use z/OS IzODA Livy, you must customize your environment for it and its dependent products.
Configuring the z/OS IzODA Livy working directories
Complete this task to configure z/OS IzODA Livy to run using your desired configurations. The following table lists some of the working directories that z/OS IzODA Livy uses:
| Directory contents | Default location | Environment variable | Suggested new directory |
|---|---|---|---|
| Log files | $LIVY_HOME/logs | $LIVY_LOG_DIR | Under /var, such as /var/livy/logs |
| Configuration files | $LIVY_HOME/conf | $LIVY_CONF_DIR | Under /var, such as /var/livy/conf |
| PID files | /tmp | $LIVY_PID_DIR | Under /tmp, such as /tmp/livy/pid |
- Follow your file system conventions and create new working directories for z/OS IzODA Livy.
- Ensure the Livy server user ID (created in z/OS IzODA Livy Installation and Customization) has read/write access to the newly created working directories.
- Ensure that these directories, especially $LIVY_LOG_DIR, are cleaned regularly.
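The steps above can be sketched as a short script. This is illustrative only: the paths here use a scratch prefix under /tmp so the commands are safe to try (the comments note the suggested production locations from the table), and LIVYSRV is a placeholder for the Livy server user ID created during installation and customization.

```shell
#!/bin/sh
# Sketch: create the Livy working directories and make them accessible
# to the Livy server user ID. Adjust paths to your file system
# conventions; suggested production locations are noted per line.
LIVY_LOG_DIR=/tmp/livy/logs    # suggested production value: /var/livy/logs
LIVY_CONF_DIR=/tmp/livy/conf   # suggested production value: /var/livy/conf
LIVY_PID_DIR=/tmp/livy/pid

mkdir -p "$LIVY_LOG_DIR" "$LIVY_CONF_DIR" "$LIVY_PID_DIR"

# Transfer ownership to the Livy server user ID (LIVYSRV is a
# placeholder; uncomment once the ID exists and you run as a superuser):
# chown -R LIVYSRV /tmp/livy

# Grant the owner read/write/search access:
chmod -R 750 /tmp/livy
```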
Creating the Livy configuration files from template
This task creates the livy.conf, livy-env.sh, and log4j.properties configuration files in the Livy server user ID's $LIVY_CONF_DIR directory.
cp $LIVY_HOME/conf/livy-env.sh.template $LIVY_CONF_DIR/livy-env.sh
cp $LIVY_HOME/conf/livy.conf.template $LIVY_CONF_DIR/livy.conf
cp $LIVY_HOME/conf/log4j.properties.template $LIVY_CONF_DIR/log4j.properties
Updating the Livy configuration files from template
You will need to configure the Livy server by using the livy.conf and livy-env.sh configuration files located in the $LIVY_CONF_DIR directory.
- The following parameters should be set in the livy.conf configuration file:
  - Set livy.spark.master to the master URL of your Spark cluster. For example:
    livy.spark.master = spark://master:7077
  - Set livy.spark.deploy-mode to the deploy mode that your Spark cluster accepts.
    Note: When livy.spark.deploy-mode is set to cluster, Livy session mode submissions are no longer available. In addition, Livy batch job submissions of PySpark applications are not supported. Both are available only when livy.spark.deploy-mode is set to client. See Table 2 for more information.
    Table 2. Supported environments per deploy mode
    | Environment | Client | Cluster |
    |---|---|---|
    | Spark session (Scala) | Supported | Not supported |
    | PySpark session (Python) | Supported | Not supported |
    | Spark batch job | Supported | Supported |
    | PySpark batch job | Supported | Not supported |
    livy.spark.deploy-mode = client
  - Set livy.file.local-dir-whitelist to the directories that contain your Spark application files, separated by commas:
    livy.file.local-dir-whitelist = /usr/lpp/IBM/izoda/spark/spark23x/examples/jars/, path/to/directory2containing/jar-files
  - You can also modify the host IP address and the Livy server port by setting livy.server.host and livy.server.port:
    livy.server.host = localhost
    livy.server.port = 8998
    If you have changed the SPARK_LOCAL_IP environment variable, you will likely need to change livy.server.host as well. By default, the Livy server uses port 8998.
  - You can also modify the session timeout value by setting the livy.server.session.timeout configuration. The following suffixes are supported: "us" (microseconds), "ms" (milliseconds), "s" (seconds), "m" or "min" (minutes), "h" (hours), and "d" (days):
    livy.server.session.timeout-check = true
    livy.server.session.timeout = 1h
    If livy.server.session.timeout-check is set to false, livy.server.session.timeout is ignored.
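Putting the settings above together, a completed $LIVY_CONF_DIR/livy.conf could look like the following sketch. The master URL and whitelist directory are the illustrative values from the examples above; substitute your own.

```
livy.spark.master = spark://master:7077
livy.spark.deploy-mode = client
livy.file.local-dir-whitelist = /usr/lpp/IBM/izoda/spark/spark23x/examples/jars/
livy.server.host = localhost
livy.server.port = 8998
livy.server.session.timeout-check = true
livy.server.session.timeout = 1h
```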
- Update $LIVY_CONF_DIR/livy-env.sh with the environment variables that point to the working directories created during the Configuring the z/OS IzODA Livy working directories task:
LIVY_LOG_DIR=/var/livy/logs
LIVY_PID_DIR=/tmp/livy/pid
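After the configuration files are in place and the Livy server is started, its REST API can be used to confirm the settings. The sketch below only composes a POST /batches request body against the default host and port shown earlier; the jar file name is a placeholder, and the actual curl submission is left commented out because it requires a running Livy server.

```shell
#!/bin/sh
# Compose a Livy POST /batches request body. The application jar must
# reside under a directory listed in livy.file.local-dir-whitelist;
# the jar file name below is a placeholder, not a shipped artifact.
LIVY_URL="http://localhost:8998"   # livy.server.host and livy.server.port
APP_JAR="/usr/lpp/IBM/izoda/spark/spark23x/examples/jars/spark-examples.jar"
PAYLOAD="{\"file\": \"$APP_JAR\", \"className\": \"org.apache.spark.examples.SparkPi\"}"
echo "$PAYLOAD"

# With the Livy server running, submit the batch job:
# curl -s -X POST -H "Content-Type: application/json" \
#      -d "$PAYLOAD" "$LIVY_URL/batches"
```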