Run a Spark shell on LSF
Use the bsub -I command to run the Spark shell as an interactive job.
Procedure
- Use the bsub -I command to submit the lsf-spark-shell.sh connector script to LSF as an interactive job:
  bsub -I bsub_options lsf-spark-shell.sh
  For example:
  bsub -I -m "hostA! others" -R "span[ptile=4]" -n 8 lsf-spark-shell.sh
  This command launches the Spark shell, in which you can specify the Spark environment and run Spark commands.
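The options in the example command can be unpacked as follows. This is a sketch (assuming bash) that only assembles and prints the command; hostA and the slot counts are the placeholder values from the example, not required settings:

```shell
# Sketch: assemble the example bsub command piece by piece (bash syntax).
# "hostA", ptile=4, and -n 8 are the placeholder values from the example above.
opts=""
opts+=' -I'                      # run as an interactive job, attached to the terminal
opts+=' -m "hostA! others"'      # the "!" marks hostA as the first execution host
opts+=' -R "span[ptile=4]"'      # place at most 4 job slots on each host
opts+=' -n 8'                    # request 8 job slots in total
echo "bsub${opts} lsf-spark-shell.sh"
```

With these options, LSF starts the Spark shell on hostA and spreads the remaining slots across other hosts, four per host.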
- To view the job in the Spark GUI, navigate to the URL of the first execution host in a web browser:
  http://first_execution_hostname:port_number
  where first_execution_hostname is the host name of the first execution host and port_number is the port that is set by the SPARK_MASTER_PORT environment variable in the lsf-spark-shell.sh connector script.
For example, http://hostA:7077
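As a sketch of how that URL is formed, the snippet below substitutes the placeholder values from the example; in practice, SPARK_MASTER_PORT is set in the lsf-spark-shell.sh connector script rather than assigned by hand:

```shell
# Build the Spark GUI URL from the first execution host and the master port.
# "hostA" and 7077 are the placeholder values from the example; normally
# SPARK_MASTER_PORT comes from the lsf-spark-shell.sh connector script.
first_execution_hostname="hostA"
SPARK_MASTER_PORT=7077
url="http://${first_execution_hostname}:${SPARK_MASTER_PORT}"
echo "${url}"
```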