I want to deploy Spark Job Server (in a Docker container) on a different host from the Spark Master. However, the server_start.sh script seems to assume that it is being run on the same machine as the Spark Master, e.g.:
if [ -z "$SPARK_CONF_DIR" ]; then
  SPARK_CONF_DIR=$SPARK_HOME/conf
fi
# Pull in other env vars in spark config, such as MESOS_NATIVE_LIBRARY
. $SPARK_CONF_DIR/spark-env.sh
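For context, the only workaround I can think of is a sketch like the following (untested; /opt/spark, the remote-master.conf name and the spark-master-host URL are just placeholders of mine): ship a client-only Spark distribution inside the container with a minimal spark-env.sh so the sourcing above succeeds, and point the job server at the remote Master via its own config file rather than via anything in spark-env.sh:

# Inside the job server container: a local Spark client install, no Master running here
export SPARK_HOME=/opt/spark
export SPARK_CONF_DIR=$SPARK_HOME/conf

# Minimal spark-env.sh -- just enough for server_start.sh to source without errors
cat > "$SPARK_CONF_DIR/spark-env.sh" <<'EOF'
# intentionally minimal; the remote Master is configured in the job server config
EOF

# Then set the Master URL in the job server's HOCON config (e.g. remote-master.conf):
#   spark {
#     master = "spark://spark-master-host:7077"
#   }

But I'd rather not maintain a dummy Spark install and config just to satisfy the script, which is why I'm asking whether the script supports this scenario directly.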
Under the Architecture section, the documentation says:
The job server is intended to be run as one or more independent processes, separate from the Spark cluster (though it very well may be colocated with say the Master).
Does anyone know how the server_start.sh script can be made to work as-is with a Spark Master hosted on a different machine from the Spark Job Server?