I have a Spark cluster installed on an Ubuntu server with 3 worker nodes (using Docker Compose).
You can see all the cluster information, and the web UI is accessible from my PC, as shown below:
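Roughly, the compose setup looks like this (a hypothetical sketch, not my exact file; the image name, tag, and environment variables are assumptions based on the Bitnami Spark image):

```yaml
# Hypothetical docker-compose.yml sketch; values are illustrative.
version: "3"
services:
  nodemaster:
    image: bitnami/spark:2.4.1      # assumed image/tag
    environment:
      - SPARK_MODE=master
    ports:
      - "7077:7077"                 # master RPC port used by spark://nodemaster:7077
      - "8080:8080"                 # web UI, reachable from my PC
  worker:
    image: bitnami/spark:2.4.1
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://nodemaster:7077
```

The worker service is scaled to 3 containers with `docker-compose up --scale worker=3`.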
Now I want to submit a job from a Java driver, so it would be something like this:
SparkSession spark = SparkSession.builder()
        .master("spark://nodemaster:7077")
        .appName("MongoSparkConnectorIntro")
        .config("spark.network.timeout", 2000000)
        .config("spark.driver.port", "32772")
        .config("spark.driver.host", "172.31.64.69")
        .config("spark.driver.bindAddress", "172.18.0.2")
        .config("spark.mongodb.input.uri", "mongodb://")
        .config("spark.mongodb.output.uri", "mongodb://")
        .getOrCreate();
When I run this from IntelliJ, creating the session fails with the following log:
org.apache.spark.SparkContext: Running Spark version 2.4.1
org.apache.spark.SparkContext: Submitted application: MongoSparkConnectorIntro
org.apache.spark.SecurityManager: Changing view acls to: $USER
org.apache.spark.SecurityManager: Changing modify acls to: $USER
org.apache.spark.SecurityManager: Changing view acls groups to:
org.apache.spark.SecurityManager: Changing modify acls groups to:
org.apache.spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set($USER); groups with view permissions: Set(); users with modify permissions: Set($USER); groups with modify permissions: Set()
org.apache.spark.util.Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
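The failing step seems to be the driver trying to open its listening socket on `spark.driver.bindAddress`. A minimal check I used to reproduce the bind behaviour outside Spark (the class and method names are mine, and `172.18.0.2` is the Docker bridge address from the config above, which does not exist as an interface on my Windows machine):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;

public class BindCheck {
    // Returns true if this machine can open a listening socket on host:port.
    // Spark's driver does essentially this for spark.driver.bindAddress.
    static boolean canBind(String host, int port) {
        try (ServerSocket socket = new ServerSocket()) {
            socket.bind(new InetSocketAddress(host, port));
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Loopback should always be bindable; the Docker bridge IP is not an
        // interface on my Windows machine, so binding it fails there, which
        // matches the "could not bind on a random free port" error above.
        System.out.println("loopback: " + canBind("127.0.0.1", 0));
        System.out.println("docker bridge IP: " + canBind("172.18.0.2", 32772));
    }
}
```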
I'm using the IntelliJ IDE on Windows 10. I also added my IP to /etc/hosts.
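For context, the entry I added looks roughly like this (illustrative; `<server-ip>` stands in for my Ubuntu server's actual address, so that `nodemaster` resolves for the driver):

```
# hosts entry (illustrative placeholder values)
<server-ip>   nodemaster
```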
Any help?