5

Running basic df.show() post spark notebook installation

I am getting the following error when running Scala Spark code on spark-notebook. Any idea why this occurs and how to avoid it?

[org.apache.spark.repl.ExecutorClassLoader] Failed to check existence of class org.apache.spark.sql.catalyst.expressions.Object on REPL class server at spark://192.168.10.194:50935/classes
[org.apache.spark.util.Utils] Aborting task
[org.apache.spark.repl.ExecutorClassLoader] Failed to check existence of class org on REPL class server at spark://192.168.10.194:50935/classes
[org.apache.spark.util.Utils] Aborting task
[org.apache.spark.repl.ExecutorClassLoader] Failed to check existence of class
Leothorn
  • 1,345
  • 1
  • 23
  • 45

1 Answer

3

I installed Spark locally, and the following code gave me the same error.

spark.read.format("json").load("Downloads/test.json")

I think the issue was that it was trying to find a master node and falling back to some random or default IP. I specified local mode and set the driver host to 127.0.0.1, which resolved the issue.

Solution

Run Spark with a local master:

/usr/local/bin/spark-shell --master "local[4]" --conf spark.driver.host=127.0.0.1
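If you build the session yourself in a notebook cell instead of launching spark-shell, the same two settings can be applied in code. This is a minimal configuration sketch, assuming Spark is on the classpath and your notebook lets you construct your own SparkSession (the app name and file path are placeholders):

```scala
import org.apache.spark.sql.SparkSession

// Pin the master to local mode and the driver host to the loopback
// address, so executors do not try to reach the REPL class server
// on a random network interface.
val spark = SparkSession.builder()
  .master("local[4]")
  .config("spark.driver.host", "127.0.0.1")
  .appName("local-test") // placeholder name
  .getOrCreate()

// The read that previously failed should now work.
val df = spark.read.format("json").load("Downloads/test.json")
df.show()
```

This is equivalent to the spark-shell flags above: `.master(...)` matches `--master` and `.config("spark.driver.host", ...)` matches the `--conf` option.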
Gaurang Shah
  • 11,764
  • 9
  • 74
  • 137