
I am trying to install Spark on a new MacBook. I can't run spark-shell and get the following error:

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.

Exception in thread "main" java.lang.NullPointerException
at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896)
at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request.headerPreamble(IMain.scala:895)
at scala.tools.nsc.interpreter.IMain$Request$Wrapper.preamble(IMain.scala:918)
at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1337)
at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1336)
at scala.tools.nsc.util.package$.stringFromWriter(package.scala:64)
at scala.tools.nsc.interpreter.IMain$CodeAssembler$class.apply(IMain.scala:1336)
at scala.tools.nsc.interpreter.IMain$Request$Wrapper.apply(IMain.scala:908)
at scala.tools.nsc.interpreter.IMain$Request.compile$lzycompute(IMain.scala:1002)
at scala.tools.nsc.interpreter.IMain$Request.compile(IMain.scala:997)
at scala.tools.nsc.interpreter.IMain.compile(IMain.scala:579)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:68)
at org.apache.spark.repl.Main$.main(Main.scala:51)
at org.apache.spark.repl.Main.main(Main.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
zero323

3 Answers


First, install Java 8 (you can keep Java 9 if that's what you already have).

Then, in your .bash_profile, set JAVA_HOME as follows:

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)  
export PATH=$JAVA_HOME/bin:$PATH

Finally, add this:

export SPARK_LOCAL_IP="127.0.0.1"
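The root cause is that the Scala REPL bundled with this Spark release cannot run on Java 9, so the point of the steps above is to make sure Java 8 is the version actually picked up. As a quick sanity check, a small hypothetical shell helper (not part of Spark or macOS) can extract the major version from a `java -version`-style string:

```shell
# get_java_major: extract the major Java version from a version string
# such as "1.8.0_151" (pre-9 scheme) or "9.0.1" (post-9 scheme).
# Hypothetical helper for illustration only.
get_java_major() {
  case "$1" in
    1.*) printf '%s\n' "${1#1.}" | cut -d. -f1 ;;  # "1.8.0_151" -> 8
    *)   printf '%s\n' "$1" | cut -d. -f1 ;;       # "9.0.1"     -> 9
  esac
}
```

For example, `get_java_major 9.0.1` prints `9`, which is the situation that triggers the error above; after the exports take effect it should print `8` for the string reported by `java -version`.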

Hope this helps. There is a nice trick for alternating between different Java versions in the answers to Mac OS X and multiple Java versions; check out the answer by @Vegard.

For reference, this is my setup:

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)  
export PATH=$JAVA_HOME/bin:$PATH

export SCALA_HOME=/path/to/your/scala  
export PATH=$PATH:$SCALA_HOME/bin

export SPARK_HOME=/path/to/your/spark  
export PATH="$SPARK_HOME/bin:$PATH"  
export SPARK_LOCAL_IP="127.0.0.1"
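Building on the version-switching trick linked above, one common pattern is to define an alias per JDK in .bash_profile. This is a sketch that assumes both JDKs are installed and visible to /usr/libexec/java_home:

```shell
# Hypothetical .bash_profile aliases for switching the active JDK,
# in the spirit of the linked answer; both JDKs must already be installed.
alias java8='export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)'
alias java9='export JAVA_HOME=$(/usr/libexec/java_home -v 9)'
```

Running `java8` in a shell then makes that session use the Java 8 JDK, which is what spark-shell needs here.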
nv.snow

I get the same error when trying to run spark-shell with Java 9.

Please try installing Java 8 and running spark-shell with JAVA_HOME set to /Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home
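If you would rather not change your shell profile, JAVA_HOME can also be overridden for a single invocation. The exact JDK directory name depends on which Java 8 update is installed:

```shell
# One-off run: point spark-shell at the Java 8 JDK for this command only.
# Adjust the jdk1.8.0*.jdk directory name to match your installed update.
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0.jdk/Contents/Home spark-shell
```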

For more information about JDK 9 support in Scala, see scala-dev issue #139.

Harald Gliebe

In my case it was the Homebrew formula that overrode JAVA_HOME just for spark-submit and the other Spark-related commands.

Check with brew info apache-spark and look at the content of the formula. In mine, javasdk@11 was hard-coded.

So I had to edit this Ruby file to use the current $JAVA_HOME, then run brew reinstall apache-spark. After that the error was gone, and the Spark commands use the Java version I have currently selected with jenv.
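For completeness, the jenv workflow mentioned above looks roughly like this (a sketch, assuming jenv is installed and a Java 8 JDK has already been added to it):

```shell
# List the JDKs jenv knows about, then make Java 8 the global default.
jenv versions
jenv global 1.8
```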

ANDgineer