0

When I try to run spark-shell, I get the following error:

/root/apache-spark/spark-2.3.0-bin-without-hadoop/bin/spark-class: line 71: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.51-1.b16.el7_1.x86_64/bin/java: No such file or directory

I have exported the Java path: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64/jre/bin/java

The above error is about the Java path. I think I need to export the Java classpath as well. What is the default classpath for Java openjdk version "1.8.0_232" on CentOS 7?

Priyanka
  • 101
  • 1
  • 9

2 Answers

1

First, you can run `java -version` on the command line to see the JDK version of the current system. Second, you can run `which java` to see the JDK path. If it is not your JDK, you need to change the JDK environment variable.
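As a minimal sketch, the two checks look like this (the exact output depends on what is installed on your system):

```shell
# show the JDK version of whatever `java` is first on the PATH
java -version 2>&1 | head -n 1

# show which java binary the shell would actually run
command -v java || echo "java is not on the PATH"
```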

Ensure that JAVA_HOME="/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64" is set in the configuration file "spark-env.sh" (note: JAVA_HOME should point at the JDK root, not at the bin directory).

There seems to be a problem with your Java path. I hope this helps.
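A sketch of the relevant lines in conf/spark-env.sh, assuming the OpenJDK path from the question (adjust the path to match your installation):

```shell
# conf/spark-env.sh
# JAVA_HOME must be the JDK root directory, not .../bin and not the java binary itself
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64
export PATH="$JAVA_HOME/bin:$PATH"
```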

keven
  • 46
  • 1
0

You can use the command below to find the Java home on CentOS:

dirname $(dirname $(readlink -f $(which javac)))

Also, I think you haven't set JAVA_HOME properly, because Spark is using some other Java path than the one you specified. You can try setting JAVA_HOME in spark-env.sh; refer to the Environment Variables section in the doc.
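To see why that one-liner yields the JDK root, here is a self-contained sketch that simulates the CentOS alternatives-style symlink using temporary directories (the mock paths are illustrative, not a real JDK install):

```shell
# build a mock JDK layout: <tmp>/java-1.8.0-openjdk-.../bin/javac
jdk="$(mktemp -d)/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64"
mkdir -p "$jdk/bin"
printf '#!/bin/sh\n' > "$jdk/bin/javac"
chmod +x "$jdk/bin/javac"

# CentOS normally points /usr/bin/javac at the real JDK through symlinks;
# simulate that indirection with a symlink in another temp directory
link_dir="$(mktemp -d)"
ln -s "$jdk/bin/javac" "$link_dir/javac"

# readlink -f resolves the symlink chain to the real binary, then the two
# dirname calls strip /javac and /bin, leaving the JDK root (i.e. JAVA_HOME)
resolved="$(readlink -f "$link_dir/javac")"
java_home="$(dirname "$(dirname "$resolved")")"
echo "$java_home"
```

On a real system you would replace `$link_dir/javac` with `$(which javac)`, which is exactly the command in the answer.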

wypul
  • 807
  • 6
  • 9
  • The output of `update-alternatives --config java` is `*+ 1 java-1.8.0-openjdk.x86_64 (/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64/jre/bin/java)`, so I have done the following: `export JAVA_HOME="/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64/jre/bin/java"`. Please correct me if I am wrong. And could you please tell me what path has to be exported as the classpath? – Priyanka Dec 03 '19 at 09:15
  • You should set `/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64` as `JAVA_HOME`, and you don't have to pass the Java classpath separately. – wypul Dec 03 '19 at 09:25
  • Thank you. I set JAVA_HOME to /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64 and tried to run spark-shell again; now I am getting an error: `Error: A JNI error has occurred, please check your installation and try again Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger at java.lang.Class.getDeclaredMethods0(Native Method) at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)` – Priyanka Dec 04 '19 at 03:32
  • @Priyanka https://stackoverflow.com/questions/42307471/spark-without-hadoop-failed-to-launch – wypul Dec 04 '19 at 04:54
  • @Priyanka can you please accept the answer if it worked for you, since it answered your original question. – wypul Dec 05 '19 at 07:18