
I downloaded spark-2.2, but when I run it, it throws an error: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.internal.config.package$. The Hadoop libraries are not the cause.

My OS is Linux, I am using JDK 8, and the logs follow:

17/08/07 06:13:29 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:236)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.GatewayConnection.run(GatewayConnection.java:214)
java.lang.Thread.run(Thread.java:748)

Traceback (most recent call last):
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/shell.py", line 54, in <module>
    spark = SparkSession.builder.getOrCreate()
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/sql/session.py", line 169, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/context.py", line 334, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/context.py", line 180, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/pyspark/context.py", line 273, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in __call__
  File "/usr/home/soft/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.internal.config.package$
	at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:546)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:373)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:236)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:214)
	at java.lang.Thread.run(Thread.java:748)
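A "Could not initialize class" NoClassDefFoundError generally means a static initializer for that class already failed, which is often a JVM/JDK mismatch rather than a missing jar. To rule out a JDK mix-up, this is how the Java version visible to Spark's launcher can be checked (a minimal sketch; JAVA_HOME may well be unset on your machine):

```shell
# Spark's bin/spark-class prefers $JAVA_HOME/bin/java when JAVA_HOME is set,
# and falls back to whatever `java` resolves to on the PATH.
echo "JAVA_HOME=${JAVA_HOME:-unset}"
if command -v java >/dev/null 2>&1; then
    # First line of the version banner, e.g. java version "1.8.0_144"
    java -version 2>&1 | head -n 1
else
    echo "no java on PATH"
fi
```

If JAVA_HOME points at one JDK and the PATH at another, the JVM Spark actually starts may not be the one expected.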

Mr.Huang

0 Answers