
I am running PySpark, but it can be unstable at times. A couple of times it has crashed at this command:

spark_conf = SparkConf()

with the following error message:

    File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/pyspark/conf.py", line 106, in __init__
        self._jconf = _jvm.SparkConf(loadDefaults)
    File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 772, in __getattr__
        raise Py4JError('{0} does not exist in the JVM'.format(name))
    Py4JError: SparkConf does not exist in the JVM

Any idea what the problem is? Thank you for your help!

Michael

1 Answer


`SparkConf` does not exist in the PySpark namespace until you import it. Try:

from pyspark import SparkConf

in the PySpark console or in your code before constructing the configuration object.
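For context, a minimal sketch of how the import and construction fit together (this assumes a local Spark installation is on your `PYTHONPATH`; the app name and master URL are placeholders):

```python
# Import SparkConf explicitly -- it is not in scope by default,
# which is what triggers the Py4JError in the question.
from pyspark import SparkConf, SparkContext

# Build the configuration, then hand it to the context.
conf = SparkConf().setAppName("example-app").setMaster("local[*]")
sc = SparkContext(conf=conf)

print(sc.version)  # sanity check that the JVM gateway is up
sc.stop()
```

Note that the underlying error message ("does not exist in the JVM") comes from py4j failing to resolve the class on the Java side; a missing Python-level import of `SparkConf` is one common way to hit it.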

Shawn Guo