I am at the beginner stage of learning Spark and have just started coding in Python with PySpark. While going through some basic code in a Jupyter notebook I got an error. Spark is installed on my PC and is in working condition. My first problem is that when I enter "pyspark" in my Ubuntu terminal, it goes straight to the Jupyter web UI instead of the interactive shell, and I don't know why.
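While looking into this, I found that Jupyter usually takes over when the PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS environment variables are set (for example in ~/.bashrc). I'm not sure that's my case, but here is a minimal check I can run from a notebook cell to see what they are set to:

import os

# If PYSPARK_DRIVER_PYTHON is "jupyter" and PYSPARK_DRIVER_PYTHON_OPTS is
# "notebook", the pyspark launcher starts Jupyter instead of the plain shell.
for var in ("PYSPARK_DRIVER_PYTHON", "PYSPARK_DRIVER_PYTHON_OPTS"):
    print(var, "=", os.environ.get(var))

If that turns out to be the cause, I assume unsetting those variables would bring the interactive shell back.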
Second, when I run the following code I get an error:
from pyspark import SparkContext, SparkConf

conf = SparkConf().setAppName('appName').setMaster('local')
sc = SparkContext(conf=conf)  # this line raises the error below

data = range(10)
dist_data = sc.parallelize(data)  # distribute the data as an RDD
print(dist_data.reduce(lambda a, b: a + b))
The error from the above code is:
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by <module> at /home/trojan/.local/lib/python3.6/site-packages/IPython/utils/py3compat.py:186
What does this mean? Please tell me what is causing the error. Sorry about the error image; I couldn't paste it clearly, so I pasted a screenshot of the error instead. I hope it works!
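From what I could gather, the message seems to say that the notebook kernel already created a SparkContext (the app=PySparkShell one), and only one context is allowed per process. Here is a sketch of what I plan to try instead, assuming that reusing the existing context is the right fix; SparkContext.getOrCreate returns the running context if there is one:

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName('appName').setMaster('local')

# Reuse the context the PySpark shell already created rather than
# constructing a second one (the conf is ignored if a context exists).
sc = SparkContext.getOrCreate(conf=conf)

dist_data = sc.parallelize(range(10))
print(dist_data.reduce(lambda a, b: a + b))  # expected output: 45

Alternatively, I guess I could call sc.stop() on the existing context before creating a new one, but reusing it looks simpler.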