I have installed Spark for PySpark using the method described in this link:
http://nishutayaltech.blogspot.in/2015/04/how-to-run-apache-spark-on-windows7-in.html
Now I am starting the pyspark shell and trying to use the "sc" variable, but I am getting the error below.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
I then tried creating the SparkContext myself:
from pyspark import SparkContext
SparkContext.setSystemProperty('spark.executor.memory', '2g')
sc = SparkContext("local", "App Name")
The error I am getting is:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "D:\BIGDATA\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 115, in __init__
SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
File "D:\BIGDATA\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 272, in _ensure_initialized
callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at D:\BIGDATA\spark-2.1.0-bin-hadoop2.7\bin\..\python\pyspark\shell.py:43
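From the traceback it looks like the pyspark shell already created a SparkContext via getOrCreate, so I am guessing I should reuse that one instead of constructing a new one. Something like the sketch below is what I had in mind (I am not sure whether the memory setting would even apply to a context that is already running):

from pyspark import SparkContext, SparkConf

# Reuse the SparkContext the shell already created (only one can exist at a time)
conf = SparkConf().setAppName("App Name").set("spark.executor.memory", "2g")
sc = SparkContext.getOrCreate(conf)
print(sc.master, sc.appName)

Is reusing the existing context like this the right approach, or does the original NameError mean something is wrong with my installation?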