
I am trying to follow the Spark quick-start tutorial but get the following error:

https://spark.apache.org/docs/latest/quick-start.html

"name 'spark' is not defined"

Using Python version 2.6.6 (r266:84292, Nov 22 2013 12:16:22)
SparkContext available as sc.
>>> import pyspark
>>> textFile = spark.read.text("README.md")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'spark' is not defined

This is how I start the shell:

./bin/pyspark --master local[*]
Alper t. Turker
user1050619

1 Answer

If your Spark version is 1.0.1, you should not use the tutorial for version 2.2.0; there are major changes between these versions.

On the same website you can find the tutorial for version 1.6.0.

Following the 1.6.0 tutorial, you have to use textFile = sc.textFile("README.md") instead of textFile = spark.read.text("README.md").
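For reference, the corresponding 1.6.0 quick-start steps in the PySpark shell would look like this (a sketch; the actual values returned depend on your copy of README.md, so no outputs are shown):

>>> textFile = sc.textFile("README.md")  # sc is the SparkContext the shell creates for you
>>> textFile.count()                     # number of lines in the file
>>> textFile.first()                     # first line of the file

The spark variable (a SparkSession) only exists in shells from Spark 2.0 onwards, which is why the 2.2.0 tutorial's spark.read.text call fails with a NameError in an older shell that defines only sc.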

Timo Strotmann
  • 371
  • 2
  • 14