
I am getting started with Apache Spark (on Windows 10), and things are not running as expected. In IntelliJ IDEA Ultimate 2019.1 everything compiles cleanly (no red underlines), but when I try to create a new SparkContext with

val sc = new SparkContext("local[*]", "RatingsCounter")
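For context, the whole program is only a few lines; this is a minimal sketch of it (the object name `RatingsCounter` comes from the course exercise I'm following, and nothing else runs before the SparkContext is created):

```scala
import org.apache.spark.SparkContext

// Minimal sketch of the surrounding program.
object RatingsCounter {
  def main(args: Array[String]): Unit = {
    // "local[*]" = run Spark locally with one worker thread per core
    val sc = new SparkContext("local[*]", "RatingsCounter")
    println(sc.version) // never reached -- the constructor throws
    sc.stop()
  }
}
```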

I get this error at runtime:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
    at org.apache.spark.util.Utils$.stringToSeq(Utils.scala:2664)

Versions are as follows:
Scala 2.12.8
Java 1.8.0_171
Spark 2.4.1
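For completeness, my build.sbt is along these lines (a sketch from memory, not copied verbatim from my actual file; versions match the ones listed above):

```scala
// build.sbt (sketch of my setup)
scalaVersion := "2.12.8"

// %% appends the Scala binary version suffix (_2.12) to the artifact name
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.1"
```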

Googling turns up some related answers, but none seem to match my issue.

Vinny Gray
