On Spark 1.6.2 (Scala 2.10.5) the following code worked just fine in the shell:
import org.apache.spark.mllib.linalg.Vector
case class DataPoint(vid: String, label: Double, features: Vector)
The mllib Vector correctly shadowed Scala's built-in Vector (scala.collection.immutable.Vector).
However, on Spark 2.0 (Scala 2.11.8) the same code throws the following error in the shell:
<console>:11: error: type Vector takes type parameters
case class DataPoint(vid: String, label: Double, features: Vector)
To make it work, I now have to use the fully qualified name:
case class DataPoint(vid: String, label: Double,
features: org.apache.spark.mllib.linalg.Vector)
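Another workaround I found is to rename the type on import so there is no clash at all (a sketch; MLVector is just an arbitrary alias name I picked for this example):

```scala
// Rename the mllib Vector at import time so it cannot collide with
// scala.collection.immutable.Vector in the shell's scope.
// MLVector is an arbitrary alias chosen for illustration.
import org.apache.spark.mllib.linalg.{Vector => MLVector}

case class DataPoint(vid: String, label: Double, features: MLVector)
```

This avoids repeating the fully qualified name at every use site, but I'd still like to understand the underlying change.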
Can someone please tell me what changed, and whether Spark or Scala is at fault here? Thanks!