I'm fairly new to the Scala ecosystem. I'm getting a deduplicate error while trying to assemble a Scala Spark job that uses the DataStax Spark Cassandra connector. I'd appreciate any advice on how to resolve this.
My System:
- Latest Scala (2.11.7) installed via brew
- Latest Spark (2.10.5) installed via brew
- Latest SBT (0.13.9) installed via brew
- SBT Assembly plugin installed
My build.sbt:
name := "spark-test"
version := "0.0.1"
scalaVersion := "2.11.7"
// additional libraries
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-M3"
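For completeness, the assembly plugin mentioned above is pulled in through project/plugins.sbt along these lines (I'm not certain of the exact release I have, so the version below is just indicative):

// project/plugins.sbt -- declares sbt-assembly; 0.14.1 is indicative, not necessarily my exact release
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")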
Console:
$ sbt assembly
...
[error] 353 errors were encountered during merge
java.lang.RuntimeException: deduplicate: different file contents found in the following:
/Users/bob/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.29.Final.jar:META-INF/io.netty.versions.properties
...
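From the sbt-assembly docs it looks like the usual fix for META-INF collisions like this is a custom merge strategy in build.sbt. The sketch below (assuming sbt-assembly 0.14.x's assemblyMergeStrategy key) is roughly what I have in mind, but I'm not sure whether picking "first" for these files is actually safe:

// Sketch of a merge strategy to append to build.sbt (sbt-assembly 0.14.x syntax assumed).
// Picks one copy of the conflicting netty metadata file and falls back to the
// default strategy for everything else.
assemblyMergeStrategy in assembly := {
  case "META-INF/io.netty.versions.properties" => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

I'm also not sure case-by-case rules like this scale to 353 conflicts, which is partly why I'm asking whether there's a better way to set up the build.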