I am building Spark using sbt. When I run the following command:
sbt/sbt assembly
it takes some time to build Spark. Several warnings appear, and at the end I get the following error:
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.
When I check the sbt version with the command sbt sbtVersion, I get the following result:
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn] * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
.......
[info] streaming-zeromq/*:sbtVersion
[info] 0.13.7
[info] repl/*:sbtVersion
[info] 0.13.7
[info] spark/*:sbtVersion
[info] 0.13.7
When I run the command ./bin/spark-shell, I get the following output:
ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program.
What could the solution be?
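My guess is that the sbt JVM is running out of heap during the assembly task, so the spark-shell failure is just a consequence of the build never producing the assembly jar. Would raising the heap before rebuilding be the right fix? Something like this is what I have in mind (the exact flag values here are my own guesses, not from any documentation):

```shell
# Give the sbt JVM more memory before re-running the build.
# -Xmx2g raises the max heap; the other two flags are commonly
# suggested for large Scala builds (values are guesses on my part).
export SBT_OPTS="-Xmx2g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m"

# Re-run the assembly with the larger heap.
sbt/sbt assembly
```

Or is there a different cause, e.g. the evicted-dependency warnings shown above?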