
I am building Spark using sbt. When I run the following command:

sbt/sbt assembly

it takes some time to build Spark. Several warnings appear, and at the end I get the following errors:

[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.

When I check the sbt version using the command sbt sbtVersion, I get the following result:

[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn]  * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn]  * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
.......
[info] streaming-zeromq/*:sbtVersion
[info]  0.13.7
[info] repl/*:sbtVersion
[info]  0.13.7
[info] spark/*:sbtVersion
[info]  0.13.7

When I run ./bin/spark-shell, I get the following output:

ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program.

What could the solution be?

Neel Shah

2 Answers


You must configure the SBT heap size (see the sketch after this list):

  • On Linux, run export SBT_OPTS="-Xmx2G" to set it temporarily.
  • On Linux, you can edit ~/.bash_profile and add the line export SBT_OPTS="-Xmx2G".
  • On Windows, run set JAVA_OPTS=-Xmx2G to set it temporarily.
  • On Windows, you can edit sbt\conf\sbtconfig.txt and set -Xmx2G.
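A minimal sketch of the temporary Linux approach, assuming a bash shell and that the build is launched with the sbt script from the Spark source directory (which, per this answer, picks up SBT_OPTS):

# give sbt a 2 GB heap for this shell session only
export SBT_OPTS="-Xmx2G"
# then rebuild the assembly
sbt/sbt assembly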

More info:

http://www.scala-sbt.org/0.13.1/docs/Getting-Started/Setup.html

How to set heap size for sbt?

mgosk
  • When I do this (in Windows) I get: ignoring option MaxPermSize=256m; support was removed in 8.0. Not sure what to do now? – cs0815 Feb 06 '17 at 11:40
  • PermSize is a different memory area. You can ignore this message; it is only a warning. – mgosk Jan 04 '21 at 11:20

This is probably not a common resolution, but in my case I had to run this command to resolve the OutOfMemoryError when building a Spark project with sbt (the path is specific to macOS):

rm -rf /Users/markus.braasch/Library/Caches/Coursier/v1/https/

Increasing the memory settings via various SBT_OPTS arguments did not solve it.
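
For reference, a sketch of the same cleanup with the cache path generalized; the directories below are the default Coursier cache locations and are an assumption, so verify where your cache actually lives before deleting anything:

# macOS: default Coursier cache location
rm -rf ~/Library/Caches/Coursier/v1/https/
# Linux: default Coursier cache location
rm -rf ~/.cache/coursier/v1/https/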

markus