
I have a Spark project which I usually package with sbt-assembly. All Spark dependencies are marked provided and are not included in my fat jar. I want another command that builds a really fat jar with all dependencies included, Spark among them. I am trying the following with no luck:

lazy val standalone = project
  .dependsOn(mainProj % "compile->compile;test->test;provided->compile")
  .settings(
    logLevel in assembly := Level.Debug,
    assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = true, includeDependency = true)
  )

Note, the answers to How to add "provided" dependencies back to run/test tasks' classpath? explain how to add provided dependencies to the runtime classpath, however my question is about how to have them end up in the packaged artifact after executing sbt assembly.
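For context, a minimal sketch of the kind of build definition the question describes, assuming a hypothetical build.sbt with illustrative artifact names and version numbers:

```scala
// Hypothetical build.sbt excerpt. Spark artifacts are scoped Provided,
// so the default `sbt assembly` leaves them out of the fat jar.
// Versions are illustrative, not taken from the question.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0" % Provided,
  "org.apache.spark" %% "spark-sql"  % "2.4.0" % Provided
)
```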

Mario Galic
Kal-ko

1 Answer


To build a truly fat jar that packages everything, including provided dependencies, we could redefine fullClasspath in assembly like so:

assembly / fullClasspath := (Compile / fullClasspath).value

If we put this in a separate command like so

commands += Command.command("assemblyTrulyFatJar") { state =>
  """set assembly / fullClasspath := (Compile / fullClasspath).value""" :: "assembly" :: state
}
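Putting the pieces together, a hedged sketch of where this command might live in a build.sbt (the project layout and the Provided Spark dependency are illustrative assumptions, not taken verbatim from the answer):

```scala
// Sketch of a build.sbt combining the two ideas above.
// `Compile / fullClasspath` includes Provided-scoped dependencies,
// so overriding `assembly / fullClasspath` with it pulls them into the jar.
lazy val root = (project in file("."))
  .settings(
    // Illustrative Provided dependency that `sbt assembly` would normally skip
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0" % Provided,
    // Custom command: apply the override, then run assembly with it in effect
    commands += Command.command("assemblyTrulyFatJar") { state =>
      """set assembly / fullClasspath := (Compile / fullClasspath).value""" ::
        "assembly" :: state
    }
  )
```

Because the `set` is applied by the command itself, the override only takes effect for that invocation; a plain `sbt assembly` still sees the original classpath.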

then executing sbt assemblyTrulyFatJar should package everything, while sbt assembly keeps its default behaviour.

Mario Galic
  • I tried this solution, but it only allows me to run the app in the IDE, which is not what I want. Instead I want to be able to have a command similar to "sbt assembly" which will package everything, including provided deps – Kal-ko Mar 20 '19 at 13:24
  • Apologies for misunderstanding the question. Please see edited answer which consists of defining `assembly / fullClasspath := (Compile / fullClasspath).value`. I will also ask the moderators to hopefully get the `duplicated` mark removed. – Mario Galic Mar 20 '19 at 22:02