
Hi, my question refers to an old thread: Multi-version build with SBT.

This is exactly how the library dependencies are declared in my project as well:

libraryDependencies <++= (dispatchVersion) { (dv) => Seq(
  "net.databinder.dispatch" %% "dispatch-core" % dv,
  "net.databinder.dispatch" %% "dispatch-json4s-native" % dv
)}

but we have upgraded to sbt 1.0, which doesn't support this style of adding dependencies. I've tried importing the Scala file that contains the variable, e.g. "dispatchVersion", which is of type settingKey[Map[Symbol, String]] and holds all the latest version numbers, similar to what you've mentioned.

How do I migrate the libraryDependencies shown above to sbt 1.0.0 syntax? The error I'm getting is as below:

 error: No implicit for Append.Values[Seq[sbt.librarymanagement.ModuleID], sbt.Def.Initialize[Seq[sbt.librarymanagement.ModuleID]]] found,
      so sbt.Def.Initialize[Seq[sbt.librarymanagement.ModuleID]] cannot be appended to Seq[sbt.librarymanagement.ModuleID]
    libraryDependencies ++= dispatchVersion { v => Seq(
Sunil

1 Answer

libraryDependencies ++= Seq(
  "net.databinder.dispatch" %% "dispatch-core"          % dispatchVersion.value,
  "net.databinder.dispatch" %% "dispatch-json4s-native" % dispatchVersion.value,
)
Dale Wijnand
  • Thanks @Dale, the dispatchVersion was just an example. Actually I want to import version values for spark-core, spark-avro, and spark-sql in libraryDependencies; the version numbers are defined in the variable dispatchVersion (of type settingKey[Map[Symbol, String]]), for example dispatchVersion := Map('sparkCore -> "2.0.0", 'sparkAvro -> "1.8"), and so on. So what I'm trying to achieve is something like dispatchVersion.sparkCore and dispatchVersion.sparkSql in build.sbt. How do I do that? I hope the explanation was not confusing – Sunil Jan 22 '18 at 14:04
  • `dispatchVersion.value('sparkCore)`, `dispatchVersion.value('sparkAvro)`. – Dale Wijnand Jan 22 '18 at 18:15
  • I tried this yesterday and it worked; thought I'd update it today. Thanks a lot @Dale for the answer, this is what I was looking for. – Sunil Jan 23 '18 at 09:51
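Putting the comments together, a minimal build.sbt sketch for sbt 1.x might look like the following. The setting name, the Symbol keys, the organisation IDs, and the version numbers are illustrative, taken from the examples in the comments above, not from a real build:

```scala
// build.sbt (sbt 1.x) — illustrative sketch, not a verified build definition.
// A single settingKey holds a Map of version strings keyed by Symbol,
// and each dependency looks its version up via dispatchVersion.value(...).
val dispatchVersion = settingKey[Map[Symbol, String]]("Module version numbers keyed by Symbol")

dispatchVersion := Map(
  'sparkCore -> "2.0.0", // example versions from the comment thread
  'sparkAvro -> "1.8"
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % dispatchVersion.value('sparkCore),
  "com.databricks"   %% "spark-avro" % dispatchVersion.value('sparkAvro)
)
```

The key change from the sbt 0.13 `<++=` style is that under sbt 1.x a setting's value is read with `.value` inside another setting's body, so the whole `Seq` is built eagerly and appended with a plain `++=`.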