
I added this to <my_project_name>/project/plugins.sbt:

resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")

in order to import sbt-spark-package, but sbt tells me "Extracting structure failed: Build status: Error".

I tried with other plugins, but the behavior is always the same.

sbt version: 1.8.2

scala version: 2.13.10

edited by Dmytro Mitin
asked by AT181903

1 Answer


See the ticket "dl.bintray.com/spark-packages/maven is forbidden": https://github.com/databricks/sbt-spark-package/issues/50

Bintray has been sunset, so the resolver URL in your plugins.sbt is no longer reachable.

The plugin's page at https://spark-packages.org/package/databricks/sbt-spark-package says:

"This package doesn't have any releases published in the Spark Packages repo, or with maven coordinates supplied. You may have to build this package from source, or it may simply be a script."

So build it from source:

git clone https://github.com/databricks/sbt-spark-package.git
cd sbt-spark-package
git reset --hard v0.2.6
sbt package

Now you can find a JAR at sbt-spark-package/target/scala-2.10/sbt-0.13/sbt-spark-package-0.2.6.jar.

Then do

sbt publishLocal

and the plugin will be published at ~/.ivy2/local/org.spark-packages/sbt-spark-package/scala_2.10/sbt_0.13/0.2.6/jars/sbt-spark-package.jar.

Now you can use this sbt plugin in your project:

build.sbt

lazy val root = (project in file("."))
  .settings(
    name := "scalademo",
    scalaVersion := "2.11.12"
  )

project/build.properties

sbt.version = 0.13.18

project/plugins.sbt

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")

Please note that sbt-spark-package is a plugin for sbt 0.13.x, not sbt 1.x.
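For completeness, once the plugin resolves under sbt 0.13, a build would typically use the settings it contributes. This is only a sketch: the `spName`, `sparkVersion` and `sparkComponents` keys come from the sbt-spark-package README, and the organization, name and version values below are placeholders, not anything from your project:

```scala
// build.sbt (sbt 0.13.x) — hypothetical project using sbt-spark-package
lazy val root = (project in file("."))
  .settings(
    name := "scalademo",
    scalaVersion := "2.11.12",
    // Settings provided by the sbt-spark-package plugin:
    spName := "myorg/scalademo",   // "organization/name" of the Spark package (placeholder)
    sparkVersion := "2.4.8",       // Spark version to build against (placeholder)
    sparkComponents ++= Seq("sql") // adds e.g. spark-sql as a provided dependency
  )
```

With these settings, tasks such as `sbt spPackage` bundle the project in the Spark-packages layout instead of plain `sbt package`.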

See "Support SBT 1.x": https://github.com/databricks/sbt-spark-package/issues/40

In order to use the plugin with sbt 1.8.2 and Scala 2.13.10, you'd have to upgrade it yourself.

Moreover, sbt-spark-package seems to be outdated, abandoned and deprecated:

"java.lang.NoSuchMethodError: sbt.UpdateConfiguration.copy$default$1()Lscala/Option": https://github.com/databricks/sbt-spark-package/issues/51

"Is this plugin deprecated?": https://github.com/databricks/sbt-spark-package/issues/48

answered by Dmytro Mitin
  • Done, but there is always the error "Extracting structure failed: Build status: Error" – AT181903 Mar 21 '23 at 11:03
  • @AT181903 Did you do `sbt publishLocal` (for `sbt-spark-package`, not for your project)? Can you see the JAR at `~/.ivy2/local/org.spark-packages/sbt-spark-package/scala_2.10/sbt_0.13/0.2.6/jars/sbt-spark-package.jar`? If you still experience issues please show your full `build.sbt`, `project/build.properties`, `project/plugins.sbt` etc. and full sbt output. Why do you need `sbt-spark-package`? It's outdated. – Dmytro Mitin Mar 21 '23 at 14:46
  • 1
    I needed it in order to use this: https://github.com/saurfang/spark-knn But now I tried to import all the code in my project and it works correctly. I didn't use sbt-spark-package – AT181903 Mar 25 '23 at 16:55