Questions tagged [spark-packages]

8 questions
10 votes • 1 answer

Including a Spark Package JAR file in a SBT generated fat JAR

The spark-daria project is uploaded to Spark Packages and I'm accessing spark-daria code in another SBT project with the sbt-spark-package plugin. I can include spark-daria in the fat JAR file generated by sbt assembly with the following code in the…
Powers • 18,150
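A minimal build.sbt sketch of the setup this question describes, assuming the sbt-assembly and sbt-spark-package plugins are enabled in project/plugins.sbt; all names and version numbers below are illustrative, not taken from the asker's build:

```scala
// build.sbt — sketch only; coordinates and versions are illustrative.
spName := "myorg/my-app"        // Spark Packages name of this project
sparkVersion := "2.4.0"
sparkComponents += "sql"

// Pull spark-daria from the Spark Packages repository via the plugin.
spDependencies += "mrpowers/spark-daria:0.35.0-s_2.11"

// Spark itself is resolved as "provided" by the plugin, so it is excluded
// from the fat JAR, while spDependencies are bundled by `sbt assembly`.
```

The key distinction is that `spDependencies` entries end up inside the assembly, whereas the Spark distribution jars do not.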
5 votes • 3 answers

After installing sparknlp, cannot import sparknlp

The following ran successfully on a Cloudera CDSW cluster gateway. import pyspark from pyspark.sql import SparkSession spark = (SparkSession .builder .config("spark.jars.packages","JohnSnowLabs:spark-nlp:1.2.3") …
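A sketch of the setup described in the excerpt, assuming the coordinates quoted there (newer spark-nlp releases use Maven coordinates under com.johnsnowlabs.nlp instead); this requires a live Spark environment to run:

```python
# Sketch — the package coordinates come from the question; the app name is illustrative.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("spark-nlp-check")
         # Downloads the JVM package and its transitive deps at session start.
         .config("spark.jars.packages", "JohnSnowLabs:spark-nlp:1.2.3")
         .getOrCreate())

# Note: spark.jars.packages only ships the JVM side. The Python wrapper must
# also be installed in the driver's environment (e.g. `pip install spark-nlp`),
# which is a common reason `import sparknlp` fails even when the session starts.
import sparknlp
```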
2 votes • 1 answer

What is the cause of LIBRARY_MANAGEMENT_FAILED when trying to run a notebook with a custom library on Synapse?

Today when we tried running our notebooks defined in Synapse, we constantly received the error 'LIBRARY_MANAGEMENT_FAILED'. We are using the approach from:…
Kamil Banaszczyk • 1,133
1 vote • 1 answer

Unable to upload workspace packages and requirements.txt files on Azure Synapse Analytics Spark pool

When trying to import Python libraries at the Spark pool level by applying an uploaded requirements.txt file and custom packages, I get the following error with no other details: CreateOrUpdateSparkComputeFailed Error occured while processing the…
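For context, a sketch of the kind of requirements.txt a Synapse Spark pool expects; the package names and versions below are illustrative:

```text
# requirements.txt — sketch; entries are illustrative.
# Synapse expects one pip requirement specifier per line. Some forms that pip
# accepts locally (local paths, -e editable installs) are rejected at pool level,
# which can surface as an opaque CreateOrUpdateSparkComputeFailed error.
numpy==1.21.6
pandas==1.3.5
```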
0 votes • 0 answers

Efficient way to use Spark dependencies with the Spark operator

I'm trying to run a Spark application using the Spark operator. For my example I need some Spark packages, but every time I deploy I have to re-download those packages, which sometimes takes a long time. I want some efficient way so I don't…
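One common way to avoid re-resolving packages on every deploy is to bake the dependency JARs into the container image the operator runs. A hedged Dockerfile sketch, with an illustrative base image tag:

```dockerfile
# Sketch: pre-bake dependency JARs into the Spark image so the operator does
# not re-download spark.jars.packages on every deploy.
# The base image tag is illustrative; `jars/` holds JARs fetched once at build time.
FROM apache/spark:3.4.1

# Anything in /opt/spark/jars/ is on the classpath of driver and executors.
COPY jars/*.jar /opt/spark/jars/
```

With the JARs in the image, the SparkApplication spec no longer needs `spark.jars.packages` at all.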
0 votes • 1 answer

Unable to import plugin into a Scala project

I added this to /project/plugins.sbt: resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/" addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6") in order to import…
AT181903 • 11
0 votes • 1 answer

SPARK 2.0: spark-infotheoretic-feature-selection java.lang.NoSuchMethodError: breeze.linalg.DenseMatrix

I am trying to use the mRMR feature of the third-party InfoGain package for Spark (https://github.com/sramirez/spark-infotheoretic-feature-selection). But my cluster is on 2.0 and I am getting this exception, even though I added all the required JAR files to…
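A `NoSuchMethodError` on a breeze class usually means the breeze version on the cluster differs from the one the package was compiled against. A hedged build.sbt sketch of one workaround, pinning breeze in the application build; the version shown is illustrative:

```scala
// build.sbt — sketch; pin breeze so the application and the third-party
// package agree on one binary-compatible version. Shading breeze inside the
// fat JAR is the heavier alternative when pinning is not enough.
dependencyOverrides += "org.scalanlp" %% "breeze" % "0.13.2"
```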
0 votes • 1 answer

Spark: how to prefer classes from the packaged JAR

I am using the sbt assembly plugin to create a fat JAR. I need some JARs which are part of the default Hadoop/Spark distribution, but with newer versions. I want the Spark worker JVM to prefer the versions packaged in my fat JAR file and not the default hadoop/spark…
Shubham Jain • 392
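Spark exposes configuration flags for exactly this class-loading preference. A sketch of a spark-submit invocation using them; the main class and JAR name are illustrative, and both flags are marked experimental in the Spark configuration docs:

```shell
# Sketch: make the driver and executor JVMs load classes from the
# application's fat JAR before the cluster's default Hadoop/Spark jars.
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.Main \
  my-app-assembly.jar
```

If only a handful of classes conflict, shading the offending packages in the sbt-assembly configuration is often a safer alternative than flipping the whole class-path order.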