
I am using the spark-solr client found here: https://github.com/lucidworks/spark-solr

I am using the sbt-assembly plugin (https://github.com/sbt/sbt-assembly) to package my fat JAR, following the instructions from this post: How to build an Uber JAR (Fat JAR) using SBT within IntelliJ IDEA?
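
For completeness, sbt-assembly is usually wired in through a one-line file under project/; a minimal sketch (the plugin version here is an assumption, pick whichever matches your sbt release):

// project/assembly.sbt -- registers the sbt-assembly plugin for this build
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")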

My build.sbt file is:

name := "SolrSpark"

version := "1.0"

scalaVersion := "2.10.4"


libraryDependencies ++= Seq(
  "com.lucidworks.spark" % "spark-solr" % "2.0.0"
)

mergeStrategy in assembly <<= (mergeStrategy in assembly) { old => {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x => MergeStrategy.first
  }
}
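
As an aside, newer sbt-assembly releases express the same idea with the assemblyMergeStrategy key instead of the deprecated <<= form; a minimal sketch, assuming sbt-assembly 0.12 or later (the reference.conf case is my addition, since Akka keeps defaults such as akka.version in per-module reference.conf files that need to be concatenated rather than overwritten):

assemblyMergeStrategy in assembly := {
  // concatenate Akka's per-module reference.conf files so no default keys are lost
  case "reference.conf" => MergeStrategy.concat
  // drop signatures and manifests that would otherwise collide
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  // for everything else, keep the first copy encountered
  case x => MergeStrategy.first
}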

Then when I perform

sbt assembly

My jar gets packaged without error, but when I try to run it with

java -jar SolrSpark-assembly-1.0.jar 

I get the error

 [main] ERROR SparkContext  - Error initializing SparkContext.
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
    at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:124)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:145)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:151)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:159)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:164)
    at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:206)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:169)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:505)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1988)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1979)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:266)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
    at com.xendo.solr.SolrSparkWordCount$delayedInit$body.apply(SolrSparkWordCount.scala:15)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
    at scala.App$class.main(App.scala:71)
    at com.xendo.solr.SolrSparkWordCount$.main(SolrSparkWordCount.scala:7)
    at com.xendo.solr.SolrSparkWordCount.main(SolrSparkWordCount.scala)

which occurs at the line where I define the SparkContext:

  val sc = new SparkContext(conf)
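
For context, a minimal sketch of how this line typically sits in a Spark 1.x driver; the app name and master here are assumptions (spark-submit can supply the master instead):

import org.apache.spark.{SparkConf, SparkContext}

object SolrSparkWordCount extends App {
  // the master is hard-coded only for local testing; prefer passing it via spark-submit
  val conf = new SparkConf().setAppName("SolrSparkWordCount").setMaster("local[*]")
  val sc = new SparkContext(conf)
}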

Does anyone know what is causing this error?


1 Answer


Launch your application using the spark-submit script, as this will take care of setting up the classpath with Spark and its required dependencies.

Below are the launch command arguments for Spark 1.6.x; for further details, see the Spark documentation.

./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --deploy-mode <deploy-mode> \
  --conf <key>=<value> \
  ... # other options
  <application-jar> \
  [application-arguments]
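
Applied to the jar from the question (the main class comes from the stack trace; the local[*] master is an assumption, replace it with your cluster's URL):

./bin/spark-submit \
  --class com.xendo.solr.SolrSparkWordCount \
  --master "local[*]" \
  SolrSpark-assembly-1.0.jar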