
I can't run my Spark app because of java.lang.NoClassDefFoundError: org/postgresql/Driver

I did the same as in "How can I connect to a postgreSQL database into Apache Spark using scala?", but when I try to start my app I get this exception.

Exception in thread "main" java.lang.NoClassDefFoundError: org/postgresql/Driver
    at SparkRecommendationMatrix.<init>(SparkRecommendationMatrix.scala:31)
    at Main$.main(Main.scala:26)
    at Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.postgresql.Driver
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 10 more

My build.sbt file:

name := "untitled12"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.postgresql" % "postgresql" % "9.2-1003-jdbc4",
  "org.apache.spark" % "spark-mllib_2.10" % "1.0.0"
)

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

And my Scala code:

import org.apache.spark.{SparkConf, SparkContext}

val classes = Seq(
  getClass,                        // To get the jar with our own code.
  classOf[org.postgresql.Driver]   // To get the connector.
)

val jars = classes.map(_.getProtectionDomain().getCodeSource().getLocation().getPath())

// set up environment
val conf = new SparkConf().setAppName(name).setJars(jars)
//.setMaster("spark://192.168.10.122:7077")
val sc = new SparkContext(conf)
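
A quick way to check whether the driver class is visible at all is a Class.forName lookup before building the SparkConf; a minimal sketch:

// Sanity check (sketch): fail fast with a clear message if the JDBC driver
// is missing from the driver-side classpath
try {
  Class.forName("org.postgresql.Driver")
} catch {
  case e: ClassNotFoundException =>
    sys.error("PostgreSQL JDBC driver not on the classpath: " + e.getMessage)
}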
– Goko Gorgiovski
  • It's quite obvious: you apparently do **not** have the Postgres JDBC driver available to your Spark application/installation. Jens' answer is absolutely correct. –  Oct 06 '14 at 12:09
  • Maybe this is caused by the fact that the current 9.2 driver has the build number `1004`, not `1003`? –  Oct 06 '14 at 12:11
  • I'm having the same issue. Did you ever figure out a way to fix this? – poorman Apr 07 '15 at 22:43

1 Answer


I ran into a similar problem. I solved it by passing postgresql.jar as a parameter to spark-submit:

spark-submit --class <<class>> --jars /full_path_to/postgresql.jar My_File.jar
– Anand
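
If the class is also needed on the driver side (for example when the JDBC connection is opened in the driver program), it may additionally have to be put on the driver's classpath; a sketch with placeholder paths:

spark-submit --class <<class>> \
  --jars /full_path_to/postgresql.jar \
  --driver-class-path /full_path_to/postgresql.jar \
  My_File.jar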