
I'm trying to load the Teradata JDBC jar in Spark but can't get it to load. I start spark-shell like this:

spark-shell --jars ~/*.jar --driver-class-path ~/*.jar

Among those jars I have one called terajdbc4.jar.

When the Spark shell starts, I do this:

scala> sc.addJar("terajdbc4.jar")
15/12/07 12:27:55 INFO SparkContext: Added JAR terajdbc4.jar at http://1.2.4.4:41601/jars/terajdbc4.jar with timestamp 1449509275187

scala> sc.jars
res1: Seq[String] = List(file:/home/user1/spark-cassandra-connector_2.10-1.0.0-beta1.jar)

scala> 

but it's not there in the jars list. Why is it still missing?

EDIT:

OK, I got the jar to load, but now I'm getting this error:

java.lang.ClassNotFoundException: com.teradata.jdbc.TeraDriver

I do the following:

scala> sc.jars
res4: Seq[String] = List(file:/home/user/terajdbc4.jar)

scala> import com.teradata.jdbc.TeraDriver
import com.teradata.jdbc.TeraDriver

scala> Class.forName("com.teradata.jdbc.TeraDriver")
res5: Class[_] = class com.teradata.jdbc.TeraDriver

and then this:

val jdbcDF = sqlContext.load("jdbc", Map(
  "url" -> "jdbc:teradata://dbinstn, TMODE=TERA, user=user1, password=pass1",
  "dbtable" -> "db1a.table1a",
  "driver" -> "com.teradata.jdbc.TeraDriver"))

and then I get this:

java.lang.ClassNotFoundException: com.teradata.jdbc.TeraDriver
lightweight

1 Answer

spark-shell --jars ~/*.jar --driver-class-path ~/*.jar

Please refer to Using wildcards in java classpath.
Wildcards like `*.jar` are not supported by `--jars`, which expects a comma-separated list of jar files. Specify each jar file path explicitly instead.
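To see why the glob fails, here is a small shell sketch (the `/tmp/jars_demo` directory and jar names are made up for illustration): the shell expands `~/*.jar` into several space-separated words before spark-shell ever sees it, whereas `--jars` expects a single comma-separated argument, so only the first match ends up where you intended.

```shell
# Hypothetical demo directory, not your real jars.
rm -rf /tmp/jars_demo
mkdir -p /tmp/jars_demo
touch /tmp/jars_demo/a.jar /tmp/jars_demo/b.jar

# The shell expands the glob into separate words...
set -- /tmp/jars_demo/*.jar
echo "words after expansion: $#"   # 2 separate words, not 1 argument

# ...but --jars wants ONE comma-separated list, so join them:
jars=$(printf '%s,' /tmp/jars_demo/*.jar)
jars=${jars%,}                     # drop the trailing comma
echo "$jars"                       # /tmp/jars_demo/a.jar,/tmp/jars_demo/b.jar
```

With that in hand, the invocation becomes something like `spark-shell --jars /home/user1/terajdbc4.jar --driver-class-path /home/user1/terajdbc4.jar` (the `/home/user1` path is an assumption; use the jar's actual location). Note that `--driver-class-path` takes a colon-separated classpath, not a comma-separated list.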

Shawn Guo