
I am fairly new to Spark. I want to connect PySpark to Oracle SQL, and I am using the following PySpark code:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext, Row
import os

spark_config = SparkConf().setMaster("local").setAppName("Project_SQL")
sc = SparkContext(conf = spark_config)
sqlctx = SQLContext(sc)

os.environ['SPARK_CLASSPATH'] = "C:\Program Files (x86)\Oracle\SQL Developer 4.0.1\jdbc\lib.jdbc6.jar"


df = sqlctx.read.format("jdbc").options(url="jdbc:oracle:thin:@<>:<>:<>"
                                   , driver = "oracle.ojdbc6.jar.OracleDriver"
                                   , dbtable = "account"
                                   , user="...."
                                   , password="...").load()

But I get the following error:

An error occurred while calling o29.load.:
java.lang.ClassNotFoundException: oracle.ojdbc6.jar.OracleDriver

I searched a lot and tried several ways that I found to change/correct the path to the driver, but I still get the same error.

Could anyone help me with this please?

Berke

2 Answers

oracle.ojdbc6.jar.OracleDriver is not a valid driver class name for the Oracle JDBC driver. The name of the driver is oracle.jdbc.driver.OracleDriver. Just make sure that the jar-file of the Oracle driver is on the classpath.
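
For example, a minimal sketch of the corrected setup, where the jar path and the connection details (host, port, SID, credentials) are placeholders to replace with your own:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

# The jar location is a placeholder; point spark.jars at your actual ojdbc jar.
spark_config = (SparkConf()
                .setMaster("local")
                .setAppName("Project_SQL")
                .set("spark.jars", "C:/path/to/ojdbc6.jar"))
sc = SparkContext(conf=spark_config)
sqlctx = SQLContext(sc)

# <host>, <port> and <SID> are placeholders for your database connection details.
df = (sqlctx.read.format("jdbc")
      .options(url="jdbc:oracle:thin:@<host>:<port>:<SID>",
               driver="oracle.jdbc.driver.OracleDriver",  # correct driver class name
               dbtable="account",
               user="....",
               password="...")
      .load())

Note that setting SPARK_CLASSPATH via os.environ after the SparkContext has been created has no effect; the jar has to be on the classpath when the context starts.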

Mark Rotteveel
    It is saying: An error occurred while calling o29.load.: java.sql.SQLRecoverableException: IO Error: The Network Adapter could not establish the connection. – Berke Jun 22 '17 at 10:21
  • Do you know what the reason could be? – Berke Jun 22 '17 at 10:25
  • @zahrarabiei https://stackoverflow.com/questions/12574414/io-error-the-network-adapter-could-not-establish-the-connection ; https://stackoverflow.com/questions/18037440/the-network-adapter-could-not-establish-the-connection-oracle-11g (and others); as you seem to have anonymised the JDBC url in your question, I can't answer further. If these links don't help, try a google search first, otherwise post a new question. – Mark Rotteveel Jun 22 '17 at 12:01

Try placing the Oracle JDBC connectivity jar in the jars folder under Spark.
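
For example, a small sketch of copying the driver into that folder; both paths below are placeholders for your own setup:

import os
import shutil

# Placeholder paths; adjust SPARK_HOME and the ojdbc jar location to your installation.
spark_home = os.environ.get("SPARK_HOME", r"C:\spark")
ojdbc_jar = r"C:\path\to\ojdbc6.jar"

# Copy the Oracle JDBC driver into Spark's jars directory so Spark picks it up automatically.
shutil.copy(ojdbc_jar, os.path.join(spark_home, "jars"))

After restarting the application, the driver class should be resolvable without extra classpath configuration.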

Vickyster