
When I run HiveRead.java from the IntelliJ IDE, it runs successfully and returns results. Then I built a jar file (it's a Maven project) and tried to run the jar, which gave me

ClassLoaderResolver for class "" gave error on creation : {1}

Then I looked at SO answers and found I had to add the datanucleus jars, so I did something like this:

java -jar /home/saurab/sparkProjects/spark_hive/target/myJar-jar-with-dependencies.jar --jars jars/datanucleus-api-jdo-3.2.6.jar,jars/datanucleus-core-3.2.10.jar,jars/datanucleus-rdbms-3.2.9.jar,/home/saurab/hadoopec/hive/lib/mysql-connector-java-5.1.38.jar

Then I got this error:

org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified to use a ClassLoaderResolver of name "datanucleus" yet this has not been found by the DataNucleus plugin mechanism. Please check your CLASSPATH and plugin specification.

Somewhere I found that I should use spark-submit instead, so I did this:

./bin/spark-submit --class HiveRead --master yarn  --jars jars/datanucleus-api-jdo-3.2.6.jar,jars/datanucleus-core-3.2.10.jar,jars/datanucleus-rdbms-3.2.9.jar,/home/saurab/hadoopec/hive/lib/mysql-connector-java-5.1.38.jar --files /home/saurab/hadoopec/spark/conf/hive-site.xml /home/saurab/sparkProjects/spark_hive/target/myJar-jar-with-dependencies.jar

Now I get a new type of error:

Table or view not found: `bigmart`.`o_sales`; 

HELP ME !! :)

I have copied my hive-site.xml to /spark/conf and started the Hive metastore service (hive --service metastore).
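
For context, the property in hive-site.xml that points Spark at a running metastore is hive.metastore.uris; a minimal sketch of that entry, assuming the metastore listens on the default localhost:9083 (the host and port here are assumptions, not my verified setup):

<configuration>
  <property>
    <!-- Points Spark/Hive clients at the remote metastore service
         instead of an embedded local one. -->
    <name>hive.metastore.uris</name>
    <value>thrift://localhost:9083</value>
  </property>
</configuration>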

Here is the HiveRead.java code if anyone is interested.
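
A minimal sketch of what a Spark-on-Hive reader like this typically looks like; the query against bigmart.o_sales is inferred from the error above, and the app name and structure are assumptions, not the actual source:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HiveRead {
    public static void main(String[] args) {
        // enableHiveSupport() makes the session use the Hive metastore
        // from hive-site.xml instead of a throwaway local catalog.
        SparkSession spark = SparkSession.builder()
                .appName("HiveRead")          // assumed app name
                .enableHiveSupport()
                .getOrCreate();

        // Table name taken from the error message above.
        Dataset<Row> sales = spark.sql("SELECT * FROM bigmart.o_sales");
        sales.show();

        spark.stop();
    }
}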

Saurab

1 Answer


The Spark session is not picking up your Hive metastore configuration, so it cannot find the bigmart.o_sales table.

Provide the hive-site.xml file path to the spark-submit command, as below.

For Hortonworks, the file path is /usr/hdp/current/spark2-client/conf/hive-site.xml.

Pass it as --files /usr/hdp/current/spark2-client/conf/hive-site.xml in the spark-submit command.
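
For example, reusing the class and jar from the question (the jar path is the questioner's; adjust it and the hive-site.xml location for your installation):

./bin/spark-submit --class HiveRead --master yarn --files /usr/hdp/current/spark2-client/conf/hive-site.xml /home/saurab/sparkProjects/spark_hive/target/myJar-jar-with-dependencies.jar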

Sandeep Khot
  • Welcome to stack overflow. Please consider using the code notation for your code in the answer above. – sao Feb 15 '20 at 17:44