
I have a DataFrame with a schema, and I have also created a table in HBase with Phoenix. What I want is to save this DataFrame to HBase using Spark. I have tried the descriptions in the link below and ran spark-shell with the Phoenix plugin dependencies:

spark-shell --jars ./phoenix-spark-4.8.0-HBase-1.2.jar,./phoenix-4.8.0-HBase-1.2-client.jar,./spark-sql_2.11-2.0.1.jar 
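
For reference, this is roughly the write I expect to run once the plugin loads, following the phoenix_spark page linked below. OUTPUT_TABLE is just a placeholder name for a Phoenix table I would create beforehand, and as far as I understand the plugin only accepts the overwrite save mode:

// sketch of the save call, assuming the phoenix-spark data source loads correctly
df.write
  .format("org.apache.phoenix.spark")
  .mode("overwrite")                       // Phoenix-Spark only supports SaveMode.Overwrite
  .option("table", "OUTPUT_TABLE")         // placeholder Phoenix table name
  .option("zkUrl", hbaseConnectionString)  // same ZooKeeper quorum as for the read
  .save()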

However, I get an error even when I just run the read function:

val df = sqlContext.load("org.apache.phoenix.spark",
  Map("table" -> "INPUT_TABLE", "zkUrl" -> hbaseConnectionString))


java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame

I have a feeling that I am on the wrong track; I suspect the Phoenix 4.8.0 jars were built against Spark 1.x while I am running Spark 2.0, which would explain the missing org.apache.spark.sql.DataFrame class. So if there is another way of putting data generated in Spark into HBase, I would appreciate it if you shared it with me.

https://phoenix.apache.org/phoenix_spark.html
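
One alternative I am considering, in case the plugin cannot be made to work with Spark 2.0, is to upsert through the plain Phoenix JDBC driver from each partition. This is only a sketch under my own assumptions (a hypothetical schema INPUT_TABLE(ID BIGINT PRIMARY KEY, COL1 VARCHAR)), not something I have verified:

import java.sql.DriverManager

// write each partition through a Phoenix JDBC connection (sketch, assumed schema)
df.foreachPartition { rows =>
  val conn = DriverManager.getConnection(s"jdbc:phoenix:$hbaseConnectionString")
  val stmt = conn.prepareStatement("UPSERT INTO INPUT_TABLE (ID, COL1) VALUES (?, ?)")
  try {
    rows.foreach { row =>
      stmt.setLong(1, row.getLong(0))     // assumes column 0 is the BIGINT id
      stmt.setString(2, row.getString(1)) // assumes column 1 is a VARCHAR
      stmt.executeUpdate()
    }
    conn.commit()                          // Phoenix connections do not autocommit by default
  } finally {
    stmt.close()
    conn.close()
  }
}

Would something like this be a reasonable way to go, or is there a better-supported path?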

Saygın Doğu
  • http://apache-phoenix-user-list.1124778.n5.nabble.com/phenix-spark-Plugin-not-working-for-spark-2-0-td2590.html I've found this just now. It is almost a new issue, I think. – Saygın Doğu Nov 01 '16 at 14:18
  • Looks like I got the same issue: http://stackoverflow.com/questions/41494873/spark-connecting-to-phoenix-nosuchmethod-exception – Explorer Jan 06 '17 at 19:41

0 Answers