I'm trying to write a simple piece of Scala code that queries Hive data located on a remote cluster. The code will be deployed to clusterA but has to query a Hive table that lives on clusterB. I'm developing this locally in Eclipse and getting the following error:
org.apache.spark.sql.AnalysisException: Table not found: `<mydatabase>`.`<mytable>`;
The relevant part of my code is below:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf()
  .setAppName("Xing")
  .setMaster("local[*]")
// Point Spark at the Hive metastore on clusterB
conf.set("hive.metastore.uris", "thrift://<clusterB url>:10000")

val sc = SparkContext.getOrCreate(conf)
val hc = new HiveContext(sc)
val df = hc.sql("select * from <mydatabase>.<mytable>")
I suspect it is a configuration issue, but I may be wrong. Any advice would be greatly appreciated.
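For reference, the alternative I have in mind is to set the metastore URI directly on the HiveContext instead of on the SparkConf. This is only a sketch of what I'd try, and the port is an assumption on my part: 9083 is Hive's default metastore thrift port, whereas 10000 is normally the HiveServer2 JDBC port, so my original URI may even be pointing at the wrong service.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("Xing").setMaster("local[*]")
val sc = SparkContext.getOrCreate(conf)
val hc = new HiveContext(sc)

// Assumption: the metastore service on clusterB listens on the default
// thrift port 9083 (10000 is typically HiveServer2, not the metastore).
hc.setConf("hive.metastore.uris", "thrift://<clusterB url>:9083")

val df = hc.sql("select * from <mydatabase>.<mytable>")
df.show()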