The SparkContext is created as follows:
SparkConf sparkConf = new SparkConf().setAppName(args[0]);
SnappySession snappySes = new SnappySession(
    new SparkSession.Builder()
        .config("spark.snappydata.connection", "localhost:1527")
        .getOrCreate()
        .sparkContext());
The SnappyData table is read:
snappySes.table("SNAPPY_COL_TABLE").show(10);
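One way to see which catalog the session is actually talking to is to list the tables visible to it. A hypothetical debugging step against the `snappySes` created above (run inside the same job; the output depends on the cluster, so none is shown):

```java
// Debugging sketch: list every table this session can see.
// In smart connector mode SNAPPY_COL_TABLE should appear here; if it does
// not, the session is bound to a different catalog/store than expected.
snappySes.sql("show tables").show(100, false);
```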
The job is submitted as follows:
/usr/hdp/2.6.2.0-205/spark2/bin/spark-submit --conf snappydata.connection=localhost:1527 --conf spark.ui.port=0 --master local[*] --driver-memory 2g --jars --deploy-mode client --conf spark.driver.extraClassPath=/root/snappydata-1.0.1-bin/jars/* --conf spark.executor.extraClassPath=/root/snappydata-1.0.1-bin/jars/* --class myclass
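For reference, a cleaned-up sketch of the same command (not verified against this cluster): the SnappyData smart connector documentation spells the property `spark.snappydata.connection`, so the unprefixed `snappydata.connection` form above is worth double-checking; the `--jars` value is missing in the original and is left as a placeholder here.

```shell
# Sketch of the corrected submit; <app-jar> is a placeholder for the
# application jar that was elided in the original command.
/usr/hdp/2.6.2.0-205/spark2/bin/spark-submit \
  --master local[*] \
  --deploy-mode client \
  --driver-memory 2g \
  --conf spark.snappydata.connection=localhost:1527 \
  --conf spark.ui.port=0 \
  --conf spark.driver.extraClassPath=/root/snappydata-1.0.1-bin/jars/* \
  --conf spark.executor.extraClassPath=/root/snappydata-1.0.1-bin/jars/* \
  --class myclass \
  <app-jar>
```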
The job does connect to SnappyData; logs below:
Initializing SnappyData in cluster mode: Smart connector mode: sc = org.apache.spark.SparkContext@164d01ba, url = jdbc:snappydata://localhost[1527]/
But it fails with "table not found". The connector appears to be pointing at a different store: a different set of tables is listed.
If the same job is submitted with SnappyData's spark-submit, it works as expected. The only change is the spark-submit binary used:

/usr/hdp/2.6.2.0-205/spark2/bin/spark-submit -- fails
/root/snappydata-1.0.1-bin/bin/spark-submit -- passes
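A plausible explanation (an assumption, not confirmed from the logs above) is that the two launchers resolve different Spark installations: the HDP spark-submit picks up HDP's conf/ directory and jars, while the SnappyData one picks up its own, so the same code ends up bound to a different catalog. Spark's `SPARK_PRINT_LAUNCH_COMMAND` switch prints the final java command line, including the resolved classpath and conf directory, which makes the two launchers easy to diff:

```shell
# Print the fully resolved launch command (classpath, conf dir, jars) for
# each spark-submit; capture and diff the two outputs.
SPARK_PRINT_LAUNCH_COMMAND=1 /usr/hdp/2.6.2.0-205/spark2/bin/spark-submit --version
SPARK_PRINT_LAUNCH_COMMAND=1 /root/snappydata-1.0.1-bin/bin/spark-submit --version
```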