I have a task to use Spark as a cache in my webapp. I went through the sample Java code in the Spark documentation and was able to run it standalone.
But when I initialize it inside my webapp (in a servlet's init(); I also tried initializing it in the Spring context), it fails either way.
I am using Apache Spark 1.1.0 with the pre-built package for Hadoop 2.4 (spark-assembly-1.1.0-hadoop2.4.0.jar).
My init():

    // fields: private JavaSparkContext sparkContext; private Map<String, Object> cacheMap;
    @Override
    public void init() {
        System.out.println("BaseService initialized");
        SparkConf conf = new SparkConf().setAppName("Spark").setMaster("local[*]");
        sparkContext = new JavaSparkContext(conf);
        cacheMap = new HashMap<>();
    }
Error (when Tomcat is used as the server):

    WEB-INF\lib\spark-assembly-1.1.0-hadoop2.4.0.jar) - jar not loaded. See Servlet Spec 2.3, section 9.7.2. Offending class: javax/servlet/Servlet.class
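From what I understand, Tomcat refuses to load any jar under WEB-INF/lib that bundles the servlet API (that is what Servlet Spec 2.3, section 9.7.2 forbids), and the Spark assembly jar appears to contain javax/servlet/Servlet.class. A small JDK-only sketch I used to confirm that (JarChecker and bundlesServletApi are hypothetical names I made up for this check):

```java
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

// Hypothetical helper: reports whether a jar bundles the servlet API class
// that makes Tomcat skip the whole jar per Servlet Spec 2.3, section 9.7.2.
public class JarChecker {
    public static boolean bundlesServletApi(String jarPath) throws IOException {
        try (ZipFile jar = new ZipFile(jarPath)) {
            // Tomcat names exactly this class as the offending one in its log.
            ZipEntry offending = jar.getEntry("javax/servlet/Servlet.class");
            return offending != null;
        }
    }
}
```

If the check returns true for the assembly jar, I assume the jar has to stay out of WEB-INF/lib (e.g. on the container's shared classpath) rather than being deployed inside the war.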
Error (when I try using Jetty as the server):
    Caused by: java.lang.Exception: Could not find resource path for Web UI: org/apache/spark/ui/static
        at org.apache.spark.ui.JettyUtils$.createStaticHandler(JettyUtils.scala:133)
        at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:70)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60)
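Since the Jetty failure comes from the Spark web UI not finding its static resources, would disabling the UI avoid it? This is the configuration I would try; I'm not certain the spark.ui.enabled flag is honored in 1.1.0, so treat it as an assumption:

```java
// Assumption: spark.ui.enabled=false may skip SparkUI creation entirely,
// avoiding the org/apache/spark/ui/static resource lookup that fails under Jetty.
SparkConf conf = new SparkConf()
        .setAppName("Spark")
        .setMaster("local[*]")
        .set("spark.ui.enabled", "false");
JavaSparkContext sparkContext = new JavaSparkContext(conf);
```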
Any help in this regard will be appreciated.