
I have a task that requires using Spark as a cache in my webapp. I went through the sample Java code in its documentation and was able to run it standalone.

But when I initialize it inside my webapp (in a servlet init()), it fails. I also tried initializing it in the Spring context, but it failed either way.

I am using Apache Spark 1.1.0 with the pre-built package for Hadoop 2.4 (spark-assembly-1.1.0-hadoop2.4.0.jar).

My init():

    public void init() {
        System.out.println("BaseService initialized");
        // Create an embedded Spark context running in local mode
        SparkConf conf = new SparkConf().setAppName("Spark").setMaster("local[*]");
        sparkContext = new JavaSparkContext(conf);
        cacheMap = new HashMap<>();
    }
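
For reference, a minimal self-contained sketch of how such a servlet could look. The class name, the Map value type, and the destroy() cleanup are illustrative additions, not from the original post:

    import java.util.HashMap;
    import java.util.Map;

    import javax.servlet.http.HttpServlet;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    // Illustrative servlet: the post only shows the init() body.
    public class BaseService extends HttpServlet {

        private JavaSparkContext sparkContext;
        private Map<String, Object> cacheMap; // value type is an assumption

        @Override
        public void init() {
            System.out.println("BaseService initialized");
            SparkConf conf = new SparkConf().setAppName("Spark").setMaster("local[*]");
            sparkContext = new JavaSparkContext(conf);
            cacheMap = new HashMap<>();
        }

        @Override
        public void destroy() {
            // Stop the embedded Spark context when the servlet is unloaded
            if (sparkContext != null) {
                sparkContext.stop();
            }
        }
    }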

Error (when Tomcat is used as the server):

    WEB-INF\lib\spark-assembly-1.1.0-hadoop2.4.0.jar) - jar not loaded. See Servlet Spec 2.3, section 9.7.2. Offending class: javax/servlet/Servlet.class

Error (when I try using Jetty as the server):

    Caused by: java.lang.Exception: Could not find resource path for Web UI: org/apache/spark/ui/static
        at org.apache.spark.ui.JettyUtils$.createStaticHandler(JettyUtils.scala:133)
        at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:70)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60)
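
As a quick diagnostic (a sketch, assuming it is run from inside the webapp), you can reproduce the lookup that fails at JettyUtils.scala:133: Spark resolves the resource path org/apache/spark/ui/static against the classloader that loaded its own classes, and throws the exception above when the result is null:

    // Diagnostic sketch: mirrors the lookup Spark performs when building the
    // Web UI. A null result here corresponds to the exception in the trace.
    java.net.URL staticDir = org.apache.spark.api.java.JavaSparkContext.class
            .getClassLoader()
            .getResource("org/apache/spark/ui/static");
    System.out.println("Spark UI static resources: " + staticDir);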

Any help in this regard will be appreciated.

Waseem Akhtar
  • I got this error when the assembled jar was not properly built. Try creating a fat jar using this: http://stackoverflow.com/questions/574594/how-can-i-create-an-executable-jar-with-dependencies-using-maven then run it using spark-submit – Aditya Pawade Nov 12 '14 at 19:32
  • As I already mentioned, I can run all the samples in the documentation standalone (either by spark-submit or directly from a main method). The main issue is that I am not able to initialize the SparkContext within my application, even using the same set of dependencies I used in the samples – Waseem Akhtar Nov 13 '14 at 05:24
  • Yes. But according to the error, it seems the jar is not being passed to the cluster when running this app. Have you tried specifying the jars while creating the SparkConf, i.e. new SparkConf().setJars(new String[] {path of your jar dependencies})? Maybe there is a difference in how dependencies are sent when you run it standalone vs. how webapps use it (see the sketch after these comments) – Aditya Pawade Nov 13 '14 at 05:52
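
A sketch of that suggestion, for reference; the factory class and the jar path are placeholders, not from the original post:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SparkContextFactory {
        // Ship the application's jar(s) with the job, as suggested in the
        // comment above. Replace the placeholder path with your build output.
        public static JavaSparkContext create() {
            SparkConf conf = new SparkConf()
                    .setAppName("Spark")
                    .setMaster("local[*]")
                    .setJars(new String[] { "/path/to/your-app-dependencies.jar" });
            return new JavaSparkContext(conf);
        }
    }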

1 Answer


One probable cause of this issue is that the Spark jars include the servlet API (javax.servlet.*). Most likely you used Maven to get the Spark jars. When you placed your application in the Tomcat container and tried to run it, Tomcat found while loading classes that this particular jar contains the servlet API, and so it did not load the jar. Since the jar holding both the Spark classes and the servlet classes is never loaded, the JavaSparkContext class eventually cannot be found. One way to overcome this is to remove the servlet API from the Spark jar.
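
If the jars do come from Maven, a hedged sketch of that fix is to exclude the servlet API from the Spark dependency at build time, so Tomcat will accept the jar. The coordinates of the transitive servlet artifact below are an assumption (it commonly arrives via Jetty in Spark 1.x); confirm them with mvn dependency:tree before copying:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.1.0</version>
        <exclusions>
            <!-- Assumed coordinates: verify against your dependency tree -->
            <exclusion>
                <groupId>org.eclipse.jetty.orbit</groupId>
                <artifactId>javax.servlet</artifactId>
            </exclusion>
        </exclusions>
    </dependency>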

Din Reddy