I am running a Spark Standalone cluster on a company server, with 1 master and 10 workers. My requirement: I have built a Spark jar job that reads data from Azure Data Lake Store, runs some Spark SQL queries over it, and saves the result to a database. All the VMs in the Spark cluster run Ubuntu. To develop the Spark job I use my Windows laptop, where I build the jar in Eclipse and copy it to the cluster's master VM. Right now, to run the job I have to open a PuTTY session to the master VM and submit it with spark-submit.

What I want instead is to trigger the jar as a job on the cluster from a Java program that I run on my laptop. In other words, I just want to run a Java program that deploys/submits the jar on the cluster, which is remote.
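Ideally I'd like something along these lines — a minimal sketch assuming Spark's `SparkLauncher` API (`org.apache.spark.launcher`), which ships with Spark and wraps spark-submit. The jar path, main class, and master URL below are placeholders for my setup, and it assumes a Spark distribution is available locally on the laptop:

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class RemoteSubmit {
    public static void main(String[] args) throws Exception {
        // SparkLauncher shells out to spark-submit, so a local Spark
        // distribution is still needed on the machine running this program.
        SparkAppHandle handle = new SparkLauncher()
                .setSparkHome("C:/spark")                           // local Spark distribution (placeholder)
                .setAppResource("/path/on/master/my-spark-job.jar") // jar already copied to the cluster (placeholder)
                .setMainClass("com.example.MySparkJob")             // placeholder main class
                .setMaster("spark://master-vm:7077")                // standalone master URL (placeholder)
                .setDeployMode("cluster")                           // run the driver on the cluster, not the laptop
                .startApplication();

        // Poll until the job reaches a terminal state (FINISHED, FAILED, KILLED).
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
        System.out.println("Final state: " + handle.getState());
    }
}
```

Is `SparkLauncher` the right tool for this, or is there a better way to do it?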
- We understand what you want, but there is no actual question there. – eliasah Jul 08 '17 at 16:20
- Please see: https://stackoverflow.com/questions/40300222/best-practice-to-launch-spark-applications-via-web-application/40301106#40301106 It's not an exact duplicate, but it shows different deployment options. – T. Gawęda Jul 08 '17 at 20:31
- I want to know how it can be done @eliasah – Jul 09 '17 at 10:33