
Could anybody tell me whether any Java application can be deployed on Apache Spark, or whether some criteria (code modifications, etc.) must be taken into account?

chufabit
  • like this? http://stackoverflow.com/questions/34367085/how-to-add-our-custom-library-to-apache-spark – HRgiger Mar 09 '16 at 16:31

1 Answer


You can import and run any Java library in your Spark application, and you can write the application itself against either the Java API or the Scala API.

Spark has its own programming model, however; you have to follow it and build the extra functionality you need on top of it.
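The point above can be sketched as a small driver program written against Spark's Java API. Here `MyTextCleaner` and the input path are placeholders for your own jar's code and data; this is a minimal sketch, not a complete application:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class MySparkApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("MySparkApp");
        // JavaSparkContext is Closeable, so try-with-resources stops it cleanly.
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            JavaRDD<String> lines = sc.textFile("hdfs:///data/input.txt");
            // Plain Java code from your own jar can be called inside Spark
            // transformations, as long as the referenced objects are serializable.
            JavaRDD<String> cleaned = lines.map(line -> MyTextCleaner.clean(line));
            System.out.println("line count: " + cleaned.count());
        }
    }
}
```

You would package this (together with your own libraries) and launch it with `spark-submit --class MySparkApp your-app.jar`.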

Grant
  • Thanks @Grant, so if I have my own .jar, I could create a Spark application through the Java API which uses classes from my .jar? – chufabit Mar 09 '16 at 16:35
  • Yes or No. You will build a fat jar which contains all of the dependencies of your application. Your own jar(or its contents) will be assembled into the final fat jar. Just think of your jar as the dependency – Grant Mar 09 '16 at 16:41
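The fat-jar assembly mentioned in the comment can be sketched as a Maven build fragment, assuming a Maven project; the Maven Shade plugin bundles your application and its dependencies into one jar (Spark itself is normally declared with `provided` scope so the cluster's copy is used instead of being bundled):

```xml
<!-- pom.xml fragment (sketch): build a fat jar with the Maven Shade plugin. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

After `mvn package`, the shaded jar in `target/` is what you pass to `spark-submit`; your own jar's contents end up inside it, just as the comment describes.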