I'm working on a Spark application and have already written the logic, but I have little to no experience with creating a standalone application.
I need a runnable jar, but when I try to run scala path/to/my/jar I get:
java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession$
This is my build.sbt
name := "Spark_Phase2"
version := "0.1"
organization := "bartosz.spark"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
From what I have seen, something is wrong with the dependencies, but I could not figure out exactly what I have to do to make the jar runnable.
What puzzles me even more is that sbt run
runs the code fine. So it would be nice if someone could write a step-by-step solution to this :)
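From what I've gathered so far, a plain packaged jar does not bundle the Spark dependencies, which might explain the ClassNotFoundException, so I tried adding the sbt-assembly plugin to build a fat jar. I'm not sure this is the right approach, and the main class name below is made up (mine may differ):

```scala
// project/plugins.sbt -- the sbt-assembly version here is just what I found in examples
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// additions to build.sbt -- "bartosz.spark.Main" is a placeholder for my actual main class
mainClass in assembly := Some("bartosz.spark.Main")

// I copied this merge strategy from a tutorial; it drops conflicting META-INF files,
// but I'm not certain it is correct for Spark's dependencies
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x                             => MergeStrategy.first
}
```

Running sbt assembly with this does produce a jar under target/, but I'd like to know whether this is actually the recommended way, or whether I should instead mark the Spark dependencies as "provided" and launch with spark-submit.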
And one more thing: I have to accept a couple of command-line parameters with flags, and I have never done that before. Does anyone have any good docs/tutorials on this?
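To clarify what I'm after, here is the kind of flag handling I'd like to do, sketched with plain pattern matching (the flag names --input and --output are made up for illustration):

```scala
// Minimal recursive flag parser: walks the argument list and
// accumulates recognized --flag value pairs into a Map.
object ArgParse {
  def parseArgs(args: List[String],
                acc: Map[String, String] = Map.empty): Map[String, String] =
    args match {
      case "--input" :: value :: rest  => parseArgs(rest, acc + ("input" -> value))
      case "--output" :: value :: rest => parseArgs(rest, acc + ("output" -> value))
      case Nil                         => acc
      case unknown :: _                => sys.error(s"Unknown option: $unknown")
    }

  def main(args: Array[String]): Unit = {
    val opts = parseArgs(args.toList)
    println(opts) // e.g. Map(input -> in.csv, output -> out.csv)
  }
}
```

Is hand-rolling something like this reasonable, or should I use a dedicated library (I've seen scopt mentioned in a few places)?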