I have a class in Scala whose methods need a SparkContext. I therefore declared an implicit SparkContext parameter in the class definition, and the code compiled fine:
class Test (implicit sc: SparkContext) {
}
However, when I tried to instantiate the class, I got the following error:
val inst: Test = new Test()
error: could not find implicit value for parameter sc: org.apache.spark.SparkContext
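For reference, here is a minimal self-contained sketch of the pattern I'm using. Since SparkContext needs a running Spark setup, I've substituted a hypothetical stand-in class `Ctx` just to show the implicit-parameter mechanics in isolation:

```scala
// Ctx is a hypothetical stand-in for SparkContext, so this runs without Spark.
class Ctx(val name: String)

// Same shape as my real class: one implicit constructor parameter.
class Test(implicit ctx: Ctx) {
  def whoami: String = ctx.name
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Without an implicit Ctx in scope, `new Test()` fails to compile
    // with "could not find implicit value for parameter ctx" --
    // the same kind of error I get with SparkContext.
    implicit val ctx: Ctx = new Ctx("local")
    val inst: Test = new Test()
    println(inst.whoami)
  }
}
```

In the real code, `new Test()` is called at a point where no implicit SparkContext value is in scope, which appears to be what triggers the error.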
Is this the wrong way to use a SparkContext within a class?