
I have a class in Scala whose methods need a SparkContext, so I declared an implicit SparkContext parameter in the class definition, and the code compiled fine:

import org.apache.spark.SparkContext

class Test(implicit sc: SparkContext) {
  // methods that need the context can use sc here
}

However, when I tried to instantiate the object, I got the following error:

val inst: Test = new Test()
error: could not find implicit value for parameter sc: org.apache.spark.SparkContext

Is this the wrong way to use a SparkContext within a class?

mt88
  • It simply means there is no implicit `SparkContext` in the enclosing scope. Where do you define it? – zero323 May 25 '16 at 23:42
  • When I type SparkContext or sc into the shell, I get org.apache.spark.SparkContext = org.apache.spark.SparkContext@ back, so I assume there is one in the global frame. Is there a reason it can't see it? / How do I make it see it? – mt88 May 25 '16 at 23:47
  • The Spark context you get in the REPL is not `implicit`. Putting aside whether it makes sense or not, you can start with something like `implicit val isc = sc` (see the sketch below). – zero323 May 26 '16 at 00:06
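
Spelled out, zero323's suggestion could look something like the following minimal sketch (the `count` method, the app name, and the local master are illustrative assumptions, not from the original post):

import org.apache.spark.{SparkConf, SparkContext}

class Test(implicit sc: SparkContext) {
  // example method; uses the implicit context for RDD work
  def count(xs: Seq[Int]): Long = sc.parallelize(xs).count()
}

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("test-app").setMaster("local[*]")
    // Declaring the context as an implicit val puts it in scope,
    // so the compiler can supply the constructor parameter.
    implicit val isc: SparkContext = new SparkContext(conf)
    val inst: Test = new Test() // now compiles
    println(inst.count(1 to 10)) // prints 10
    isc.stop()
  }
}

In the spark-shell the same idea applies: the predefined `sc` is not declared implicit, so `implicit val isc = sc` is what makes it visible to `new Test()`.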

0 Answers