I'm running into an issue when re-running queries against Delta Lake tables in Zeppelin. This code snippet runs without any problems the first time through:
import io.delta.tables._
val deltaTable = DeltaTable.forPath("s3://bucket/path")
deltaTable.toDF.show()
But when I try to run it a second time, it fails with this error:
java.lang.IllegalArgumentException: Could not find active SparkSession
at io.delta.tables.DeltaTable$$anonfun$1.apply(DeltaTable.scala:620)
at io.delta.tables.DeltaTable$$anonfun$1.apply(DeltaTable.scala:620)
at scala.Option.getOrElse(Option.scala:121)
at io.delta.tables.DeltaTable$.forPath(DeltaTable.scala:619)
... 51 elided
I can restart the Spark interpreter and run the query again, but having to do that for every query is a huge impediment to development. Does anyone know why this happens, and whether there is a workaround that doesn't involve restarting the interpreter every time I want to run a new query?
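In case it helps with diagnosis: the stack trace suggests DeltaTable.forPath resolves the session via SparkSession.getActiveSession, which as I understand it is thread-local, and I believe Zeppelin can schedule paragraphs on different threads. If that's right (I haven't confirmed the scheduling behavior), a workaround might be to re-register the session on the current thread before each call, something like:

import org.apache.spark.sql.SparkSession
import io.delta.tables._

// getOrCreate returns the already-running session and re-registers it
// as the active session on the current thread, which is what
// DeltaTable.forPath looks up.
val spark = SparkSession.builder.getOrCreate()
SparkSession.setActiveSession(spark)

val deltaTable = DeltaTable.forPath("s3://bucket/path")
deltaTable.toDF.show()

I'd still like to understand whether this is the actual cause or just masks the problem.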