I have opened the Spark shell. In the shell there is already a predefined variable:
spark: org.apache.spark.sql.SparkSession
I have a third-party JAR whose package name starts with "spark", for example:
spark.myreads.one.KafkaProducerWrapper
When I try to import the above package in the Spark shell, I get this error:
scala> import spark.myreads.one.KafkaProducerWrapper
<console>:38: error: value myreads is not a member of org.apache.spark.sql.SparkSession
import spark.myreads.one.KafkaProducerWrapper
How can I import such a package in the Spark shell while resolving the above naming conflict?
I'm using Spark 2.0.0, JDK 1.8, and Scala 2.11.
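
One direction I have been looking at is Scala's `_root_` prefix, which anchors an import at the top-level package so that the REPL-local `spark` value is bypassed. A minimal sketch of what I would expect to work, assuming the JAR is already on the shell classpath (e.g. passed via the --jars option when starting spark-shell):

scala> import _root_.spark.myreads.one.KafkaProducerWrapper  // _root_ resolves from the root package, not the local `spark` value

Is `_root_` the idiomatic way to disambiguate here, or is there a Spark-specific approach I should use instead?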