I have opened spark-shell. In the shell there is already a variable:

spark: org.apache.spark.sql.SparkSession

I have a third-party JAR whose package name starts with "spark", for example:

spark.myreads.one.KafkaProducerWrapper

When I try to import the above package in spark-shell, I get this error:

scala> import spark.myreads.one.KafkaProducerWrapper
<console>:38: error: value myreads is not a member of org.apache.spark.sql.SparkSession
       import spark.myreads.one.KafkaProducerWrapper

How can I import such a package in spark-shell and resolve the above conflict?

I'm using Spark 2.0.0, JDK 1.8, and Scala 2.11.


1 Answer


Prefix the import with _root_, like this:

import _root_.spark.myreads.one.KafkaProducerWrapper
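
This works because spark-shell binds a value named spark (the SparkSession), which shadows the top-level spark package for plain imports; _root_ forces the compiler to resolve the path starting from the root package instead. Below is a minimal, self-contained sketch of the same mechanism (it does not use the asker's jar; the package and names are made up for illustration):

    // A stand-in for the third-party jar's top-level "spark" package.
    package spark {
      package myreads {
        object Wrapper {
          def hello: String = "resolved from the root package"
        }
      }
    }

    package demo {
      object Main extends App {
        // Mimics the shell binding `spark: SparkSession`: this value now
        // shadows the top-level `spark` package for relative imports.
        val spark: String = "I shadow the spark package"

        // import spark.myreads.Wrapper   // error: value myreads is not a member of String
        import _root_.spark.myreads.Wrapper // ok: resolution starts at the root package

        println(Wrapper.hello)
      }
    }

The same idea applies in the shell: any import of a package whose first segment collides with a value in scope (spark, sc, etc.) can be disambiguated with the _root_ prefix.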
