If you are using Maven, the following way of building a jar with dependencies might solve your issue.
Add the Spark dependencies as shown below:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
    <scope>${spark.scope}</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
    <version>2.2.1</version>
</dependency>
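Note that spark-sql-kafka-0-10 is deliberately left at the default compile scope: unlike spark-core, it is not shipped with the Spark distribution, so it has to be bundled into your jar. Optionally, you can centralize the versions in a properties block so they only need to be changed in one place; a minimal sketch (the property names here are just a common convention, nothing Maven requires):

<properties>
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.2.1</spark.version>
</properties>

You would then reference them as spark-core_${scala.binary.version} and ${spark.version} in the dependencies above.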
Then configure your Maven profiles as below:
<profiles>
    <profile>
        <id>default</id>
        <properties>
            <profile.id>dev</profile.id>
            <spark.scope>compile</spark.scope>
        </properties>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
    </profile>
    <profile>
        <id>test</id>
        <properties>
            <profile.id>test</profile.id>
            <spark.scope>provided</spark.scope>
        </properties>
    </profile>
    <profile>
        <id>online</id>
        <properties>
            <profile.id>online</profile.id>
            <spark.scope>provided</spark.scope>
        </properties>
    </profile>
</profiles>
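The idea is that spark-core is compile-scoped for local development (the default profile) but provided when you build for a cluster, since the cluster already ships Spark and bundling it would bloat the jar and risk version conflicts. If you want to double-check which profile a given command line activates, the Maven help plugin can print it:

mvn help:active-profiles -Ponline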
Add the following plugin under <build><plugins>:
<plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>3.1.0</version>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id> <!-- this is used for inheritance merges -->
            <phase>package</phase> <!-- bind to the packaging phase -->
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Then build your jar using:

mvn clean install -Ponline -DskipTests

This should solve your issue.
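With the jar-with-dependencies descriptor, the assembly plugin writes the fat jar to target/<artifactId>-<version>-jar-with-dependencies.jar. You can then submit it as usual; the class and jar names below are placeholders for your own:

spark-submit \
  --class com.example.MyStreamingApp \
  --master yarn \
  target/my-app-1.0-jar-with-dependencies.jar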