I created a Spark Maven project in IntelliJ IDEA 2018 and tried to export an executable jar file of my main class. When I submit it to a YARN cluster, it fails with an error saying the main class cannot be found, even though the MANIFEST.MF includes it:
Manifest-Version: 1.0
Main-Class: Test
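To rule out a missing manifest entry, the `Main-Class` attribute can be read back programmatically with `java.util.jar` (the same attribute `java -jar` resolves). This is a minimal self-contained sketch: it builds a throwaway jar whose manifest mirrors the one above, then reads the attribute back; the temp-file name is arbitrary.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.Attributes;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

public class CheckManifest {
    // Create a throwaway jar whose manifest mirrors the MANIFEST.MF above
    static File buildSampleJar() throws Exception {
        Manifest mf = new Manifest();
        mf.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");
        mf.getMainAttributes().put(Attributes.Name.MAIN_CLASS, "Test");
        File jar = File.createTempFile("app", ".jar");
        jar.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar), mf)) {
            // no class entries are needed just to check the manifest
        }
        return jar;
    }

    // Read Main-Class back the way the JVM does when launching `java -jar`
    static String mainClassOf(File jar) throws Exception {
        try (JarFile jf = new JarFile(jar)) {
            return jf.getManifest().getMainAttributes()
                     .getValue(Attributes.Name.MAIN_CLASS);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("Main-Class: " + mainClassOf(buildSampleJar()));
    }
}
```

Running the same `mainClassOf` check against the jar IntelliJ exports shows whether the attribute actually made it into the archive.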
I did the same with other processing engines like Apache Flink, and IntelliJ could create an executable jar file that runs successfully on the cluster.
So in the Spark case I always have to fall back on the maven-assembly-plugin and build the jar from the command line with mvn clean compile assembly:single, using this plugin configuration:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>Test</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
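For completeness, the assembled fat jar is then submitted roughly like this (the jar filename below is a placeholder for whatever `assembly:single` produces in `target/`):

```shell
# Submit the jar-with-dependencies build to YARN; the jar path is a placeholder
spark-submit \
  --class Test \
  --master yarn \
  --deploy-mode cluster \
  target/myproject-1.0-jar-with-dependencies.jar
```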
I guess it's related to how the Spark dependencies are packaged. I faced the same problem when creating a (non-executable) jar file from a class of mine that uses Spark dependencies. For example, adding the spark-sql dependency to a Maven project pulls in transitive dependencies such as spark-catalyst. Is there any way to export a Spark executable jar file using IntelliJ IDEA?