I ran into this issue: my code is written in Scala, and I wanted to run the jar on Hadoop with a command like:
./bin/hadoop jar testing/learning-yarn-1.0.0.jar com.learning.yarn.simpleapp.SimpleApp
It reported this error:
Exception in thread "main" java.io.FileNotFoundException: /var/folders/dy/kgryx_f11g1fdqpcnc8jl
m840000gp/T/hadoop-unjar5812913115154946721/META-INF/LICENSE (Is a directory)
I modified my Maven configuration based on Hadoop java.io.IOException: Mkdirs failed to create /some/path, and that solved it.
My maven-shade-plugin configuration now looks like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <minimizeJar>true</minimizeJar>
        <!-- <shadedArtifactAttached>true</shadedArtifactAttached> -->
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer">
          </transformer>
        </transformers>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>log4j.properties</exclude>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
              <exclude>META-INF/LICENSE*</exclude>
              <exclude>license/*</exclude>
            </excludes>
          </filter>
          <filter>
            <artifact>com.typesafe.akka</artifact>
            <includes>
              <include>reference.conf</include>
            </includes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
If I uncomment <shadedArtifactAttached>true</shadedArtifactAttached>, running the jar reports a Scala class-not-found error; leaving that line commented out works.
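Since a jar is just a zip archive, you can verify that the filters actually stripped the problem entries by listing the shaded jar's contents. Here is a minimal sketch that simulates the before/after state with a throwaway archive (all file names are made up for illustration; against your real build you would just run `unzip -l` on `target/learning-yarn-1.0.0.jar`):

```shell
# A jar is a zip archive: build a tiny fake jar with and without the
# META-INF/LICENSE entry to illustrate what the shade filter removes.
cd "$(mktemp -d)"
mkdir -p META-INF com/learning
echo "license text" > META-INF/LICENSE
echo "bytecode placeholder" > com/learning/SimpleApp.class
zip -qr unfiltered.jar META-INF com    # what the unfiltered shaded jar contains
zip -qr filtered.jar com               # META-INF/LICENSE excluded by the filter
unzip -l unfiltered.jar | grep -q 'META-INF/LICENSE' && echo "unfiltered: LICENSE present"
unzip -l filtered.jar   | grep -q 'META-INF/LICENSE' || echo "filtered: LICENSE absent"
```

If `hadoop jar` still fails, the `unzip -l` listing of the real jar is the quickest way to spot which `META-INF` entry survived the filters.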
Good luck.