When doing:
new MiniDFSCluster.Builder(config).build()
I get this exception:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:557)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:490)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:308)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:202)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:1020)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:739)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:536)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:595)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:762)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:746)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1438)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1107)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:978)
at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:807)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:467)
at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:426)
As far as I understand, access0 is a native method implemented in hadoop.dll, so the JVM apparently does not find (or loads an incompatible version of) that library. I want to use the Hadoop Minicluster to test my Hadoop HDFS code, which on its own does not throw this exception (see java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0).
In my Maven pom.xml I have these dependencies:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
</dependency>
<!-- for unit testing -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version>
    <type>test-jar</type>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.0</version>
</dependency>
<!-- for unit testing -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.0</version>
    <scope>test</scope>
    <classifier>tests</classifier>
</dependency>
As I understand it, I do not need the separate 'hadoop-minicluster' dependency, since it already comes with the hadoop-hdfs test artifacts included above.
I am trying to build the MiniDFSCluster in my @BeforeAll.
I have used different configs for the builder:
config = new HdfsConfiguration();
// or
config = new Configuration();
And different ways to create the path for the baseDir:
config.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir);
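Putting it together, my setup looks roughly like this (a simplified sketch; the class name and base directory are placeholders):

import java.io.File;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.HdfsConfiguration;
import org.apache.hadoop.hdfs.MiniDFSCluster;
import org.junit.jupiter.api.BeforeAll;

class HdfsMiniClusterTest {

    private static MiniDFSCluster cluster;

    @BeforeAll
    static void setUp() throws Exception {
        // placeholder base dir under the build output folder
        File baseDir = new File("./target/hdfs").getAbsoluteFile();
        Configuration config = new HdfsConfiguration();
        config.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath());
        // this is the call that throws the UnsatisfiedLinkError
        cluster = new MiniDFSCluster.Builder(config).build();
    }
}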
Also, I downloaded hadoop.dll, hdfs.dll, and winutils.exe in v2.6.0 and added their path to my environment variables.
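As far as I can tell from the Hadoop sources, NativeCodeLoader.isNativeCodeLoaded() and NativeIO.isAvailable() report whether hadoop.dll was actually found and loaded, so a quick check like this (my own diagnostic, not part of the test itself) might help narrow it down:

import org.apache.hadoop.io.nativeio.NativeIO;
import org.apache.hadoop.util.NativeCodeLoader;

public class NativeCheck {
    public static void main(String[] args) {
        // true only if the JVM could load the native Hadoop library (hadoop.dll)
        System.out.println("native code loaded: " + NativeCodeLoader.isNativeCodeLoaded());
        // true only if the NativeIO functions (such as access0) are usable
        System.out.println("NativeIO available: " + NativeIO.isAvailable());
    }
}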
I studied all the related issues I could find on Stack Overflow (without success, obviously) and all the guides and code examples I could find on the internet (there are a few, and they all do it differently).
Can somebody please help me? I am out of ideas.
UPDATE:
I am running the test with the following VM options (which should not be necessary, I think):
-Dhadoop.home.dir=C:/Temp/hadoop
-Djava.library.path=C:/Temp/hadoop/bin
I also tried to set the corresponding system properties directly in code (which should not be necessary when using the VM options):
System.setProperty("hadoop.home.dir", "C:\\Temp\\hadoop-2.6.0");
System.setProperty("java.library.path", "C:\\Temp\\hadoop-2.6.0\\bin");