
I opened localhost:9870 and tried to upload a .txt file to HDFS.

I got the error message below:

Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error
  • Did you actually enable WebHDFS? This error is for a file listing, not uploading – OneCricketeer Feb 12 '18 at 05:41
  • @cricket_007 I think I did. I can open up localhost:9870 doesn’t that mean that I enable webhdfs? – JDDD Feb 12 '18 at 17:53
  • 9870 is the NameNode, not WebHDFS – OneCricketeer Feb 12 '18 at 17:55
  • https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html#HDFS_Configuration_Options – OneCricketeer Feb 12 '18 at 17:56
  • @cricket_007 what command should I use to enable WebHDFS? Pardon me being stupid. – JDDD Feb 12 '18 at 17:58
  • It's not a command. It's a configuration property in the `hdfs-site.xml`, (which is true, by default). See the link. It mentions the properties. In any case, I've never actually used port 9870 to upload files. (My version of hadoop doesn't even have an upload feature there). Ambari or Hue are the popular web interfaces to do so. If you want to use webhdfs, it happens on port 50070 https://community.hortonworks.com/questions/139351/how-to-upload-a-file-to-hdfs-using-webhdfs-rest-ap.html – OneCricketeer Feb 12 '18 at 18:11
  • @cricket_007 thanks I ll check my hdfs-site.xml first. I remember I did the change between two configuration blocks. – JDDD Feb 12 '18 at 18:16
  • @cricket_007 also when I try to create an input and an output dir in HDFS. I got an error message said: "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" – JDDD Feb 12 '18 at 20:24
  • @cricket_007 and here's what I have in my hdfs-site.xml: dfs.replication = 1 – JDDD Feb 12 '18 at 20:27
  • Regarding the native libraries. Edit the hadoop env file https://stackoverflow.com/a/24927214/2308683 and your XML uses all the defaults if otherwise not set. So if that's the case, you need to look at the log directory for the namenode. Assuming Linux, try looking around /var/log/hadoop – OneCricketeer Feb 12 '18 at 23:14
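
As the comments note, WebHDFS is controlled by the dfs.webhdfs.enabled property in hdfs-site.xml (true by default) and is served from the NameNode's HTTP port (9870 on Hadoop 3, 50070 on Hadoop 2). A minimal sketch of exercising it from the command line, assuming Hadoop 3 defaults and a hypothetical user and file name:

    # list the HDFS root: the same LISTSTATUS call the web UI makes when it shows "Server Error"
    curl -i "http://localhost:9870/webhdfs/v1/?op=LISTSTATUS"
    # upload a file in two steps: the first PUT returns a 307 redirect to a DataNode
    curl -i -X PUT "http://localhost:9870/webhdfs/v1/user/jddd/test.txt?op=CREATE&user.name=jddd"
    # the second PUT sends the data to the Location URL returned by that redirect
    curl -i -X PUT -T test.txt "<redirect-url-from-previous-response>"

If the LISTSTATUS call fails with the same Server Error, the problem is on the NameNode side rather than with the upload itself; the answers below point at the JDK version as the usual culprit.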

5 Answers


I had the same issue with JDK 9. The fix for me was to add this line to hadoop-env.sh:

export HADOOP_OPTS="--add-modules java.activation"

That's because the java.activation module is deprecated in Java 9.
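
A minimal sketch of applying this, assuming $HADOOP_HOME points at the Hadoop install (adjust paths for your layout):

    # append the option to hadoop-env.sh and restart HDFS so it takes effect
    echo 'export HADOOP_OPTS="--add-modules java.activation"' >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
    $HADOOP_HOME/sbin/stop-dfs.sh && $HADOOP_HOME/sbin/start-dfs.sh

Note that this only helps on JDK 9 and 10; the java.activation module was removed in Java 11, which is what the later answers address.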

S2L

I got this to work with OpenJDK 13 by downloading Hadoop 2.9.2 and copying the activation-1.1.jar file from that download into the $HADOOP_HOME/share/hadoop/yarn folder of the Hadoop 3 installation you're using. Then run stop-dfs.sh and stop-yarn.sh, and start them both again. No config files need to be edited with this method, since the jar in that folder is added to the classpath automatically.
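
A sketch of those steps, assuming $HADOOP_HOME is set and activation-1.1.jar is already in the current directory (taken from the Hadoop 2.9.2 download, or from Maven Central as the comments below suggest):

    # copy the activation jar into the YARN share folder, then bounce the daemons
    cp activation-1.1.jar $HADOOP_HOME/share/hadoop/yarn/
    $HADOOP_HOME/sbin/stop-yarn.sh && $HADOOP_HOME/sbin/stop-dfs.sh
    $HADOOP_HOME/sbin/start-dfs.sh && $HADOOP_HOME/sbin/start-yarn.sh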

Alex W
  • Confirmed previous answer did not work for me (v11) but your solution did. You may want to update to 1.1.1 – Jackie Feb 16 '20 at 17:26
  • I would edit this to link here instead of downloading the whole previous version: https://mvnrepository.com/artifact/javax.activation/activation/1.1.1 – Jackie Feb 16 '20 at 17:33

I just solved this same problem; I have multiple Java versions installed and Hadoop 3.1.0.

You need to set the JAVA_HOME variable in etc/hadoop/hadoop-env.sh, and the Java version should be 1.8.
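
A sketch of that edit, with a hypothetical Java 8 location (substitute the path of your own JDK 1.8 install):

    # in etc/hadoop/hadoop-env.sh
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64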

coldestlin

This occurs due to a conflict between your installed Java and the OpenJDK that Homebrew installs as a dependency, so it is best to uninstall Java using the commands below.

  1. Uninstall java

#!/bin/bash

sudo rm -rvf /Library/Java/JavaVirtualMachines/jdk<version>.jdk
sudo rm -rvf /Library/PreferencePanes/JavaControlPanel.prefPane
sudo rm -rvf /Library/Internet\ Plug-Ins/JavaAppletPlugin.plugin
sudo rm -rvf /Library/LaunchAgents/com.oracle.java.Java-Updater.plist
sudo rm -rvf /Library/PrivilegedHelperTools/com.oracle.java.JavaUpdateHelper
sudo rm -rvf /Library/LaunchDaemons/com.oracle.java.JavaUpdateHelper.plist
sudo rm -rvf /Library/Preferences/com.oracle.java.Helper-Tool.plist


  2. Create a symlink pointing to the Homebrew OpenJDK dependency

    sudo ln -sfn $(brew --prefix)/opt/openjdk@11/libexec/openjdk.jdk /Library/Java/JavaVirtualMachines/openjdk-11.jdk

  3. Check for the Java path using

    $ /usr/libexec/java_home

    It generates a path like this

    /opt/homebrew/Cellar/openjdk@11/11.0.18/libexec/openjdk.jdk/Contents/Home

  4. Update the Hadoop environment file with the OpenJDK path by running these commands in the terminal

    $ cd /opt/homebrew/Cellar/hadoop/3.3.1/libexec/etc/hadoop
    $ code hadoop-env.sh

In hadoop-env.sh, update the JAVA_HOME path with the following

export JAVA_HOME=/opt/homebrew/Cellar/openjdk@11/11.0.18/libexec/openjdk.jdk/Contents/Home

Bonus: Check your java path with echo $JAVA_HOME
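
After changing JAVA_HOME, restart the daemons so the new setting takes effect (a sketch, assuming the Hadoop sbin scripts are on your PATH; otherwise run them from /opt/homebrew/Cellar/hadoop/3.3.1/libexec/sbin):

    # restart HDFS and YARN so the new JAVA_HOME is picked up
    stop-yarn.sh && stop-dfs.sh
    start-dfs.sh && start-yarn.sh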

Amaboh

Try installing Java version 11 (or lower) or 1.8. I changed to Java 1.8, and it solved my problem. Hadoop is not compatible with Java versions higher than 11.
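
A quick sketch for checking what you are running and pointing Hadoop at a compatible JDK (macOS paths shown, matching the /usr/libexec/java_home usage in the previous answer; treat them as examples):

    # show the Java version currently on the PATH
    java -version
    # on macOS, locate a specific installed JDK, e.g. 1.8
    /usr/libexec/java_home -v 1.8
    # then, in etc/hadoop/hadoop-env.sh, point Hadoop at it
    export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)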