I have a very strange problem when starting Hadoop.
When I call start-dfs.sh using its absolute path /usr/local/hadoop/etc/hadoop/sbin/start-dfs.sh, it starts without any problem.
But after I add Hadoop to my environment variables:
export HADOOP_HOME=/usr/local/hadoop
export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath):$CLASSPATH
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
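These exports live in my ~/.bashrc (it could just as well be ~/.profile; I am on bash), and I reload them with:

source ~/.bashrc
echo $HADOOP_HOME   # prints /usr/local/hadoop, so the variables appear to be picked up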
I would like to be able to call it directly as start-dfs.sh. However, when I start it that way, it throws these errors:
20/10/26 16:36:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
20/10/26 16:36:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
What could the problem be? My JAVA_HOME and core-site.xml are configured correctly. Why does it not work when I start it directly from bash?
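For reference, by "JAVA_HOME configured" I mean it is exported roughly like this (the JDK path below is just an example from my machine; substitute your own):

# example only: the actual JDK path on your system may differ
# set in ~/.bashrc and also in etc/hadoop/hadoop-env.sh
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

core-site.xml also defines fs.defaultFS for the local NameNode, as far as I can tell.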