
I'm trying to start all Hadoop daemons with the command

./start-all.sh

from the cluster's master host. Each attempt produced output like:

starting namenode, logging to /export/hadoop-1.0.1/libexec/../logs/hadoop--namenode-one.out
192.168.1.10: starting datanode, logging to /export/hadoop-1.0.1/libexec/../logs/hadoop-hadoop-datanode-myhost2.out
192.168.1.10: Error: JAVA_HOME is not set.

The error persisted even after I ran the following command on the virtual machine at 192.168.1.10:

hadoop@myhost2:~$ export JAVA_HOME=/opt/jdk1.7.0_06

Please tell me how to set the JAVA_HOME variable permanently, once and for all.

user1851132

3 Answers


Put the line export JAVA_HOME=/path/to/java at the beginning of your start-all.sh script, and that should do it.
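For example, a minimal sketch of the edit (/opt/jdk1.7.0_06 is the JDK path from the question; substitute your own install location):

```shell
#!/usr/bin/env bash
# Very top of start-all.sh, before any daemon is launched.
# The JDK path below is the one from the question; adjust to your install.
export JAVA_HOME=/opt/jdk1.7.0_06

# ... the rest of the original start-all.sh continues unchanged ...
```

Keep in mind that start-all.sh launches the slave daemons over ssh, so an export here only covers the local host; the slaves' own configuration still matters.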


$ vi ~/.bash_profile

Append this line to the file:

export JAVA_HOME=/opt/jdk1.7.0_06

That will make the change to the JAVA_HOME environment variable permanent.
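Concretely, the edit can be done in one step (a sketch; the JDK path is the one from the question):

```shell
# Append the export to ~/.bash_profile and load it into the current shell.
# /opt/jdk1.7.0_06 is the path from the question; adjust to your JDK.
echo 'export JAVA_HOME=/opt/jdk1.7.0_06' >> ~/.bash_profile

# Apply it to the current session; otherwise it takes effect on next login.
source ~/.bash_profile

# Verify:
echo "$JAVA_HOME"
```

Note that ~/.bash_profile is only read by login shells, so without the `source` the setting takes effect the next time the hadoop user logs in.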

kaysush

You should also set JAVA_HOME in the hadoop-env.sh file, which lives in Hadoop's $HADOOP_INSTALL/hadoop/conf directory. By default, the JAVA_HOME line there is commented out.

hadoop-env.sh - This file contains environment variable settings used by Hadoop. You can use these to affect some aspects of Hadoop daemon behavior, such as where log files are stored, the maximum amount of heap used, etc. The only variable you should need to change in this file is JAVA_HOME.
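A sketch of the edit (the exact commented default may differ between Hadoop releases; the uncommented path is the one from the question):

```shell
# $HADOOP_INSTALL/hadoop/conf/hadoop-env.sh
#
# The shipped file contains a commented-out default, roughly:
#   # export JAVA_HOME=/usr/lib/j2sdk1.5-sun
#
# Uncomment it and point it at your JDK:
export JAVA_HOME=/opt/jdk1.7.0_06
```

Because the Hadoop scripts source hadoop-env.sh on every host where a daemon starts, this also fixes the "JAVA_HOME is not set" error coming from the slave nodes.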

Or you can add it to the following file under the hadoop user's account:

~/.bash_profile
Satish