17

I have installed Hadoop and SSH on my laptop. "ssh localhost" works fine. After formatting HDFS, I tried to start hadoop.

munichong@GrindPad:~$ sudo /usr/sbin/start-all.sh
starting namenode, logging to /var/log/hadoop/root/hadoop-root-namenode-GrindPad.out
root@localhost's password: 
root@localhost's password: localhost: Permission denied, please try again.

localhost: Permission denied (publickey,password).

It requires a password. My user is "munichong", but munichong's password does not work here, because the prompt has changed to "root". I do not know whether I missed something here.

Can anyone help me?

Thanks!

  • If you are executing it with `sudo`, then of course it would expect you to be `root` and want root's password and not yours! Or am I missing something terribly here! – Amar Mar 04 '13 at 16:08
  • @Amar You are not wrong, but that is not the problem the OP is referring to. The problem the OP is referring to is caused by the start script connecting to localhost when starting Hadoop. It is effectively SSHing into itself, causing the need to put in a password if SSH keys are not set up. – James Mchugh Mar 01 '19 at 20:54
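A quick way to verify that this passwordless-SSH requirement is met (a sketch, assuming OpenSSH: BatchMode makes ssh fail instead of prompting, so a missing key shows up immediately instead of hanging at a password prompt):

$ ssh -o BatchMode=yes localhost echo ok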

6 Answers

23

Solution:

1) Generate an ssh key without a passphrase

$ ssh-keygen -t rsa -P ""

2) Copy id_rsa.pub to authorized_keys

$  cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
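Alternatively (assuming OpenSSH's ssh-copy-id helper is installed), the append and the file permissions can be handled in one step:

$ ssh-copy-id -i $HOME/.ssh/id_rsa.pub localhost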

3) Check that ssh localhost now works without a password

$ ssh localhost

4) Now go to the hadoop sbin directory and start hadoop:

$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-namenode-amtex-desktop.out
localhost: starting datanode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-datanode-amtex-desktop.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/amtex/Documents/installed/hadoop/logs/hadoop-amtex-secondarynamenode-amtex-desktop.out
starting yarn daemons
starting resourcemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-resourcemanager-amtex-desktop.out
localhost: starting nodemanager, logging to /home/amtex/Documents/installed/hadoop/logs/yarn-amtex-nodemanager-amtex-desktop.out

5) No password is asked for, and jps confirms the daemons are running:

$ jps 
12373 Jps
11823 SecondaryNameNode
11643 DataNode
12278 NodeManager
11974 ResourceManager
11499 NameNode
  • I was asked to input a password for ssh localhost, but I don't have that password; the password for the user account does not work. – Pythoner Dec 22 '16 at 03:46
  • While generating the ssh key, the passphrase field is left empty, so that when the hadoop services start they come up one by one without asking for any password. @PythonNewHand – KARTHIKEYAN.A Dec 22 '16 at 08:38
7

As in the case above, munichong is a user (munichong@GrindPad).

  1. In my case: log in as hduser

  2. First, remove the existing directory: sudo rm -rf ~/.ssh

  3. Re-generate the ~/.ssh directory with the default settings:

    [hduser@localhost ~]$ ssh-keygen
    
  4. Copy the content of id_rsa.pub into the authorized_keys file (created by the command above):

    [hduser@localhost ~]$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    
  5. [hduser@localhost ~]$ chmod -R 750 ~/.ssh/authorized_keys

  6. [hduser@localhost ~]$ ssh localhost

    The authenticity of host 'localhost (127.0.0.1)' can't be established. RSA key fingerprint is 04:e8:80:64:dc:71:b5:2f:c0:d9:28:86:1f:61:60:8a. Are you sure you want to continue connecting (yes/no)? yes

    Warning: Permanently added 'localhost' (RSA) to the list of known hosts. Last login: Mon Jan 4 14:31:05 2016 from localhost.localdomain

  7. [hduser@localhost ~]$ jps
    18531 Jps

  8. [hduser@localhost ~]$ start-all.sh

  9. All daemons start

Note: Sometimes other problems occur due to the log files; in that case, remove only the dot-out (.out) files from /usr/local/hadoop/logs/.
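One caveat beyond what this answer says (an assumption on my part, not part of the original): sshd's StrictModes setting rejects keys whose files are too permissive, so the conventional permissions are stricter than the chmod 750 used above:

    [hduser@localhost ~]$ chmod 700 ~/.ssh
    [hduser@localhost ~]$ chmod 600 ~/.ssh/authorized_keys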

3

I ran into the same problem. As Amar said, if you are running with sudo, Hadoop will ask for the root password. If you don't have a root password, you can set one up using

 sudo passwd
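Roughly what that exchange looks like (the exact wording varies by distribution):

 $ sudo passwd
 Enter new UNIX password:
 Retype new UNIX password:
 passwd: password updated successfully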

The URL below gives you more detail about user management:

https://help.ubuntu.com/12.04/serverguide/user-management.html

0

Create and Setup SSH Certificates

Hadoop requires SSH access to manage its nodes, i.e. remote machines plus our local machine. For our single-node setup of Hadoop, we therefore need to configure SSH access to localhost.

So we need to have SSH up and running on our machine, configured to allow SSH public-key authentication.

Hadoop uses SSH (to access its nodes), which would normally require the user to enter a password. However, this requirement can be eliminated by creating and setting up SSH certificates using the following commands. If asked for a filename, just leave it blank and press the Enter key to continue.
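The commands are presumably the usual key-setup pair, the same as in the top answer (a sketch assuming OpenSSH):

$ ssh-keygen -t rsa -P ""
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys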

check this site

0

It seems you have logged in as root and are invoking start-all.sh.

Instead, log in as the owner of the directory $SPARK_HOME and invoke Spark's start-all.sh.

(or)

Let the user hadoop be the owner of the directory $SPARK_HOME; if you are currently logged in as root, the command would be as follows:

sudo -u hadoop start-all.sh

Assumptions:
a) PATH has a reference to the directory $SPARK_HOME/sbin
b) Certificate-based authentication is configured for the user hadoop
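An equivalent form using su instead of sudo, under the same assumptions (su's -c takes a command string, and the leading dash gives a login shell so hadoop's own PATH is used):

su - hadoop -c start-all.sh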

-1

Log in as the superuser (root):

:~ su

Password:

Give ownership to your login user:

:~ sudo chown -R <login user> /usr/local/hadoop/

For your example: login user = munichong, HADOOP_HOME = /usr/local/hadoop/
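To confirm the ownership change took effect:

:~ ls -ld /usr/local/hadoop/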
