
I am very new to Hadoop and am trying to set up a pseudo-distributed mode execution with Hadoop 3.1.2. When I try to start the YARN service I get the following error; please see the snippet below.

$ sbin/start-yarn.sh 
Starting resourcemanagers on []
localhost: ERROR: Cannot set priority of resourcemanager process 13209
pdsh@manager-4: localhost: ssh exited with exit code 1
Starting nodemanagers
localhost: ERROR: Cannot set priority of nodemanager process 13366
pdsh@manager-4: localhost: ssh exited with exit code 1

I tried the solutions at this Stack Overflow question, which is very similar to my problem, but none of them worked. The same problem is also posted in another forum here, but no solution is available there either.

Then I tried another approach, described below: I set the following exports in the file sbin/start-yarn.sh.

export HDFS_NAMENODE_USER="root"
export HDFS_DATANODE_USER="root"
export HDFS_SECONDARYNAMENODE_USER="root"
export YARN_RESOURCEMANAGER_USER="root"
export YARN_NODEMANAGER_USER="root"
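(For reference, the same variables can equivalently be exported in the shell session that invokes the script, rather than edited into it; a minimal sketch with the values used above:)

```shell
# Export the run-as users for the Hadoop helper scripts in the current shell,
# so they are inherited by start-dfs.sh / start-yarn.sh.
export HDFS_NAMENODE_USER="root"
export HDFS_DATANODE_USER="root"
export HDFS_SECONDARYNAMENODE_USER="root"
export YARN_RESOURCEMANAGER_USER="root"
export YARN_NODEMANAGER_USER="root"
# Quick sanity check that the variables are visible to child processes
env | grep -E '^(HDFS|YARN)_.*_USER='
```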

Then I executed sbin/start-yarn.sh and got the following error. Please note that I have already set up passwordless ssh for root@localhost.

$ sudo sbin/start-yarn.sh
Starting resourcemanagers on []
localhost: Permission denied (publickey).
pdsh@manager-4: localhost: ssh exited with exit code 255
Starting nodemanagers
localhost: Permission denied (publickey).
pdsh@manager-4: localhost: ssh exited with exit code 255
– PHcoDer
  • Are you running the script as a non-root user? The prompt character in your first command does not look right. Run ```id -a``` to confirm. – James Li Sep 18 '19 at 03:27
  • @bigdataolddriver In the first case I am not. In the second case (i.e. after exporting the users as root) I am running with sudo. Please see the updated execution command in the question. – PHcoDer Sep 18 '19 at 03:30
  • @bigdataolddriver Thanks. When I tried 'sudo -i -u root sbin/start-yarn.sh' I got the error '-bash: sbin/start-yarn.sh: No such file or directory'. Then I switched user with 'sudo su' and executed 'sbin/start-yarn.sh', and got the same 'ssh exited with exit code 255' error. – PHcoDer Sep 18 '19 at 03:48
  • @bigdataolddriver I tried 'sudo -i -u root /home/uname/Hadoop/hadoop-3.1.2/sbin/start-yarn.sh' as well and got the same exit code 255 error. – PHcoDer Sep 18 '19 at 03:55
  • Sorry, I did not consider the side effect of the $HOME directory switching, and I am not sure about the ssh auth priority under sudo. If you still want to run the command as non-root, could you try ```bash -x sbin/start-yarn.sh``` and update with the output? It would help me understand why the error is thrown. – James Li Sep 18 '19 at 04:21
  • @bigdataolddriver The output of bash -x is very large and cannot be posted in a comment. Shall I email you? – PHcoDer Sep 18 '19 at 04:25
  • @bigdataolddriver Sent. – PHcoDer Sep 18 '19 at 04:38

6 Answers


I had the same issue; what helped me was the guide I found at this link.

The message "Cannot set priority of resourcemanager process" is misleading. I checked the ResourceManager logs and found the following error:

Unexpected close tag </property>; expected </configuration>
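That message means one of the Hadoop XML config files is malformed. As an illustration (this is a hypothetical yarn-site.xml; the property shown is just an example), a stray extra closing tag produces exactly this parser error:

```xml
<!-- Broken: an extra </property> appears where the parser expects
     </configuration>. Removing the stray line fixes the error. -->
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  </property>   <!-- stray closing tag: delete this line -->
</configuration>
```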
– Matt Najarian

Before running the start-yarn.sh script, try the command: ssh localhost
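If that prompts for a password or is refused, set up key-based authentication for the current user first. A minimal sketch (assumes a stock OpenSSH setup; adjust the key type/paths to taste):

```shell
# Create a key pair if none exists (no passphrase), then authorize it
# for logins to this same machine.
mkdir -p ~/.ssh
if [ ! -f ~/.ssh/id_rsa ]; then
  ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa
fi
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
# Now this should succeed without a password prompt:
# ssh localhost exit
```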

– qitian

Once you have set up passwordless ssh for localhost, change the PDSH_RCMD_TYPE value to ssh:

export PDSH_RCMD_TYPE=ssh
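To make this survive new shell sessions, the export can also go in your shell profile (a sketch, assuming a bash login environment; hadoop-env.sh works as well):

```shell
# pdsh defaults to rsh; force it to use ssh for the Hadoop helper scripts.
# Persist the setting for future shells, avoiding duplicate lines:
grep -qx 'export PDSH_RCMD_TYPE=ssh' ~/.bashrc 2>/dev/null \
  || echo 'export PDSH_RCMD_TYPE=ssh' >> ~/.bashrc
# Apply to the current shell as well
export PDSH_RCMD_TYPE=ssh
echo "PDSH_RCMD_TYPE=$PDSH_RCMD_TYPE"
```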
– ephraimbuddy

This error message confused me too. I later found it happened because I had not configured cgroups correctly. So first check your configuration and make sure it is all right; you can check your ResourceManager logs.
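One way to pull the real exception out of the ResourceManager log (a sketch; it assumes HADOOP_HOME points at your install and that logs go to the default $HADOOP_HOME/logs directory):

```shell
# The console message is generic; the real stack trace is in the
# ResourceManager log under $HADOOP_HOME/logs.
for f in "${HADOOP_HOME:-/opt/hadoop}"/logs/*resourcemanager*.log; do
  [ -e "$f" ] || continue            # glob did not match: no log yet
  echo "== $f =="
  grep -iE 'error|exception' "$f" | tail -n 20
done
echo "log scan finished"
```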

– zhao yufei

In addition to the steps suggested by zhao, ephraimbuddy, and qitian: if you have a firewall running, make sure it is not blocking anything, and make sure the user executing the command has sufficient permissions to update process priorities.
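A quick way to check for an active firewall (a sketch; service names vary by distribution, and the port list below reflects YARN's defaults):

```shell
# Check whether a common firewall service is active (ufw on Ubuntu,
# firewalld on RHEL/CentOS; adjust for your system).
fw_state=$(systemctl is-active ufw firewalld 2>/dev/null || true)
echo "firewall services: ${fw_state:-unknown/none}"
# If a firewall is on, default YARN ports worth allowing include
# 8088 (RM web UI), 8030-8033 (RM RPC), and 8042 (NM web UI).
```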

– Chor Sipahi

I had the same issue and was finally able to solve it, getting ResourceManager and NodeManager to run. If you're running Hadoop 3.3 and up, the issue might be with the Java version you're using. hadoop_compatibility

" Apache Hadoop 3.3 and upper supports Java 8 and Java 11 (runtime only) Please compile Hadoop with Java 8. Compiling Hadoop with Java 11 is not supported"

Solution:

  • Try switching to java 8.
  • Then make sure your JAVA_HOME path variables are pointing to java 8 (including any JAVA_HOME path variables in hadoop-env.sh).
  • If the issue persists, check the error messages in resourcemanager log located in $HADOOP_HOME/logs/.
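The checks above can be sketched as follows (the JAVA_HOME path shown is only an example and depends on your distribution):

```shell
# Show which Java the shell resolves; for Hadoop 3.3+ the runtime must be
# Java 8 or 11, and Hadoop itself should be compiled with Java 8.
java -version 2>&1 | head -n 1
# In $HADOOP_HOME/etc/hadoop/hadoop-env.sh, point JAVA_HOME at a Java 8
# install, e.g. (example path, adjust to your system):
#   export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
echo "JAVA_HOME=${JAVA_HOME:-<not set>}"
```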
– Sammy Oge