  1. In Hadoop 2.6.0 I ran the command:

    ./bin/hadoop fs -rmr /output
    

    Then I got:

    rmr: Cannot delete /output. Name node is in safe mode.
    
  2. I used the command:

    hdfs dfsadmin -safemode leave
    

    to resolve this problem, and got the result:

    Safe mode is OFF
    
  3. But when I tried to delete /output again, I got the same error:

    rmr: Cannot delete /output. Name node is in safe mode.
    
  4. I see the following in the logs:

    2015-04-07 03:10:18,362 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe mode is ON. Resources are low on NN.

    Please add or free up more resources then turn off safe mode manually.

    NOTE: If you turn off safe mode before adding resources, the NN will immediately return to safe mode.

    Use "hdfs dfsadmin -safemode leave" to turn safe mode off.

How can I handle this problem?
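The log message points at the actual cause: the NameNode keeps re-entering safe mode because local disk resources are low, so `-safemode leave` cannot stick until space is freed first. A hedged sketch of the usual sequence (the paths to clean are examples for a stock single-node install, not from the question; verify them against your own layout before deleting anything):

```shell
# 1. Check free space on the volume holding the NameNode's storage
#    (the directories listed under dfs.namenode.name.dir in hdfs-site.xml)
df -h /

# 2. Free some space first -- e.g. local temp files and rotated logs
#    (example paths; confirm they are safe to remove in your setup)
rm -rf /tmp/hadoop-*
rm -f "$HADOOP_HOME"/logs/*.log.*

# 3. Only then take the NameNode out of safe mode and retry the delete
hdfs dfsadmin -safemode get     # check current state
hdfs dfsadmin -safemode leave
hdfs dfs -rm -r /output         # -rm -r replaces the deprecated -rmr
```

Note that by default the NameNode wants roughly 100 MB free on its local volume (`dfs.namenode.resource.du.reserved`), so with only a few tens of MB available it will immediately return to safe mode, exactly as the log warns.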

  • If you are using a single-node cluster and get this error, check whether the `datanode` is online and how much free space you have by typing `df -h` in a terminal. If you get the error in a multi-node cluster, you need to check that all the datanodes are online (and free disk space if necessary). – Rajesh N Apr 07 '15 at 11:05
  • This may be due to low disk space, or else a problem with the datanode. – Kumar Apr 07 '15 at 11:59
  • I am using a single-node cluster, and the `datanode` is online. I think it's because of low disk space: only 42M available. How can I **free the space**? PS: after I use the command `hdfs dfsadmin -safemode leave`, safe mode is still **on**. – Rick Vencent Apr 07 '15 at 12:23
  • Delete the contents in `/tmp` folder. It may free up some space. – Rajesh N Apr 07 '15 at 14:50
  • I removed all of the data and name files, so all information in the project disappeared, and I reformatted the namenode. – Rick Vencent Apr 08 '15 at 14:18
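The `df -h` check suggested in the comments can also be scripted, which makes it easy to see whether free space has dropped below the NameNode's threshold. A minimal sketch using Python's standard `shutil.disk_usage` (the path `"/"` is an example; point it at the volume holding your `dfs.namenode.name.dir`):

```python
import shutil

# Inspect the volume that holds the NameNode's local storage
# (example path; substitute the mount point from your hdfs-site.xml)
usage = shutil.disk_usage("/")
free_gb = usage.free / 1e9
pct_free = usage.free / usage.total * 100
print(f"free: {free_gb:.1f} GB ({pct_free:.1f}% of volume)")

# The NameNode's default low-resource threshold is 100 MB
# (dfs.namenode.resource.du.reserved); below that it re-enters safe mode.
low_on_space = usage.free < 100 * 1024 * 1024
print("NameNode would consider this volume low:", low_on_space)
```

If `low_on_space` is true, freeing files on that volume has to come before `hdfs dfsadmin -safemode leave`, otherwise the NameNode flips straight back into safe mode.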

0 Answers