Hi, when I try to put a file into HDFS I get an error saying the NameNode is in safe mode. I executed the command ./bin/hdfs dfsadmin -safemode leave and got the message "Safe mode is OFF", but the problem remains the same when putting a file into HDFS. Can anyone help me fix this issue?
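As context for the question, a minimal sketch of checking safe-mode status without forcing it off (assumes a single-node cluster with the hdfs binary on PATH; the `command -v` guard makes the snippet a harmless no-op on machines without Hadoop):

```shell
# Check safe-mode status, assuming hdfs is on PATH (guarded otherwise).
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfsadmin -safemode get    # prints "Safe mode is ON" or "Safe mode is OFF"
  hdfs dfsadmin -safemode wait   # blocks until the NameNode leaves safe mode on its own
else
  echo "hdfs not found on PATH"
fi
```

`-safemode wait` is usually preferable to `-safemode leave`, because the NameNode normally exits safe mode by itself once enough DataNode block reports have arrived.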
- Make sure that at least one active DataNode is running. – Kumar Aug 03 '15 at 11:14
- The DataNode is running at 127.0.0.1:50010. – Thulasi Accottillam Aug 03 '15 at 11:18
- Ensure all nodes are running. Have you checked the logs? – Kumar Aug 03 '15 at 11:31
- Provide more details about your cluster configuration: how many nodes do you have? – Mikhail Golubtsov Aug 03 '15 at 11:35
- I am running on a single system and am very new to Hadoop. Where can I find the log files? – Thulasi Accottillam Aug 03 '15 at 12:04
- See this page: http://stackoverflow.com/questions/13729510/safemodeexception-name-node-is-in-safe-mode – Gaurav Dave Aug 03 '15 at 12:23
- I already read that link, Mr. Gaurav Dave. From there I got the command to quit safe mode, but I didn't understand how to configure it in hdfs-site.xml. – Thulasi Accottillam Aug 03 '15 at 13:11
- Thank you all for the help. It was probably because of the time the DataNode needs to start up. Now it's working fine :) – Thulasi Accottillam Aug 03 '15 at 15:02
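The checks suggested in the comments above can be sketched as follows (assumes a single-node cluster; `jps` ships with the JDK, and each command is guarded so it only runs where the tool exists):

```shell
# Verify that the daemons are actually up before blaming safe mode.
if command -v jps >/dev/null 2>&1; then
  jps                      # NameNode and DataNode should both appear in the list
else
  echo "jps not found on PATH"
fi
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfsadmin -report    # lists live/dead DataNodes and their capacity
else
  echo "hdfs not found on PATH"
fi
```

If `hdfs dfsadmin -report` shows zero live DataNodes, the NameNode will stay in safe mode no matter how often it is told to leave.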
1 Answer
A month ago I ran into the same problem, and I solved it with the help of the logs: in my case the disk did not have enough free space, which kept the NameNode in safe mode. So for your problem, please check your logs; you can find useful information there.
Like this:
/root/ec-perf-test/hadoop/logs
hadoop-root-namenode-node10.log
If you still have a problem after checking your logs, please feel free to contact me.
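A sketch of the checks this answer describes (the log path below is the answer's own example; on your machine the logs are normally under $HADOOP_HOME/logs, and the commands are guarded so they run anywhere):

```shell
# Log path taken from the answer's example; adjust to your installation.
LOG=/root/ec-perf-test/hadoop/logs/hadoop-root-namenode-node10.log
if [ -f "$LOG" ]; then
  grep -iE "safe ?mode|disk" "$LOG" | tail -n 20   # recent safe-mode / disk messages
else
  echo "log not found at $LOG"
fi
df -h .    # disk capacity was the root cause in the answer's case
```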
