
My /hdfs_data/dfs/dn/current/ directory is full, which is putting my machine in an unhealthy state. I am using Cloudera version 5.5.2. How should I clean it?

Thanks.

akaliza
  • try hdfs dfs -rmr – urug Mar 19 '16 at 14:40
  • On this data node I do not have the Hadoop native library, so the command hdfs dfs -rmr is not working – akaliza Mar 19 '16 at 16:20
  • My question is how I can clean data from a datanode. Currently my data node is dead and is preventing Cloudera from starting. It is on File Systems, disk = "/dev/sdb" and Mount Point = "/hdfs_data" – akaliza Mar 20 '16 at 13:24
  • refer to this to decommission a datanode using Cloudera Manager: http://www.cloudera.com/documentation/enterprise/5-2-x/topics/cm_mc_decomm_host.html – urug Mar 20 '16 at 14:46
  • @urug how is Decommissioning and Recommissioning Hosts going to help solve my problem of a full disk? – akaliza Mar 20 '16 at 17:09
  • please clarify what you mean by a dead data node. Also, what do you mean by "I don't have the native Hadoop library"? What kind of cluster is this? Please describe the cluster and the specific problem – urug Mar 20 '16 at 22:07
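The deletion route suggested in the comments frees space through HDFS itself rather than on the local disk. A minimal sketch, with a few caveats: -rmr is the deprecated spelling of -rm -r, -skipTrash is needed for the blocks to be released immediately, the hdfs client must be on the PATH, and the directory deleted below is purely illustrative. The commands are echoed so the sketch is safe to run anywhere; drop the leading `echo` to execute them against a real cluster.

```shell
# Find the largest directories in HDFS, human-readable sizes:
echo hdfs dfs -du -h /

# Delete an unneeded directory; -skipTrash frees the blocks immediately
# instead of moving them to the per-user .Trash directory:
echo hdfs dfs -rm -r -skipTrash /tmp/old_job_output

# Confirm the datanodes report the reclaimed space:
echo hdfs dfsadmin -report
```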

1 Answer


I've had this problem. If you don't care about the data on HDFS, then you can simply rm -R /dfs/dn/current on every datanode of your cluster and then run hdfs namenode -format on the namenode; this will free up plenty of disk space.
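The steps above can be sketched as follows. This is a hedged sketch, not a definitive procedure: it assumes dfs.datanode.data.dir is /hdfs_data/dfs/dn (as in the question) and that service names follow the usual CDH packaging. WARNING: it destroys ALL data in HDFS, which is why it defaults to a dry run that only prints the commands.

```shell
# Assumption: block storage lives under /hdfs_data/dfs/dn, per the question.
DN_DATA_DIR="/hdfs_data/dfs/dn"
DRY_RUN=1   # set to 0 to actually run the destructive commands

run() {
    if [ "$DRY_RUN" -eq 1 ]; then
        echo "DRY RUN: $*"
    else
        "$@"
    fi
}

# 1. Stop HDFS first (via Cloudera Manager, or per node):
run sudo service hadoop-hdfs-datanode stop

# 2. On EVERY datanode, remove the block storage:
run sudo rm -rf "$DN_DATA_DIR/current"

# 3. On the namenode, reformat the filesystem metadata:
run sudo -u hdfs hdfs namenode -format
```

Keeping DRY_RUN=1 lets you review exactly what would be executed on each node before committing to the wipe.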

Also, take a look here.

pavel_orekhov