
How can I view how many blocks a file has been broken into in a Hadoop file system?


4 Answers


We can use the Hadoop file system check command (fsck) to find the blocks that make up a specific file.

Below is the command:

hadoop fsck [path] [options]

To view the blocks for a specific file:

hadoop fsck /path/to/file -files -blocks
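
For illustration, here is the kind of report this prints for a hypothetical two-block file (the block pool ID, block IDs, and sizes below are made up, and the exact layout varies between Hadoop versions):

/path/to/file 268435456 bytes, 2 block(s):  OK
0. BP-929597290-10.0.0.1-1394748700000:blk_1073741825_1001 len=134217728 repl=3
1. BP-929597290-10.0.0.1-1394748700000:blk_1073741826_1002 len=134217728 repl=3

Status: HEALTHY
 Total size:    268435456 B
 Total blocks (validated):      2 (avg. block size 134217728 B)

The count after the file size ("2 block(s)") answers the question directly, and each numbered line describes one block and its replication factor.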
– Phani

hadoop fsck filetopath

I used the above command in CDH 5 and got the error below:

hadoop-hdfs/bin/hdfs: line 262: exec: : not found

The following command worked instead:

hdfs fsck filetopath

– yoga

It is always a good idea to use hdfs instead of hadoop, since running fsck through the 'hadoop' command is deprecated.

Here is the command with hdfs. To find the details of a file named 'test.txt' in the root, you would write:

hdfs fsck /test.txt -files -blocks -locations
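
Adding -locations also lists the DataNodes holding each replica of every block. A sketch of one such block line, with made-up addresses (newer releases wrap each address in a DatanodeInfoWithStorage[...] entry):

0. BP-929597290-10.0.0.1-1394748700000:blk_1073741825_1001 len=134217728 repl=3 [10.0.0.2:50010, 10.0.0.3:50010, 10.0.0.4:50010]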

– user1795667

This should work:

hadoop fs -stat "%o" /path/to/file
  • `%o` is the block size, not the number of blocks, per http://hadoop.apache.org/docs/r2.7.0/hadoop-project-dist/hadoop-common/FileSystemShell.html#stat – Nickolay May 25 '15 at 02:34
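
That said, the block size from stat can still be combined with the file size to estimate the block count: divide the size by the block size and round up, assuming the whole file was written with a single block size. A sketch (the path is a placeholder, and %b is assumed to print the file length in bytes, as it does in practice despite the documentation's wording):

hdfs dfs -stat "%b %o" /path/to/file | awk '{ print int(($1 + $2 - 1) / $2) }'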