I can run this command for HDFS:
hadoop fs -ls /user/hive/warehouse/databasename.db/tablename
How do I write a command in Spark to list all files under a specific folder in HDFS?
Thanks.
OK, the Scala code below gives you a function that prints all the HDFS files under a parent path. You can improve it according to your needs.
import org.apache.hadoop.fs.{FileSystem, Path}

def getAllPaths(parentPath: String, fs: FileSystem): Unit = {
  // List the immediate children of the parent path and print each one
  val fileStatus = fs.listStatus(new Path(parentPath))
  for (file <- fileStatus) {
    println(file.getPath.toString)
  }
}
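For completeness, here is a minimal sketch of how you might call this from a Spark application, assuming an existing SparkSession named spark (the variable names are illustrative):

import org.apache.hadoop.fs.FileSystem

// Obtain a FileSystem handle from Spark's Hadoop configuration
// (assumes an existing SparkSession named `spark`)
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)

// Print every file and directory directly under the table folder
getAllPaths("/user/hive/warehouse/databasename.db/tablename", fs)

Note that listStatus only returns the direct children of the path; if you need to walk the directory tree recursively, you can call the function again for entries where file.isDirectory is true.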