I am trying to write a file into HDFS using the Scala FileSystem API and I am getting the following error on the client, as well as in the Hadoop logs:
File /user/testuser/test.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
testuser has read, write, and execute permissions. I checked HDFS in Ambari and it is up and running, so I am not sure why I am getting this error.
After googling the error, I have already tried stopping all services, formatting the NameNode, and starting all services again, as suggested in the link below:
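I also tried to rule out a connectivity problem between the client and the datanode, since "0 nodes instead of minReplication" with an excluded datanode often points to the client not being able to reach the datanode's data-transfer port. This is a rough sketch of what I checked (`datanode-hostname` is a placeholder for the actual host; 50010 is the default `dfs.datanode.address` port on Hadoop 2.x, so it may differ on your cluster):

```shell
# List live/dead datanodes and their remaining capacity; a datanode
# that is dead or out of space typically ends up "excluded".
hdfs dfsadmin -report

# Verify the datanode's data-transfer port is reachable from the
# client machine (default 50010 on Hadoop 2.x).
nc -zv datanode-hostname 50010
```

Both looked fine in my case: the report shows one live datanode with free space, and the port is reachable.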
Writing to HDFS could only be replicated to 0 nodes instead of minReplication (=1)
I still get the same error. Any suggestions on what I am doing wrong? I am new to Hadoop, so any help is appreciated.
The following is the Scala code I am using:
import java.io.BufferedOutputStream

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.fs.permission.FsPermission

def write(uri: String, filePath: String, data: Array[Byte]): Unit = {
  System.setProperty("HADOOP_USER_NAME", "usernamehere")
  val path = new Path(filePath)
  val conf = new Configuration()
  conf.set("fs.defaultFS", uri)
  // Make the client connect to datanodes by hostname rather than by the
  // (possibly cluster-internal) IP address the namenode reports.
  conf.set("dfs.client.use.datanode.hostname", "true")
  conf.addResource(new Path("/path/core-site.xml"))
  conf.addResource(new Path("/path/hdfs-site.xml"))
  val fs = FileSystem.get(conf)
  val os = fs.create(path)
  fs.setPermission(path, FsPermission.getDefault)
  val out = new BufferedOutputStream(os)
  println(data.length)
  out.write(data)
  out.flush()
  out.close()
  fs.close()
}
Thanks