
I have created a file with the results of a sequence of map-reduce jobs. The program I've made iteratively outputs some results, and I want to append that data to the result file using the Java API. I have tried fs.append but it doesn't work. For the time being I am using the built-in libraries of Java (Eclipse 4.2.2), and once I'm done debugging I'll package it as a jar and run it on a cluster.

First of all, is "append" supported in HDFS? And if so, can anyone tell me how it's done? Thanks in advance.

The code I am using to do this job is the following:

try {
    Path pt = new Path("/home/results.txt");
    FileSystem fs = FileSystem.get(new Configuration());
    // fs.append throws an exception if the cluster does not support appends
    BufferedWriter br = new BufferedWriter(new OutputStreamWriter(fs.append(pt)));
    String line = "something";
    br.write(line);
    br.close();
} catch (Exception e) {
    e.printStackTrace(); // printing "File not found" here hid the real cause
}
kostas

1 Answer


Early versions of HDFS had no support for an append operation. Once a file was closed, it was immutable and could only be changed by writing a new copy with a different filename.

See more information here.

If you are using an old version without append support, the following works for me:

// Read the whole file into memory first, THEN recreate it;
// calling hdfs.create(path, true) while still reading the same
// path is fragile, since the overwrite deletes the old file.
List<String> lines = new ArrayList<>();
BufferedReader bfr = new BufferedReader(new InputStreamReader(hdfs.open(path)));
String str;
while ((str = bfr.readLine()) != null) {
    lines.add(str);
}
bfr.close();

BufferedWriter br = new BufferedWriter(new OutputStreamWriter(hdfs.create(path, true))); // overwrite
for (String s : lines) {
    br.write(s);   // write the old content back
    br.newLine();
}
br.write("Hello"); // the "appended" line
br.newLine();
br.close();
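The same read-then-rewrite pattern can be sketched against the local filesystem with plain Java (the file name and contents here are made up for illustration); on HDFS the streams would come from hdfs.open and hdfs.create instead:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;

public class RewriteAppend {
    public static void main(String[] args) throws IOException {
        Path path = Paths.get("data.txt");
        Files.write(path, Arrays.asList("line1", "line2")); // pre-existing content

        // Read the existing content fully into memory first...
        List<String> existing = Files.readAllLines(path);

        // ...then recreate the file (newBufferedWriter truncates by default)
        // and write the old content back followed by the new line.
        try (BufferedWriter br = Files.newBufferedWriter(path)) {
            for (String str : existing) {
                br.write(str);
                br.newLine();
            }
            br.write("Hello"); // the "appended" line
            br.newLine();
        }
    }
}
```

Buffering the old lines before recreating the file is the key design choice: it avoids reading from a stream whose backing file has already been overwritten.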
Rishi Dwivedi
  • Thanks! I've seen that. What I did to fix my problem was to delete and re-write the file, along with anything I want to append, every time. That's OK for now, but I'm still interested in using fs.append. Thanks anyway! – kostas Jun 04 '14 at 17:25