Hadoop SequenceFile is behaving strangely for me. I pack images into a sequence file and cannot recover them intact. I ran a simple test and found that the MD5 checksum of the bytes is not the same before and after the round trip through the sequence file, even though the byte counts match.
Configuration confHadoop = new Configuration();
FileSystem fs = FileSystem.get(confHadoop);
String fileName = args[0];
Path file = new Path(fs.getUri().toString() + "/" + fileName);
Path seqFile = new Path("/temp.seq");
SequenceFile.Writer writer = null;
FSDataInputStream in = null;
try {
    writer = SequenceFile.createWriter(confHadoop, Writer.file(seqFile),
            Writer.keyClass(Text.class), Writer.valueClass(BytesWritable.class));
    in = fs.open(file);
    byte[] buffer = IOUtils.toByteArray(in);
    System.out.println("original size ----> " + String.valueOf(buffer.length));
    writer.append(new Text(fileName), new BytesWritable(buffer));
    System.out.println(calculateMd5(buffer));
    writer.close();
} finally {
    IOUtils.closeQuietly(in);
}
SequenceFile.Reader reader = new SequenceFile.Reader(confHadoop, Reader.file(seqFile));
Text key = new Text();
BytesWritable val = new BytesWritable();
while (reader.next(key, val)) {
    System.out.println("size get from sequence file --->" + String.valueOf(val.getLength()));
    String md5 = calculateMd5(val.getBytes());
    Path readSeq = new Path("/write back.png");
    FSDataOutputStream out = fs.create(readSeq);
    out.write(val.getBytes()); // YES! GOT THE ORIGINAL IMAGE
    out.close();
    System.out.println(md5);
    .............
}
The output shows I get the same number of bytes back, and after writing the image to disk I can confirm it is the original image. So why are the MD5 values different? What am I doing wrong here?
14/04/22 16:21:35 INFO compress.CodecPool: Got brand-new compressor [.deflate]
original size ----> 485709
c413e36fd864b27d4c8927956298edbb
14/04/22 16:21:35 INFO compress.CodecPool: Got brand-new decompressor [.deflate]
size get from sequence file --->485709
322cce20b732126bcb8876c4fcd925cb
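My current suspicion (just a guess, not verified against the Hadoop internals): the Javadoc for BytesWritable.getBytes() warns that the returned backing array may be larger than the valid data, whose length is given by getLength(). If reader.next() deserializes into an over-allocated internal buffer, then calculateMd5(val.getBytes()) would hash trailing padding bytes that are not part of the image. Here is a plain-JDK sketch of that mechanism, with no Hadoop dependency, where paddedBuffer stands in for the Writable's internal array:

```java
import java.security.MessageDigest;
import java.util.Arrays;

public class PaddedDigestDemo {
    // Hex-encoded MD5 of the given bytes.
    static String md5Hex(byte[] data) throws Exception {
        byte[] digest = MessageDigest.getInstance("MD5").digest(data);
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the image payload.
        byte[] payload = "some image bytes".getBytes("UTF-8");

        // Simulate a backing buffer that is larger than the logical length,
        // zero-padded at the end, like an over-allocated internal array.
        byte[] paddedBuffer = Arrays.copyOf(payload, payload.length + 8);

        // Hashing the whole backing array includes the padding...
        System.out.println(md5Hex(payload).equals(md5Hex(paddedBuffer)));   // false

        // ...while hashing only the first payload.length bytes matches.
        byte[] trimmed = Arrays.copyOfRange(paddedBuffer, 0, payload.length);
        System.out.println(md5Hex(payload).equals(md5Hex(trimmed)));        // true
    }
}
```

If this is the cause, hashing only the valid range, e.g. Arrays.copyOfRange(val.getBytes(), 0, val.getLength()) (or val.copyBytes(), if that method is available in this Hadoop version), should reproduce the original MD5. The write-back to disk would still look correct if the extra padding is silently ignored by the image viewer or trimmed elsewhere.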