I have tried different ways to create a large Hadoop SequenceFile with just one short (<100 bytes) key and one large (>1 GB) value (BytesWritable).
The sample I started from works out of the box; it writes multiple random-length keys and values with a total size of more than 3 GB.
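(The sample code itself is not reproduced here; roughly, it appends many modest-sized records in a loop, along the lines of the sketch below, where writer and randomizeBytes stand in for the sample's own setup and helper.)

// Sketch of the pattern in that sample, not the actual example code:
// keep appending random-length key/value pairs until the total passes ~3 GB.
long targetBytes = 3L * 1024 * 1024 * 1024;   // placeholder target size
long written = 0;
Random random = new Random();
BytesWritable key = new BytesWritable();
BytesWritable value = new BytesWritable();
while (written < targetBytes) {
    key.setSize(10 + random.nextInt(90));               // short random-length key
    value.setSize(1024 * (1 + random.nextInt(1024)));   // value of up to ~1 MB
    randomizeBytes(key.getBytes(), 0, key.getLength());
    randomizeBytes(value.getBytes(), 0, value.getLength());
    writer.append(key, value);
    written += key.getLength() + value.getLength();
}
writer.close();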
However, that is not what I am trying to do, so I modified it using the Hadoop 2.2.0 API to something like this:
// conf, fileSizeInMB and randomizeBytes(...) are defined elsewhere in my code.
Path file = new Path("/input");
SequenceFile.Writer writer = SequenceFile.createWriter(conf,
        SequenceFile.Writer.file(file),
        SequenceFile.Writer.compression(CompressionType.NONE),
        SequenceFile.Writer.keyClass(BytesWritable.class),
        SequenceFile.Writer.valueClass(BytesWritable.class));

int numBytesToWrite = fileSizeInMB * 1024 * 1024;
BytesWritable randomKey = new BytesWritable();
BytesWritable randomValue = new BytesWritable();
randomKey.setSize(1);
randomValue.setSize(numBytesToWrite);   // this is where the exception below is thrown
randomizeBytes(randomValue.getBytes(), 0, randomValue.getLength());
writer.append(randomKey, randomValue);
writer.close();
When fileSizeInMB > 700, I get errors like:
java.lang.NegativeArraySizeException
at org.apache.hadoop.io.BytesWritable.setCapacity(BytesWritable.java:144)
at org.apache.hadoop.io.BytesWritable.setSize(BytesWritable.java:123)
...
I have seen this error discussed, but I have not seen any resolution. Note that a Java int (2^31 - 1) can address roughly 2 GB, so it should not fail at 700 MB.
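Looking at the stack trace, my guess is that setSize grows the backing array with something like size * 3 / 2, in which case the intermediate size * 3 would overflow int once size exceeds Integer.MAX_VALUE / 3 (roughly 683 MB), which would match the ~700 MB threshold, but I am not sure that is what is happening. A rough arithmetic check of that suspicion:

// Rough check of the suspected intermediate overflow (assumption based on
// the stack trace: setSize grows capacity via size * 3 / 2).
int size = 700 * 1024 * 1024;   // 734,003,200 bytes
int grown = size * 3 / 2;       // size * 3 = 2,202,009,600, which overflows int
System.out.println(grown);      // prints -1046478848; new byte[grown] would
                                // throw NegativeArraySizeException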
If you know of another way to create such a large-value SequenceFile, please advise. I tried other approaches, such as IOUtils.read from an InputStream into a byte[], but I ran into heap-size limits or OutOfMemoryError.
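For reference, the IOUtils attempt looked roughly like the sketch below (the input path is a placeholder, and I am using Hadoop's IOUtils.readFully here); presumably the OutOfMemoryError comes from having to hold the whole value in a single byte[] on the heap.

// Sketch of the IOUtils-based attempt (placeholder path); the entire value
// must fit into one byte[] on the heap, so this needs -Xmx well above the
// value size.
FileSystem fs = FileSystem.get(conf);
byte[] buffer = new byte[fileSizeInMB * 1024 * 1024];
FSDataInputStream in = fs.open(new Path("/tmp/large-value-source"));
try {
    IOUtils.readFully(in, buffer, 0, buffer.length);
} finally {
    in.close();
}
writer.append(new BytesWritable(new byte[]{0}), new BytesWritable(buffer));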