If being human readable is not a requirement, you could (gasp) resort to simply making sure your data implements the Serializable interface and serializing the HashMap with an ObjectOutputStream. It's ugly, but it would get the job done.
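A minimal sketch of that approach, assuming realOutputStream and realInputStream are whatever underlying streams you already have (the same placeholder names used in the snippets below); HashMap, Long and String are all Serializable, so the whole map can go out in one call:

// Write the entire map in one shot
ObjectOutputStream oos = new ObjectOutputStream(realOutputStream);
oos.writeObject(map);
oos.flush();

// Read it back; readObject() can throw ClassNotFoundException as well as IOException
ObjectInputStream ois = new ObjectInputStream(realInputStream);
@SuppressWarnings("unchecked")
Map<Long, String> restored = (Map<Long, String>) ois.readObject();

The cast is unchecked because the stream only knows it holds an Object, which is part of why this approach is ugly.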
Another option would be DataInputStream and DataOutputStream. These allow you to read/write structured binary data.
Let's assume you have a Map<Long, String>; you could write it like this:
// realOutputStream should probably be a BufferedOutputStream
DataOutputStream output = new DataOutputStream(realOutputStream);
for (Map.Entry<Long, String> entry : map.entrySet()) {
    // Write the key
    output.writeLong(entry.getKey().longValue());
    // Writing the string requires writing the length and then the bytes
    byte[] bytes = entry.getValue().getBytes("UTF-8");
    output.writeInt(bytes.length);
    output.write(bytes, 0, bytes.length);
}
// Flush (or close) the stream when you're done so buffered data actually gets written
output.flush();
// realInputStream should probably be a BufferedInputStream
DataInputStream input = new DataInputStream(realInputStream);
Map<Long, String> map = new HashMap<Long, String>();
while (true) {
    try {
        // Read the key
        long key = input.readLong();
        // Read the string length in bytes
        int strlen = input.readInt();
        // Read the bytes into an array
        byte[] buf = new byte[strlen];
        input.readFully(buf, 0, strlen);
        // Create the map entry
        map.put(Long.valueOf(key), new String(buf, "UTF-8"));
    }
    catch (EOFException e) {
        // Input is exhausted
        break;
    }
}
Keep in mind this assumes you want to store and read the strings as UTF-8. You could just as easily omit the character set and use the JVM's default encoding. Also notice that anything with a variable length, like a String, requires you to write the length of that data before the data itself; that's how the reader knows how many bytes it needs to read back in to reconstruct the string.
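If your strings are guaranteed to be short, you could also let writeUTF/readUTF handle the length prefix for you; they use a modified UTF-8 encoding with a two-byte length prefix, which limits each string to 65535 encoded bytes. A sketch of the same loops using those calls:

// Writing: writeUTF prefixes each string with its encoded length automatically
for (Map.Entry<Long, String> entry : map.entrySet()) {
    output.writeLong(entry.getKey().longValue());
    output.writeUTF(entry.getValue());
}

// Reading one entry back
long key = input.readLong();
String value = input.readUTF();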