I need to put about 20 million entries into a map. Following the answers to "Why is Java HashMap slowing down?", I chose Trove's TLongObjectHashMap.
The code looks like:
StringBuilder sb = new StringBuilder();
StringBuilder value = new StringBuilder();
TLongObjectHashMap<String> map = new TLongObjectHashMap<String>();
FileInputStream in = new FileInputStream(new File(inputFile));
BufferedReader br = new BufferedReader(new InputStreamReader(in), 102400);
for (String inLine; (inLine = br.readLine()) != null;) {
    // build the key from the line's characters
    sb.setLength(0);
    for (int i = 0; i < 2; i++) {
        for (int j = 1; j < 12; j++) {
            sb.append(inLine.charAt(j));
        }
    }
    // build the value from two character ranges of the line
    for (int k = 2; k < 4; k++) {
        value.append(inLine.charAt(k));
    }
    for (int k = 7; k < 11; k++) {
        value.append(inLine.charAt(k));
    }
    map.put(Long.parseLong(sb.toString()), value.toString());
    value.setLength(0);
}
Even with GNU Trove, insertion becomes extremely slow and almost grinds to a halt at about 15 million entries. No OutOfMemoryError has been thrown yet. What could the problem be?
Using a database is not an option for me.
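One thing I have not tried is pre-sizing the map. If the slowdown comes from repeated rehashing as the table grows, constructing the map with an initial capacity should avoid that. A minimal sketch, where 20_000_000 is just my expected entry count and 0.5f is an arbitrary load factor, not a recommendation:

import gnu.trove.map.hash.TLongObjectHashMap; // Trove 3.x; Trove 2.x uses gnu.trove.TLongObjectHashMap

// Pre-size so the table never has to grow and rehash while loading.
TLongObjectHashMap<String> map =
        new TLongObjectHashMap<String>(20_000_000, 0.5f);

If it is instead GC pressure near the heap limit, I suppose raising -Xmx would be the fix, but I would like to understand the actual cause first.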
Note: index values like 1, 12, 2, 4, etc. are calculated before this loop and stored in variables, which are then used here. I have just replaced them with literal values for this example.
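For what it's worth, with those stored offsets the loop body could equivalently use substring instead of char-by-char appends (ignoring the doubled outer loop, whose bounds are also placeholders). The names keyStart etc. are hypothetical, standing in for my precomputed variables:

// Hypothetical names for the offsets computed before the loop.
int keyStart = 1, keyEnd = 12;
int valueStartA = 2, valueEndA = 4;
int valueStartB = 7, valueEndB = 11;

for (String inLine; (inLine = br.readLine()) != null;) {
    long key = Long.parseLong(inLine.substring(keyStart, keyEnd));
    String value = inLine.substring(valueStartA, valueEndA)
                 + inLine.substring(valueStartB, valueEndB);
    map.put(key, value);
}

I doubt the string handling itself is the bottleneck, though.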