I'm having trouble loading data from several files into different HashMaps. I use a separate class that handles decoding a file and transferring its contents into a HashMap. My problem is that I load one file into a specific HashMap (say, map x), but when I then load different data into another map (say, map y), map x gets overwritten with the entries that y is supposed to have. Looking closer, I found that keys and values from the different files were being mixed together, and some entries were lost because a HashMap doesn't allow duplicate keys. So my end result for x and y is a mashup of data from both files in a single map. I don't know how to fix this.
Here is what I have so far:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class CodeFileProcessor {

    private static Map<String, String> codeMap = new HashMap<String, String>();

    public static Map<String, String> readCodeFile(String fileName) throws IOException {
        codeMap.clear();
        BufferedReader br = new BufferedReader(new FileReader(fileName));
        String nextLine = br.readLine();
        while (nextLine != null) {
            String[] parts = nextLine.split(",");
            codeMap.put(parts[0], parts[1]);
            nextLine = br.readLine();
        }
        br.close();
        return codeMap;
    }
}
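From what I can tell, the cause is that `readCodeFile` always returns the same `static` map object: every caller gets a reference to the one shared `codeMap`, so `codeMap.clear()` on the next call wipes out the data the previous caller is still holding, and any entries that survive get mixed together. A sketch of a fix is to build and return a fresh `HashMap` on each call so the returned maps are independent (the demo files in `main` are made up just for illustration):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class CodeFileProcessor {

    // No shared static field: each call builds and returns its own map,
    // so a map returned earlier is never cleared or overwritten later.
    public static Map<String, String> readCodeFile(String fileName) throws IOException {
        Map<String, String> codeMap = new HashMap<>();
        // try-with-resources closes the reader even if an exception is thrown
        try (BufferedReader br = new BufferedReader(new FileReader(fileName))) {
            String nextLine;
            while ((nextLine = br.readLine()) != null) {
                // limit of 2 keeps any commas inside the value intact
                String[] parts = nextLine.split(",", 2);
                if (parts.length == 2) {
                    codeMap.put(parts[0].trim(), parts[1].trim());
                }
            }
        }
        return codeMap;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical input files, created here only to demonstrate the fix
        Path fileX = Files.createTempFile("codes-x", ".csv");
        Path fileY = Files.createTempFile("codes-y", ".csv");
        Files.write(fileX, Arrays.asList("A,apple", "B,banana"));
        Files.write(fileY, Arrays.asList("A,avocado", "C,cherry"));

        Map<String, String> x = readCodeFile(fileX.toString());
        Map<String, String> y = readCodeFile(fileY.toString());

        // x and y are now distinct objects; loading y did not touch x
        System.out.println(x.get("A")); // apple
        System.out.println(y.get("A")); // avocado
    }
}
```

If the map really must live in the class, the alternative is to keep one `Map<String, Map<String, String>>` keyed by file name, but returning a new map per call is the simpler change.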