I read a dictionary that is usually around 100MB in size (it can grow to at most 500MB). It is a simple dictionary with two columns: the first column holds a word, the second a float value. I read the dictionary file this way:
BufferedReader br = new BufferedReader(new FileReader(file));
String line;
while ((line = br.readLine()) != null) {
    String[] cols = line.split("\t");
    setIt(cols[0], cols[1]);
}
br.close();
and this is the setIt function:
public void setIt(String term, String value) {
all.put(term, new Double(value));
}
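For reference, all is just a plain map from String to Double. One small change I could make without touching the package structure is to pre-size the map and drop the deprecated new Double(String) constructor; a rough sketch (the capacity of 8 million is only a guess for the larger files):

Map<String, Double> all = new HashMap<>(8_000_000); // pre-sizing avoids repeated rehashing while loading

public void setIt(String term, String value) {
    all.put(term, Double.valueOf(value)); // replaces the deprecated new Double(value)
}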
When the file is big, it takes a long time to load and it often runs out of memory. Even with a reasonably sized file (100MB) it needs about 4GB of heap to run.
Any clue how to improve this without changing the structure of the whole package?
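One direction I'm considering (not tried yet, so this is only a sketch, and the class name Dictionary is a placeholder) is to keep the setIt signature but back it with a primitive-valued map such as fastutil's Object2DoubleOpenHashMap, so each value is stored as a raw double instead of a boxed Double:

import it.unimi.dsi.fastutil.objects.Object2DoubleOpenHashMap;

public class Dictionary {
    // values are stored as primitive doubles, so there is no Double object per entry
    private final Object2DoubleOpenHashMap<String> all = new Object2DoubleOpenHashMap<>();

    public void setIt(String term, String value) {
        all.put(term, Double.parseDouble(value)); // put(K, double) never autoboxes
    }

    public double getIt(String term) {
        return all.getDouble(term);
    }
}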
EDIT: I'm using a 50MB file with -Xmx1g
and I still get the error.
UPDATE: There were some extra iterations over the file that I have now fixed, and the memory problem is partially solved. I still need to try Properties and the other suggested solutions and will report back.
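For reference, the Properties variant I still plan to test looks roughly like this (only a sketch; PropsLoad and the lookup key someWord are placeholders, and the values stay as Strings, parsed only when needed):

import java.io.FileReader;
import java.io.IOException;
import java.util.Properties;

public class PropsLoad {
    public static void main(String[] args) throws IOException {
        Properties all = new Properties();
        try (FileReader reader = new FileReader(args[0])) {
            // Properties splits each line at the first '=', ':' or whitespace,
            // so a "word<TAB>1.23" line loads as key "word", value "1.23"
            // (backslashes in terms would be interpreted as escape characters)
            all.load(reader);
        }
        String v = all.getProperty("someWord"); // placeholder key
        if (v != null) {
            System.out.println(Double.parseDouble(v));
        }
    }
}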