I would like to process a text file (about 400 MB) in order to create a recursive parent-child structure from the data given in each line. The data has to be prepared for top-down navigation (input: a parent, output: all of its children and sub-children). Example of the lines to be read (fields: child, id1, id2, parent, id3):
132142086;1;2;132528589;132528599
132142087;1;3;132528589;132528599
132142088;1;0;132528589;132528599
323442444;1;0;132142088;132528599
454345434;1;0;323442444;132528599
132528589 is the parent of 132142086, 132142087, 132142088
132142088 is the parent of 323442444
323442444 is the parent of 454345434
Given: Windows XP, 32-bit, 2 GB of available memory, and -Xmx1024m.
Here is the way I prepare the data:
HashMap<String, ArrayList<String>> hMap = new HashMap<String, ArrayList<String>>();
// bReader is a BufferedReader over the input file, delimiter is ";"
while ((myReader = bReader.readLine()) != null)
{
    String[] tmpObj = myReader.split(delimiter);
    // rebuild the full line: child;id1;id2;parent;id3
    String valuesArrayS = tmpObj[0] + ";" + tmpObj[1] + ";" + tmpObj[2] + ";" + tmpObj[3] + ";" + tmpObj[4];
    ArrayList<String> valuesArray = new ArrayList<String>();
    // case of same key: copy the parent's current list of children
    if (hMap.containsKey(tmpObj[3]))
    {
        valuesArray = (ArrayList<String>) (hMap.get(tmpObj[3])).clone();
    }
    valuesArray.add(valuesArrayS);
    hMap.put(tmpObj[3], valuesArray);
    tmpObj = null;
    valuesArray = null;
}
return hMap;
After that I use a recursive function:
HashMap<String, ArrayList<String>> getChildren(String parent)
to create the data structure that is needed. The plan is to keep hMap available (read-only) to more than one thread, all of them using getChildren.
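Roughly, getChildren works along these lines (simplified sketch, details omitted; hMap is a field of the enclosing class):

HashMap<String, ArrayList<String>> getChildren(String parent)
{
    // collect the part of hMap that belongs to the subtree rooted at 'parent'
    HashMap<String, ArrayList<String>> result = new HashMap<String, ArrayList<String>>();
    ArrayList<String> children = hMap.get(parent);
    if (children == null)
    {
        return result; // no entry means 'parent' has no children
    }
    result.put(parent, children);
    for (String childLine : children)
    {
        // the first field of each stored line is the child id
        String childId = childLine.split(";")[0];
        result.putAll(getChildren(childId));
    }
    return result;
}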
I tested this program with an input file of 90 MB and it seemed to work properly. However, running it with the real file of more than 380 MB led to:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
I need some help with memory resource management.