I have a large file (about 3GB) that I read line by line into an ArrayList. When I run the code below, after several minutes it slows down badly and CPU usage goes very high. Eventually the Eclipse console shows: Error java.lang.OutOfMemoryError: GC overhead limit exceeded.
- OS: Windows Server 2008 R2
- 4 CPUs
- 32GB memory
- java version "1.7.0_60"
eclipse.ini
-startup
plugins/org.eclipse.equinox.launcher_1.3.0.v20130327-1440.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.win32.win32.x86_64_1.1.200.v20140116-2212
-product
org.eclipse.epp.package.standard.product
--launcher.defaultAction
openFile
#--launcher.XXMaxPermSize
#256M
-showsplash
org.eclipse.platform
#--launcher.XXMaxPermSize
#256m
--launcher.defaultAction
openFile
--launcher.appendVmargs
-vmargs
-Dosgi.requiredJavaVersion=1.6
-Xms10G
-Xmx10G
-XX:+UseParallelGC
-XX:ParallelGCThreads=24
-XX:MaxGCPauseMillis=1000
-XX:+UseAdaptiveSizePolicy
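Note that the -vmargs in eclipse.ini only size the JVM running the Eclipse IDE itself. A Java program launched from Eclipse runs in a separate JVM, whose heap is set per launch under Run Configurations → Arguments → VM arguments, for example (24G is just an illustrative value, not a recommendation):

-Xmx24G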
Java code (al is an ArrayList<String> and matcher a java.util.regex.Matcher, both declared elsewhere):

BufferedInputStream bis = new BufferedInputStream(
        new FileInputStream(new File("/words/wordlist.dat")));
InputStreamReader isr = new InputStreamReader(bis, "UTF-8");
// 512MB read buffer
BufferedReader in = new BufferedReader(isr, 1024 * 1024 * 512);
String strTemp;
long ind = 0;
while ((strTemp = in.readLine()) != null)
{
    matcher.reset(strTemp);
    if (strTemp.contains("$"))
    {
        al.add(strTemp); // every matching line is kept in memory
    }
    ind = ind + 1;
    if (ind % 100000 == 0)
    {
        System.out.println(ind + " 100,000 +");
    }
}
in.close();
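If the matching lines do not all have to sit in memory at once, one way to avoid the GC overhead error is to stream matches straight to an output file instead of accumulating them in the ArrayList. A minimal sketch along those lines (class name and file paths are placeholders, not from the original code):

```java
import java.io.*;

public class FilterWords {
    // Streams matching lines from `in` to `out` instead of collecting
    // them in a list, so heap usage stays roughly constant regardless
    // of file size. Returns the number of matching lines.
    static long filter(BufferedReader in, Writer out) throws IOException {
        long matches = 0;
        String line;
        while ((line = in.readLine()) != null) {
            if (line.contains("$")) {
                out.write(line);
                out.write('\n');
                matches++;
            }
        }
        return matches;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical paths; substitute your own input and output files.
        try (BufferedReader in = new BufferedReader(new InputStreamReader(
                     new FileInputStream("/words/wordlist.dat"), "UTF-8"));
             Writer out = new BufferedWriter(new OutputStreamWriter(
                     new FileOutputStream("/words/matches.dat"), "UTF-8"))) {
            System.out.println(filter(in, out) + " matching lines");
        }
    }
}
```

The try-with-resources form works on Java 7 and also closes the streams if an exception is thrown partway through.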
My use case (sample lines from the file):
neural network
java
oracle
solaris
quick sort
apple
green fluorescent protein
acm
trs