
Hello, I am using JDK 7 with 6 GB of RAM and an Intel Core i5 processor. I have some Java code which has an ArrayList of more than 6,000 elements, where each element contains 12 double values. The processing speed has decreased very much, and it now takes around 20 minutes to run the entire program.

What the code does is as follows:

There are some 4,500 iterations happening due to nested for loops, and in each iteration a file of 400 KB is read, some processing happens, and some values are stored in the ArrayList.

Once the ArrayList is ready, its values are written to another file through CSVWriter. I have also used a JTable, and the JTable requires the ArrayList for referring to some of its values, so basically I can't clear this ArrayList.
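
Roughly, the structure is something like this (every name below is just a placeholder, not the real code):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class Outline {
    public static void main(String[] args) throws IOException {
        // ends up with 6,000+ rows of 12 doubles each
        List<double[]> results = new ArrayList<double[]>();

        for (int i = 0; i < 90; i++) {          // nested loops,
            for (int j = 0; j < 50; j++) {      // ~4,500 iterations in total
                // each iteration reads a ~400 KB file...
                Path file = Paths.get("data", i + "_" + j + ".txt");
                List<String> lines = Files.readAllLines(file, StandardCharsets.UTF_8);
                // ...and some processing stores one row in the list
                results.add(processLines(lines));
            }
        }
        // afterwards the rows are written out as CSV, and the same
        // list keeps backing a JTable
    }

    static double[] processLines(List<String> lines) {
        return new double[12];  // stand-in for the real processing
    }
}
```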

I have given the values for heap memory as follows

-Xms1024M -Xmx4096M
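
These are passed on the command line when launching, e.g. `java -Xms1024M -Xmx4096M MainClass` (the class name here is just a placeholder).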

I am new to programming and rather confused as to what I should do. Can I increase the heap size more than this? Will that help? My senior suggests using hard disk memory for storing the ArrayList elements or for processing, but I doubt that is possible. Please help; any help will be appreciated.

    It doesn't sound like a memory problem, since `8*12*6000 = 576000` is less than 1 MB. Sounds more like you have an algorithmic problem. If you update your question with more details about what processing you are doing, we may be able to help better. – merlin2011 May 30 '14 at 05:19
    I doubt it is a memory issue. Try adding some profiling information (or even use a profiler). – Scary Wombat May 30 '14 at 05:20
    4500 iterations...nested for loops...let me assure you, those two red flags are *far* more profitable spots for optimization than tweaking the maximum size of the heap. – Makoto May 30 '14 at 05:20
  • I said heap size was an issue because I had that OutOfMemoryError twice, after which I increased the memory to 4096M. – user3176258 May 30 '14 at 05:22
    Then you are allocating memory somewhere else and not telling us. :) – merlin2011 May 30 '14 at 05:22
  • "so basically i cant clear this arraylist." How can you *not* clear it? – David Ehrmann May 30 '14 at 05:27
  • I can't clear it because, as I said, I have a JTable which constantly requires referring to the ArrayList. – user3176258 May 30 '14 at 05:37

3 Answers


Heap size isn't your issue. When heap size is the problem, you'll see an OutOfMemoryError.

Usually, when you encounter performance issues like this, you profile: either with a tool like VisualVM, or by hand, using System.nanoTime() to track down which part of your code is the bottleneck. From there, make sure you're using appropriate data structures, algorithms, etc., and then see where you can parallelize your code.
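
For example, a minimal hand-timing sketch (doOneIteration is a placeholder for whatever one pass of your loop does):

```java
public class TimingSketch {
    public static void main(String[] args) {
        long start = System.nanoTime();
        doOneIteration();  // the code you suspect is slow
        long elapsed = System.nanoTime() - start;
        System.out.printf("one iteration took %.2f ms%n", elapsed / 1e6);
    }

    // stand-in for the real per-iteration work (file read + processing)
    private static void doOneIteration() {
    }
}
```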

David Ehrmann

6,000 × 12 doubles at 8 bytes each (roughly 576 KB) are not going to take up a significant amount of memory.

If your program gets slower and slower until it eventually crashes with an OutOfMemoryError, then it's possible that you have a coding error that is causing a memory leak.

This question has some examples of memory leak causes in Java.
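
For instance, one classic pattern (an illustrative sketch, not taken from your code) is a static collection that is only ever added to, so everything it references stays reachable and the garbage collector can never reclaim it:

```java
import java.util.ArrayList;
import java.util.List;

public class LeakExample {
    // lives as long as the class is loaded
    private static final List<double[]> HISTORY = new ArrayList<double[]>();

    static void process(double[] row) {
        HISTORY.add(row);  // grows on every call and is never cleared
    }
}
```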

Using VisualVM or some manual logging will help to identify the issue. Static code analysers like FindBugs or PMD may also help.

Catchwa

I guess you're leaking the JTables somehow. This can easily happen with listeners, TableSorters, etc. A proper tool would tell you, but the better way, IMHO, is to decompose the problem.
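
For example (an illustrative sketch, not your actual code): every JTable registers itself as a listener on its model, so if you keep creating tables over one long-lived model and never dispose of them, the model retains them all:

```java
import javax.swing.JTable;
import javax.swing.table.DefaultTableModel;

public class TableLeakSketch {
    // long-lived model shared across the application
    static final DefaultTableModel MODEL = new DefaultTableModel(6000, 12);

    static JTable showResults() {
        // new JTable(MODEL) calls MODEL.addTableModelListener(table),
        // so a discarded table stays reachable through MODEL until the
        // listener is removed (e.g. by switching the table to a dummy model)
        return new JTable(MODEL);
    }
}
```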

Either the GUI part is what causes the trouble, or it isn't. Ideally, the rest of the program should be completely independent of the GUI, so you can run it in isolation and see what happens.

> My senior suggests using hard disk memory for storing the ArrayList elements or for processing, but I doubt that is possible.

Many things are possible, but few make sense. If you're really storing just 6,000 × 12 doubles, then it makes absolutely no sense. Even with the high overhead of Double, it's just a few megabytes.

Another idea: if all the data you need fits into memory, then you can try to read it all upfront. That way each file gets read and stored only once, rather than being re-read and duplicated across iterations.
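
A minimal sketch of that idea (FileCache and linesOf are made-up names; note that on Java 7, Files.readAllLines needs an explicit Charset):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class FileCache {
    private final Map<Path, List<String>> cache = new HashMap<Path, List<String>>();

    /** Returns the file's lines, reading from disk only on first access. */
    List<String> linesOf(Path file) throws IOException {
        List<String> lines = cache.get(file);
        if (lines == null) {
            lines = Files.readAllLines(file, StandardCharsets.UTF_8);
            cache.put(file, lines);
        }
        return lines;
    }
}
```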

maaartinus