I'm running a Java program on my Core i7 laptop, which has 8 logical cores (4 physical cores with Hyper-Threading). The program uses 8 parallel threads, so it should saturate the CPU. When running with the '-server' flag, CPU usage stays at 100% the whole time. Without it, overall usage hovers around 50%-60% (constantly fluctuating, with peaks at 100% and dips to 30%). Here's what I find weird: when I run the program under the debugger, wait for a moment where CPU usage is especially low (30%), and then suspend execution to look at what the eight threads are doing, none of them are in a blocking state. Furthermore, there's almost no synchronization between them. Here's what I'm wondering:
- What's the difference between the server and client VM that would prevent the CPU from reaching 100% in client mode?
- In the absence of synchronization, what could keep a thread from fully using a core? (probably linked to 1)
Edit: Here's a thought: the code allocates big arrays and leaves them for the GC pretty quickly. Does a thread sleep when it calls 'new SomethingBig()' and the allocation takes time? If there is a single mechanism in the VM handling allocations for all the threads, I guess that could explain why they seem to pause at random, outside of synchronization blocks...
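To make the hypothesis concrete, here is a minimal sketch of the allocation pattern described above: 8 threads each allocating big, short-lived arrays in a loop, with total GC time reported at the end via the standard `GarbageCollectorMXBean` API. The class name, array size, and iteration count are made up for illustration (standing in for 'new SomethingBig()'); the point is only that heavy allocation churn by unsynchronized threads still produces measurable collector work.

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class AllocPressure {
    static final int THREADS = 8;      // matches the 8 worker threads in the question
    static final int ITERATIONS = 50;  // arbitrary; just enough to cause GC churn

    public static void main(String[] args) throws InterruptedException {
        Thread[] workers = new Thread[THREADS];
        for (int t = 0; t < THREADS; t++) {
            workers[t] = new Thread(() -> {
                for (int i = 0; i < ITERATIONS; i++) {
                    // Allocate a big array and drop it immediately,
                    // as described in the edit above.
                    byte[] big = new byte[4 * 1024 * 1024];
                    big[0] = 1; // touch it so the allocation isn't trivially dead
                }
            });
            workers[t].start();
        }
        for (Thread w : workers) w.join();

        // Sum accumulated collection time across all collectors.
        long gcMillis = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            gcMillis += gc.getCollectionTime();
        }
        System.out.println("Total GC time (ms): " + gcMillis);
    }
}
```

Running this with a small heap versus a large one (e.g. `-Xmx128m` versus `-Xmx1500m`) should show noticeably different GC time for the same work, which is one way to check whether the pauses you observe track collector activity.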
Edit 2: I'm now fairly sure it is caused by GC. The CPU reaches 100% again if I give the VM 1500 MB of heap instead of the default 500 MB. I suspect the slowdown doesn't happen in server mode because the server VM uses a larger heap by default.
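One way to confirm the "server mode gets more memory" guess is to print what heap the running VM actually has, using the standard `Runtime` API, and compare the output under `-client`, `-server`, and an explicit `-Xmx` setting (e.g. `java -Xmx1500m HeapReport`):

```java
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory: the ceiling the VM will grow the heap to (-Xmx)
        // totalMemory: the heap size currently reserved
        System.out.println("Max heap (MB): " + rt.maxMemory() / (1024 * 1024));
        System.out.println("Current heap (MB): " + rt.totalMemory() / (1024 * 1024));
    }
}
```

Running with `-verbose:gc` alongside this would also show the collections themselves, so you can see directly whether the CPU dips line up with GC activity.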