
I have recently made an application that synchronises a local HashMap with values on a remote server. It works by looping through the HashMap every x milliseconds and fetching a new value for each entry from the server.

Everything is working well, but when I looked at the amount of memory the application was using, I noticed that the garbage collector was not being very effective: fairly quickly it was using several gigabytes of memory. Increasing the number of threads it works on, or decreasing how many values it has to fetch, seemed to fix the problem, but I was still not sure what was causing it, which prompted me to look at how the memory was being used.

I noticed that there was a large number of FieldGetTask objects (the object that I feed to a thread pool, which gets executed to fetch the new value for a field), and performing a garbage collection had almost no impact on their count.

I am assuming that these objects are being stored in some queue somewhere, and when there are not enough threads to process them, they build up. Am I right in thinking this? And if so, is there a way to make it so that if the pool cannot execute a task at the current time, the execute method blocks until it can?
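For reference, this build-up is exactly what the default pools from `Executors` do: `newFixedThreadPool` is backed by an unbounded `LinkedBlockingQueue`, so tasks submitted faster than the workers finish them stay queued (and reachable, hence un-collectable). A minimal sketch demonstrating it:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;

public class QueueGrowthDemo {
    public static void main(String[] args) {
        // newFixedThreadPool uses an unbounded LinkedBlockingQueue,
        // so tasks submitted faster than the 2 workers can finish them
        // simply accumulate in the queue.
        ThreadPoolExecutor pool =
                (ThreadPoolExecutor) Executors.newFixedThreadPool(2);

        for (int i = 0; i < 10_000; i++) {
            pool.execute(() -> {
                try { Thread.sleep(100); } catch (InterruptedException ignored) {}
            });
        }

        // Almost all of the 10_000 tasks are still sitting in the queue here.
        System.out.println("tasks still queued: " + pool.getQueue().size());
        pool.shutdownNow();
    }
}
```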

Raedwald
user2248702
  • If you're using all the threads allocated to your scheduler, I'd imagine that it would "enqueue" them in that sense. But as far as I'm aware the scheduler only uses threads for the duration that a runnable takes to complete – Rogue Apr 29 '14 at 12:24
  • See also http://stackoverflow.com/questions/2001086/how-to-make-threadpoolexecutors-submit-method-block-if-it-is-saturated – Raedwald Apr 29 '14 at 12:29
  • See also http://stackoverflow.com/questions/3446011/threadpoolexecutor-block-when-queue-is-full – Raedwald Apr 29 '14 at 12:30
  • This seems like a duplicate of the questions I linked to, but I'm not an expert on `Executor` and its implementing classes, so I can't be sure. Care to clarify how this question is *not* a duplicate of those? – Raedwald Apr 29 '14 at 12:37
  • Using lots of memory is only a problem if you don’t have it (or need it for other purposes). You didn’t say that you got an `OutOfMemoryError` so it doesn’t seem to be a *real* problem then. Maybe you should just reduce the heap of you JVM… – Holger Apr 30 '14 at 10:21
  • After about 10 minutes it is using 3-4 GB of memory, and almost all of that seems to be in FieldGetTasks. – user2248702 Apr 30 '14 at 10:37

1 Answer


You can supply your own BlockingQueue implementation. A simple one would sleep when its size grows too large, or it could block until a task has been consumed.
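One way to sketch this (class name hypothetical, not from the answer): subclass `ArrayBlockingQueue` so that `offer` blocks via `put` instead of failing. `ThreadPoolExecutor.execute` calls `offer` internally, so submission then blocks when the pool is saturated. The caveat is that a blocking `offer` also stops the pool from growing past its core size, so core and max pool sizes should be equal:

```java
import java.util.concurrent.ArrayBlockingQueue;

// A bounded queue whose offer() blocks until space is available,
// so ThreadPoolExecutor.execute() waits instead of queueing without
// limit. Use with corePoolSize == maximumPoolSize, since a blocking
// offer() prevents the pool from ever spawning non-core threads.
class BlockingTaskQueue<E> extends ArrayBlockingQueue<E> {
    BlockingTaskQueue(int capacity) {
        super(capacity);
    }

    @Override
    public boolean offer(E e) {
        try {
            put(e); // block until the queue has room
            return true;
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}
```

It would be plugged in as the work queue, e.g. `new ThreadPoolExecutor(4, 4, 0L, TimeUnit.MILLISECONDS, new BlockingTaskQueue<>(100))`.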

A more common solution is to put a limit on the queue size and have the caller run the task itself when the queue is full. See `RejectedExecutionHandler`; `ThreadPoolExecutor.CallerRunsPolicy` does exactly this.

Peter Lawrey