
As far as I know, the well-known Java Instrumentation method is unable to correctly calculate the deep size of an object.

Is there a reliable way to compute the correct deep size of an object on the JVM?

The use case I'm thinking about is a fixed (or upper bounded) memory size data structure, i.e. a cache.

Note: as far as possible, I would like an enterprise-ready solution, so either a "standard" coding practice or a well-tested library.

Vincenzo Maggio
  • Can you please update your question to explain why http://stackoverflow.com/a/52682/3224483 did not work for you? It was the very first Google result for **java object memory**, which tells me that you did not look very hard. – Rainbolt Feb 03 '14 at 18:53
  • The JVM is free to allocate as much memory [as it wants](http://docs.oracle.com/javase/7/docs/technotes/guides/vm/performance-enhancements-7.html#compressedOop) and it's not going to tell you what it does internally (except external profilers maybe), calculations can be a good estimation at best. And what means deep size you are referencing an object that is also referenced by something else? – zapl Feb 03 '14 at 18:56
  • @John wow, I hope you didn't give a -1 for that! getObjectSize(), like I specified in the question, AFAIK doesn't return the deep size of an object, and by the way even the documentation states it's an approximation! – Vincenzo Maggio Feb 03 '14 at 19:05
  • @VincenzoMaggio When I hovered over the down arrow, a notification popped up that said, "This question does not show any research effort." So I clicked it. Then I got a warning that said I should suggest an improvement to the question. So I did. If you are unhappy with me following the prompts, take it to meta. – Rainbolt Feb 03 '14 at 19:09
  • Well, I changed the title three times and clicked about eight other questions. And by the way, read the comment for the answer below to understand your google search was plain wrong. – Vincenzo Maggio Feb 03 '14 at 19:10
  • @VincenzoMaggio The accepted answer for those eight other questions was "the best we can get is an approximation using X, Y, and Z tools or methods". What makes you think that asking again will get you a different answer? Has your research shown that a better method does exist? Do you disagree with the other answers? In what way is your question different? – Rainbolt Feb 03 '14 at 19:13
  • 1
    Because I've seen Java EE servers with fized size cache. Just that. "Has your research shown that a better method does exist?" so no more questions because I don't think actual questions answer my doubt?! By the way, if one question was enough, why there are so many similar questions on the topic? Lazy reviewers?! – Vincenzo Maggio Feb 03 '14 at 19:17

1 Answer


I know the well known Instrumentation Java method is unable to correctly calculate the deep size of an object.

With Instrumentation alone, no.

Instrumentation combined with knowledge of how a particular JVM lays out memory will give you the number of bytes used. It won't tell you how other JVMs might behave, and it doesn't tell you how much data is shared between objects.
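To illustrate the idea, here is a minimal sketch of such a deep-size walker: it reflectively traverses the object graph and sums a per-object shallow size over every distinct reachable object, using an `IdentityHashMap` so shared and cyclic references are counted once. The `shallowSizer` parameter is an assumption of this sketch; in a real agent you would pass `instrumentation::getObjectSize`. It is left as a function here so the traversal can be demonstrated without installing an agent. Note that `setAccessible` can fail for JDK-internal classes on modern JVMs, so this remains an estimate, not a definitive measurement.

```java
import java.lang.reflect.Array;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.IdentityHashMap;
import java.util.function.ToLongFunction;

public class DeepSize {

    // Example node type with a reference field, used in main() below.
    static class Node {
        Node next;
        int value;
        Node(int value) { this.value = value; }
    }

    // Sums shallowSizer over every distinct object reachable from root.
    // With an agent installed, pass instrumentation::getObjectSize.
    public static long of(Object root, ToLongFunction<Object> shallowSizer) {
        IdentityHashMap<Object, Boolean> seen = new IdentityHashMap<>();
        Deque<Object> stack = new ArrayDeque<>();
        if (root != null) stack.push(root);
        long total = 0;
        while (!stack.isEmpty()) {
            Object obj = stack.pop();
            if (seen.put(obj, Boolean.TRUE) != null) continue; // already counted
            total += shallowSizer.applyAsLong(obj);
            Class<?> cls = obj.getClass();
            if (cls.isArray()) {
                // Primitive arrays hold no references; object arrays are traversed.
                if (!cls.getComponentType().isPrimitive()) {
                    for (int i = 0; i < Array.getLength(obj); i++) {
                        Object element = Array.get(obj, i);
                        if (element != null) stack.push(element);
                    }
                }
                continue;
            }
            // Walk all instance reference fields, including inherited ones.
            for (Class<?> c = cls; c != null; c = c.getSuperclass()) {
                for (Field f : c.getDeclaredFields()) {
                    if (Modifier.isStatic(f.getModifiers()) || f.getType().isPrimitive()) continue;
                    try {
                        f.setAccessible(true); // may be blocked for JDK internals
                        Object v = f.get(obj);
                        if (v != null) stack.push(v);
                    } catch (ReflectiveOperationException | RuntimeException ignored) {
                        // field not reachable reflectively; its referent is skipped
                    }
                }
            }
        }
        return total;
    }

    public static void main(String[] args) {
        Node a = new Node(1);
        a.next = new Node(2);
        a.next.next = a; // cycle: must not loop forever
        // Fake sizer (16 bytes per object) just to exercise the traversal.
        System.out.println(DeepSize.of(a, o -> 16L)); // prints 32
    }
}
```

Even with the real `getObjectSize` plugged in, the result depends on the JVM's layout decisions (compressed oops, alignment), which is exactly why it can only be a good estimate.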

Is there a reliable way to compute on the JVM the correct deep size of an object?

I use a profiler, but unless you trust at least some of the tools you use, you can never know.

The use case I'm thinking about is a fixed (or upper bounded) memory size data structure, i.e. a cache.

How much slower are you willing to make your cache in exchange for precise memory usage? If it is 10x or 100x slower but has very accurate usage, is that better than something which just counts the number of elements?

so either a "standard" coding practice or a well tested library

In that case, use the element count. You can use LinkedHashMap or ehcache for this.
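For the element-count approach, a minimal sketch using LinkedHashMap's `removeEldestEntry` hook (the class name `BoundedCache` and the capacity of 2 are just for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A count-bounded LRU cache: once the element count exceeds maxEntries,
// the least-recently-accessed entry is evicted on the next put().
public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public BoundedCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true -> LRU eviction order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }

    public static void main(String[] args) {
        BoundedCache<String, Integer> cache = new BoundedCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // touch "a" so "b" becomes the eldest entry
        cache.put("c", 3); // evicts "b"
        System.out.println(cache.keySet()); // prints [a, c]
    }
}
```

For production use, a dedicated caching library (ehcache, or Guava's `CacheBuilder.maximumSize`) gives you the same element-count bound with eviction statistics and concurrency handled for you; the subclass above is just the standard-library version of the idea.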

Peter Lawrey
  • Hello, so the answer is: you cannot upper bound the memory size of a data structure? About how do I know: http://stackoverflow.com/questions/52353/in-java-what-is-the-best-way-to-determine-the-size-of-an-object – Vincenzo Maggio Feb 03 '14 at 19:09
  • 1
    @downvoter, care to comment why? – Peter Lawrey Feb 03 '14 at 19:12
  • 1
    @VincenzoMaggio you asked if you can determine the deep size of a data structure and you can do that using Instrumentation. The comments about it being approximate are based on trying to determine how much the data structure adds and you can't determine that easily without examining all the memory. e.g. a profiler can do this too. – Peter Lawrey Feb 03 '14 at 19:14
  • @VincenzoMaggio Given that the accepted answer there is "use Instrumentation", you will need to give some more explanation as to why it does not work for you. – Peter Lawrey Feb 03 '14 at 19:19
  • mmmm I'm thinking the simplest way would be to fire up another JVM; this way I could simply set the upper bound as the JVM's memory – Vincenzo Maggio Feb 03 '14 at 19:24