
I see that the Active memory on our Linux machine, which hosts the JVM, is gradually increasing, while the memory used by the Java process stays more or less the same at 6.1 GB:

21442 nobody    20   0 13.1g 6.1g  64m S 33.9 39.2   1677:39 java

I would like to know what could be causing this, and any advice on how to debug the issue would be helpful.

Samples of /proc/meminfo, taken at the same time on successive days, are below.

Day1

MemTotal:       16434272 kB
MemFree:         2066644 kB
MemAvailable:    8221744 kB
Buffers:          364212 kB
Cached:          5487632 kB
SwapCached:            0 kB
Active:          9836280 kB
Inactive:        3899908 kB
Active(anon):    7884360 kB
Inactive(anon):      120 kB
Active(file):    1951920 kB
Inactive(file):  3899788 kB
Unevictable:           0 kB
Mlocked:               0 kB
SwapTotal:             0 kB
SwapFree:              0 kB
Dirty:              7164 kB
Writeback:             0 kB
AnonPages:       7884404 kB
Mapped:           185276 kB
Shmem:               128 kB
Slab:             514228 kB
SReclaimable:     482620 kB
SUnreclaim:        31608 kB
KernelStack:        9248 kB
PageTables:        26988 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:     8217136 kB
Committed_AS:   13542340 kB
VmallocTotal:   34359738367 kB
VmallocUsed:           0 kB
VmallocChunk:          0 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
DirectMap4k:       12288 kB
DirectMap2M:     2084864 kB
DirectMap1G:    15728640 kB

Day2

MemTotal:       16434272 kB
MemFree:         1211928 kB
MemAvailable:    7985664 kB
Buffers:          365508 kB
Cached:          6099704 kB
SwapCached:            0 kB
Active:         10566764 kB
Inactive:        4018780 kB
Active(anon):    8120344 kB
Inactive(anon):      120 kB
Active(file):    2446420 kB
Inactive(file):  4018660 kB
Unevictable:           0 kB
Mlocked:               0 kB
SwapTotal:             0 kB
SwapFree:              0 kB
Dirty:              9504 kB
Writeback:             0 kB
AnonPages:       8120352 kB
Mapped:           182616 kB
Shmem:               128 kB
Slab:             519364 kB
SReclaimable:     487884 kB
SUnreclaim:        31480 kB
KernelStack:        9184 kB
PageTables:        26920 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:     8217136 kB
Committed_AS:   13320036 kB
VmallocTotal:   34359738367 kB
VmallocUsed:           0 kB
VmallocChunk:          0 kB
HugePages_Total:       0
HugePages_Free:        0 
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
DirectMap4k:       12288 kB
DirectMap2M:     2084864 kB
DirectMap1G:    15728640 kB

Day3

MemTotal:       16434272 kB 
MemFree:         1136836 kB
MemAvailable:    7668800 kB
Buffers:          365820 kB
Cached:          5866988 kB
SwapCached:            0 kB
Active:         11092372 kB
Inactive:        3576692 kB
Active(anon):    8436276 kB
Inactive(anon):      120 kB
Active(file):    2656096 kB
Inactive(file):  3576572 kB
Unevictable:           0 kB
Mlocked:               0 kB 
SwapTotal:             0 kB
SwapFree:              0 kB
Dirty:              4940 kB
Writeback:             0 kB
AnonPages:       8436320 kB
Mapped:           187980 kB
Shmem:               128 kB
Slab:             509852 kB
SReclaimable:     478524 kB
SUnreclaim:        31328 kB
KernelStack:        9296 kB
PageTables:        27992 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:     8217136 kB
Committed_AS:   13516284 kB
VmallocTotal:   34359738367 kB
VmallocUsed:           0 kB
VmallocChunk:          0 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
DirectMap4k:       12288 kB
DirectMap2M:     2084864 kB
DirectMap1G:    15728640 kB

2 Answers

If the memory usage of the JVM system process increases over time, that does not necessarily mean you have a memory leak. The JVM has two parameters: one for the initial amount of system memory to allocate (-Xms) and one for the maximum amount it is allowed to allocate (-Xmx). If these are not specified explicitly, the JVM chooses values according to the specs of the system it's running on.
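For example, a minimal sketch of starting the process with both flags set explicitly (the 2 GB / 6 GB sizes and the app.jar name are hypothetical, chosen only for illustration):

java -Xms2g -Xmx6g -jar app.jar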

When the running Java application requires more heap space than fits into the currently allocated system memory, the JVM will gradually allocate more system memory (up to the specified maximum). Once allocated, the JVM may not release that system memory again, even if the amount of allocated heap space drops (for example when a garbage collection is performed). This behaviour may vary depending on the actual JVM implementation.
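One way to observe this from the outside is to watch the heap statistics with jstat and compare them against the RSS reported by top; here 21442 is the PID from the top output above, and the 10-second sampling interval is an arbitrary choice:

jstat -gc 21442 10s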

If you see that your JVM system process peaks at 6.1 GB, this may be the configured maximum amount.
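You can verify which maximum the JVM actually settled on (whether set explicitly or chosen by its ergonomics), for example with jinfo against the PID from the top output above:

jinfo -flag MaxHeapSize 21442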

Seeing OutOfMemoryErrors in your log file would be a hint that there might be a leak. If you want to know what's going on inside your JVM's memory, you can create a heap dump file and analyze it with a tool like the Eclipse Memory Analyzer. You can add the -XX:+HeapDumpOnOutOfMemoryError command line parameter to your JVM to have it automatically create a heap dump file when an OutOfMemoryError occurs.
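A minimal sketch of the relevant flags (the dump path and the app.jar name are only examples):

java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp/heapdump.hprof -jar app.jar

You can also take a heap dump of the running process on demand, e.g. with jmap -dump:live,format=b,file=/var/tmp/heapdump.hprof 21442.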


Active memory simply means "pages that have been accessed recently" and isn't a terribly interesting measure on its own (see https://unix.stackexchange.com/questions/305606/linux-inactive-memory for more information).

It's much more useful to track the RSS (as reported by ps et al.) and the JVM-specific portions of memory such as the heap, thread stacks, code cache, metaspace, etc. For a thorough overview I recommend these excellent resources:
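As one concrete way to break the RSS down into those JVM-specific areas (a general sketch, independent of whatever resources were linked above), you can enable Native Memory Tracking at startup and then query it with jcmd, using the PID from the question's top output:

java -XX:NativeMemoryTracking=summary -jar app.jar
jcmd 21442 VM.native_memory summary

Note that Native Memory Tracking adds some overhead and has to be enabled when the JVM starts; it cannot be switched on for an already running process.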
