I have a Go program that allocates lots of maps and slices, so there's a lot of allocation overhead in general. When I run it, it loads a large dataset into memory, and then I query it through a web service.
After I leave it running, once it's read in all its data and isn't serving any queries (i.e. it should be stable), I see the reported real memory fluctuate. Recently it's reported 5.42 GB, 5.01 GB, and 4.3 GB. That's a massive fluctuation.
I have about 150 million objects (slices hanging off the main hashtable), so that's a lot of little allocations. I'd expect a little fluctuation, although I would never expect memory to increase when no new objects are being allocated and the main thread(s) are blocked on a socket.
Possible explanations are:
- the overhead of lots of small allocations just multiplies any natural fluctuation
- some code is allocating objects (although I can't see how)
- the Go GC is doing its own paging (?)
- I'm using Mac OS, and it's at fault somehow
Is this amount of fluctuation normal / expected?