
I have a simple gRPC server in Go that does CRUD operations on an object. However, when I run it, the memory never goes down, even after requests stop. A pprof heap profile shows the following result:

> flat  flat%   sum%        cum   cum%
>   932.39MB 62.45% 62.45%   932.39MB 62.45%  google.golang.org/grpc/internal/transport.newBufWriter
>   463.13MB 31.02% 93.46%   463.13MB 31.02%  bufio.NewReaderSize
>    13.50MB   0.9% 94.37%    13.50MB   0.9%  runtime.malg
>       13MB  0.87% 95.24%  1420.52MB 95.14%  google.golang.org/grpc/internal/transport.newHTTP2Server
>       11MB  0.74% 95.98%    12.10MB  0.81%  time.NewTimer
>     8.50MB  0.57% 96.54%     8.50MB  0.57%  golang.org/x/net/http2/hpack.(*headerFieldTable).addEntry
>     5.50MB  0.37% 96.91%    17.60MB  1.18%  google.golang.org/grpc/internal/transport.(*http2Server).keepalive
>     3.50MB  0.23% 97.15%     7.50MB   0.5%  google.golang.org/grpc/internal/transport.newLoopyWriter
>     1.50MB   0.1% 97.25%    12.50MB  0.84%  google.golang.org/grpc.(*Server).serveStreams
>          0     0% 97.25%       10MB  0.67%  golang.org/x/net/http2.(*Framer).ReadFrame

Can anyone guide me on how to fix this memory issue? The server runs with default options, and I have even called debug.FreeOSMemory() to release memory.
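For context, here is a minimal sketch of the kind of setup described above; the listen addresses, the one-minute FreeOSMemory interval, and the commented-out service registration are assumptions, not taken from the question:

```go
package main

import (
	"log"
	"net"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof handlers on http.DefaultServeMux
	"runtime/debug"
	"time"

	"google.golang.org/grpc"
)

func main() {
	// Serve the pprof endpoints used for the heap profile, e.g.
	//   go tool pprof http://localhost:7777/debug/pprof/heap
	go func() {
		log.Println(http.ListenAndServe("localhost:7777", nil))
	}()

	// Periodically return freed memory to the OS. This only helps with
	// memory the GC has already freed; it cannot release objects that are
	// still reachable, such as the buffers of open connections.
	go func() {
		for range time.Tick(time.Minute) {
			debug.FreeOSMemory()
		}
	}()

	lis, err := net.Listen("tcp", ":50051")
	if err != nil {
		log.Fatal(err)
	}

	srv := grpc.NewServer() // default server options
	// pb.RegisterYourServiceServer(srv, &server{}) // hypothetical generated registration
	log.Fatal(srv.Serve(lis))
}
```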

Anshul Prakash
  • As far as I know, pprof collects the profile and then shows the stats; it is not realtime once the samples are collected and dumped in a profile. Can you share the command you used for profiling? – majin Feb 07 '20 at 08:03
  • `go tool pprof http://localhost:7777/debug/pprof/heap`. Also, I ran the command at regular intervals to see the change in the profile. – Anshul Prakash Feb 07 '20 at 13:20
  • Those are showing sampled allocations, not what is currently holding memory. Also note that if there's no memory pressure, the OS may not reclaim memory even if the program advises that it's free. – JimB Feb 07 '20 at 20:29

1 Answer


Most likely you need to close the ClientConn, or reuse it. I encountered the same issue: the problem was creating a new ClientConn for each RPC call without ever closing those connections.
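As a rough sketch of what that looks like in practice (the address, the generated client constructor, and the method names are placeholders, not from the original code): dial once, share the ClientConn across RPCs, and close it on shutdown.

```go
package main

import (
	"log"

	"google.golang.org/grpc"
)

// leakyCall shows the problematic pattern: a fresh ClientConn per RPC that
// is never closed. Each Dial creates an HTTP/2 transport whose read/write
// buffers (newBufWriter, bufio.NewReaderSize) also stay alive on the server.
func leakyCall(addr string) {
	conn, err := grpc.Dial(addr, grpc.WithInsecure())
	if err != nil {
		log.Fatal(err)
	}
	_ = conn // missing conn.Close(): the connection and its buffers leak
}

func main() {
	// Better: dial once, reuse the ClientConn for every RPC, and close it
	// when the client shuts down.
	conn, err := grpc.Dial("localhost:50051", grpc.WithInsecure())
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// client := pb.NewYourServiceClient(conn) // hypothetical generated client
	// for each request: client.DoSomething(ctx, req) // all on the same conn
}
```

A single ClientConn is safe for concurrent use and multiplexes RPCs over one HTTP/2 connection, so sharing it across goroutines is what keeps the server-side transport count, and those bufio buffers in the profile, bounded.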

There is a great explanation here: pooling-grpc-connections

Related question: GRPC Connection Management in Golang