Consider a very short program that allocates only a small amount of memory. I have been taught that the garbage collector (GC) runs when a program allocates a lot of memory and the total allocation reaches some limit.
I don't know exactly what that limit is, but I assume it must be set high enough that the GC doesn't run frequently and slow down the program's execution.
My question is: what happens if, over the lifetime of the program, allocation never reaches the level at which the GC would run? Does that result in a memory leak?
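For concreteness, here is a minimal sketch of the kind of program I have in mind. Java and the class name `SmallAllocation` are just illustrative choices on my part; assume any garbage-collected runtime:

```java
// A tiny program whose total allocation stays far below any GC threshold.
public class SmallAllocation {
    public static void main(String[] args) {
        // Allocate a small buffer (roughly 4 KB) -- nowhere near enough
        // to trigger a collection cycle on a typical runtime.
        int[] buffer = new int[1024];
        buffer[0] = 42;
        System.out.println("First element: " + buffer[0]);
        // The program exits here; the GC may never have run at all.
    }
}
```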