It actually depends on both the compiler and the host system on which the program is run.
Some compilers (particularly with optimisation settings enabled) are smart enough to recognise that the memory allocated in the loop is never used in a way that affects program output. The compiler may then omit the allocation entirely and, since an infinite loop with no observable side effects yields undefined behaviour in C++, may then do anything. Possible outcomes include the program looping forever (consuming CPU cycles but never allocating memory), or the compiler not emitting the loop at all (so the program terminates immediately).
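For concreteness, here is a minimal sketch of the kind of loop being discussed (the exact code in the question may differ; this is an assumption for illustration). With optimisation enabled, a compiler is entitled to treat the unused allocation, and potentially the whole loop, as removable:

```cpp
#include <new>

int main() {
    // Allocate repeatedly, never use the memory, never free it.
    // An optimising compiler may elide the allocation, or the loop itself.
    while (true) {
        new int;
    }
}
```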
If compiler optimisation is not in play (e.g. a non-optimised build), the program will loop and repeatedly allocate memory. What happens then depends on the operating system.
Some operating systems - such as Unix variants - can be configured (or are configured by default) to do lazy allocation or over-committing: they won't actually back an allocation with physical memory until the program makes non-trivial use of it (e.g. stores data in it). Since your loop never uses the memory it allocates, on a system doing lazy allocation the loop may run indefinitely - with no observable effect on the host system, other than lost CPU cycles.
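If you want to see lazy allocation in action, a rough sketch is below (behaviour is OS- and configuration-dependent; on Linux, for instance, it is governed by the vm.overcommit_memory setting). Merely allocating a huge block may "succeed", while actually writing to it forces the OS to commit physical memory:

```cpp
#include <cstring>
#include <iostream>
#include <new>

int main() {
    // Deliberately larger than many machines' physical RAM; adjust to taste.
    const std::size_t size = 1ull << 33;  // 8 GiB

    // On an over-committing system this may "succeed" without committing pages.
    char* p = new (std::nothrow) char[size];
    if (!p) {
        std::cout << "allocation refused up front\n";
        return 1;
    }
    std::cout << "allocation succeeded; physical memory not necessarily committed yet\n";

    // Touching every page forces the OS to actually provide memory; on an
    // over-committing system this is where real pressure (or the OOM killer) appears.
    std::memset(p, 0, size);
    std::cout << "memory touched\n";
    delete[] p;
}
```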
If the OS is not configured to do lazy allocation (i.e. it actually commits memory at the point of allocation), the loop will continue until an allocation fails, at which point operator new throws std::bad_alloc and, if nothing catches it, the program terminates.
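A sketch of how that failure surfaces in C++, assuming the OS refuses the allocation rather than killing the process outright: operator new reports the out-of-memory condition by throwing std::bad_alloc, which can be caught or left to terminate the program:

```cpp
#include <iostream>
#include <new>
#include <vector>

int main() {
    std::vector<char*> blocks;
    try {
        while (true) {
            blocks.push_back(new char[100 * 1024 * 1024]);  // 100 MiB per iteration
        }
    } catch (const std::bad_alloc& e) {
        std::cout << "allocation failed after " << blocks.size()
                  << " blocks: " << e.what() << '\n';
    }
    for (char* b : blocks) delete[] b;  // tidy up before exiting
}
```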
What happens during that process depends, again, on the operating system.
If the operating system imposes a quota on the process that is significantly less than the total memory (physical or virtual) available to the operating system, then other processes (and the OS itself) may be unaffected by our program.
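One way such a quota can arise on POSIX systems (a sketch; the exact enforcement of RLIMIT_AS varies between platforms): the process, or a parent shell via ulimit -v, caps its own address space, so the runaway allocation fails long before it troubles the rest of the machine:

```cpp
#include <sys/resource.h>
#include <iostream>
#include <new>

int main() {
    // Cap this process's virtual address space at 256 MiB.
    rlimit rl{};
    rl.rlim_cur = 256ull * 1024 * 1024;  // soft limit
    rl.rlim_max = 256ull * 1024 * 1024;  // hard limit
    if (setrlimit(RLIMIT_AS, &rl) != 0) {
        std::cerr << "setrlimit failed\n";
        return 1;
    }

    try {
        while (true) {
            new char[1024 * 1024];  // 1 MiB per iteration, never freed
        }
    } catch (const std::bad_alloc&) {
        std::cout << "hit the per-process quota; the rest of the system is untouched\n";
    }
}
```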
If the operating system allows the program to allocate all available memory, then the repeated allocation in this program may affect the behaviour of other programs, or of the operating system itself. If the operating system is able to detect this (the Linux OOM killer is one example of such a mechanism), it may forcibly terminate the program and, in doing so, reclaim the allocated memory. If the operating system does not detect that behaviour and terminate the program, the program may slow the whole system down.
If your operating system is itself hosted (e.g. in a hypervisor environment), then that guest operating system may be brought down, but other operating system instances running on the same physical machine will normally be unaffected.
It has been at least two decades since a novice programmer could unleash such a program on an unsuspecting machine and be even a little confident that the machine would eventually slow down, possibly crash, or need a hard power cycle. Operating system design, and larger system design practices (e.g. the routine use of hypervisors), make such pranks much less likely to have any enduring effect.