I've recently come across a while statement that uses 1 == 1 instead of true.
Example:
while (1 == 1)
{
    // Do something
}
Instead of:
while (true)
{
    // Do something
}
Both appear to be correct and produce the same result. Setting aside style and habit (and why a developer would write 1 == 1 instead of true in the first place), what impact does this have from a compiler's perspective? Is there greater overhead in evaluating the comparison operator instead of the literal true?
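To make the comparison concrete, here is a minimal, self-contained C sketch of both loops (the counter and the break statements are my own additions so the program terminates and does something observable). One way to check for overhead would be to compile it with something like gcc -S -O0 and gcc -S -O2 and compare the assembly generated for each loop:

#include <stdbool.h>
#include <stdio.h>

int main(void)
{
    int i = 0;

    // Comparison form: 1 == 1 is a constant expression
    while (1 == 1)
    {
        if (++i == 3)
            break;   // break only so the example terminates
    }

    // Literal form: true comes from <stdbool.h> in C
    while (true)
    {
        if (++i == 6)
            break;
    }

    printf("%d\n", i);   // prints 6
    return 0;
}

My naive expectation is that any compiler will constant-fold 1 == 1 and emit identical code for both loops, but that is exactly what I would like confirmed.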