UInt64.max / 2 is represented in memory as 0111...1111. Add 1 and it becomes 1000...0000, so the first bit, the one a signed integer uses for the sign, is now set. Naively you might expect something small like -1, but the CPU reads this pattern as -9 223 372 036 854 775 808. Why does it work in such a complicated way?
You can see that this is true because of an issue in the Swift playground, described in the question "Why is UInt64 max equal -1 in Swift?": the playground displays a UInt64 value as if it were an Int64.
var max = UInt64.max / 2 + 1 // the playground shows -9223372036854775808 because it treats the value as Int64
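To make the reinterpretation explicit, here is a minimal sketch; the Int64(bitPattern:) initializer used below is a standard-library call that is not part of the original snippet, and the variable names are purely illustrative:

let bigger = UInt64.max / 2 + 1                 // 9223372036854775808, bit pattern 1000...0000
let asSigned = Int64(bitPattern: bigger)        // -9223372036854775808, the value the CPU "sees"
let maxAsSigned = Int64(bitPattern: UInt64.max) // -1, the value from the question title
print(bigger, asSigned, maxAsSigned)

Reinterpreting only changes how the bits are read; the 64 bits in memory stay exactly the same.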