It has been a long time since I last programmed at the bits-and-bytes level, and I wanted to confirm something I seem to remember from those days:
Say I have two integers of the same width (1, 2, 4, or 8 bytes; it doesn't matter) and I add them: does the bit-by-bit result of the sum differ depending on whether they are signed or unsigned? In other words, regardless of whether they are signed or unsigned integers, will the resulting bits be the same?
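To illustrate what I mean, here is a minimal sketch in C (the names and values are just for illustration, and it assumes the usual two's-complement representation): take the same bit patterns, add them once as signed and once as unsigned values, and compare the raw bytes of the results.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    int32_t  sa = -5,          sb = 7;
    uint32_t ua = 0xFFFFFFFBu, ub = 7u;   /* same bit patterns as sa and sb */

    int32_t  ssum = sa + sb;              /* signed addition   */
    uint32_t usum = ua + ub;              /* unsigned addition */

    /* Compare the raw bytes of the two results. */
    printf("bits identical: %s\n",
           memcmp(&ssum, &usum, sizeof ssum) == 0 ? "yes" : "no");
    return 0;
}
```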
My intuition and my frail memory tell me they will, but I just wanted to confirm. Thanks.