I understand that an sbyte
forgoes a bit in order to "sign" its value, positive or negative. This naturally leaves 7 bits for the magnitude, giving 2^7 = 128 values (0 to 127, or -1 to -128, depending on the sign bit).
My question is, which of the 8 bits in the byte is the sign bit? I imagine it to be bit 8, the MSB?
I ask since I would have expected sbyte foo = -1
to explicitly cast to 129
with byte bar = (byte)foo
since the underlying bit structure would be 10000001
. However, the above returns bar
as 255.
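To illustrate what I observe, here is a minimal sketch (assuming a C# console app with top-level statements) that performs the cast and prints the full bit pattern of the result:

```csharp
using System;

sbyte foo = -1;
byte bar = (byte)foo; // explicit cast reinterprets the same 8 bits as unsigned

// Print the resulting bit pattern and decimal value
Console.WriteLine(Convert.ToString(bar, toBase: 2)); // prints "11111111"
Console.WriteLine(bar);                              // prints "255"
```

So every bit is set after the cast, not just the MSB and the lowest bit as I expected.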
Test Code
sbyte negative = -1;
sbyte positive = 1;
byte negativeResult = (byte)(negative & 0b_1000_0001);
byte positiveResult = (byte)(positive & 0b_1000_0001);
Console.WriteLine($"Negative evaluated to: {Convert.ToString(negativeResult, toBase: 2)}");
Console.WriteLine($"Positive evaluated to: {Convert.ToString(positiveResult, toBase: 2)}");
Output
Negative evaluated to: 10000001
Positive evaluated to: 1