An int32 occupies 4 bytes (32 bits) in computer memory.
So, of those 32 bits, 1 is the sign bit and 31 are data bits. If the 1st bit has place value 2^0, then the 31st bit has place value 2^30, and the 32nd bit is the sign bit. The largest positive value should therefore be 2^0 + 2^1 + ... + 2^30 = (2^31)-1.
How is it, then, that the int32 range extends from -2^31 to (2^31)-1, rather than being symmetric around zero?
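For concreteness, the asymmetry described above can be observed directly. A minimal sketch in Python, assuming the usual two's-complement representation of int32 (which is what virtually all modern hardware uses); the `struct` format `'>i'` means a big-endian signed 32-bit integer:

```python
import struct

INT32_MIN = -2**31       # -2147483648
INT32_MAX = 2**31 - 1    #  2147483647

# Both boundary values fit in 4 bytes; note the bit patterns:
# the minimum is the sign bit alone, the maximum is all data bits set.
print(struct.pack('>i', INT32_MIN).hex())  # 80000000
print(struct.pack('>i', INT32_MAX).hex())  # 7fffffff

# +2^31 does NOT fit: struct refuses to pack it into a signed int32.
try:
    struct.pack('>i', 2**31)
except struct.error as e:
    print('overflow:', e)

# Decoding the sign-bit-only pattern gives the extra negative value.
print(struct.unpack('>i', bytes.fromhex('80000000'))[0])  # -2147483648
```

So the bit pattern with only the sign bit set (0x80000000) decodes to -2^31, which is exactly the value the question is asking about.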