We know that to get the lowest set bit of a number we do `X &= -X`. That is in fact a condensed form of `X &= ~(X - 1)`. What I don't get is how `~(X - 1) == -X`.

I can get to the answer easily with a truth table, but I'm looking for a Boolean-algebra-style proof.
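One way to see it (a sketch, not the full algebraic proof the question asks for): in two's complement, `~A == -A - 1` for every `A`, because `A + ~A` is all ones, which represents `-1`. Substituting `A = X - 1` gives `~(X - 1) == -(X - 1) - 1 == -X`. A quick numeric check in Python, whose arbitrary-precision integers behave as two's complement for `~` and `&`:

```python
# Verify the two's-complement identity ~(X - 1) == -X, and that both
# forms of the idiom isolate the lowest set bit of X.
for x in range(1, 1024):
    assert ~(x - 1) == -x            # the identity in question
    assert x & -x == x & ~(x - 1)    # both masks give the lowest set bit
```

The check passes for every positive `x` tried; the algebraic step `~A == -A - 1` is what makes the two forms equivalent.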

Sankalp
  • Is `X` already masked? – SPlatten Feb 09 '18 at 12:10
  • Write down the bits on a piece of paper. For `X`, for `-X`, for `X - 1`, and for `~(X - 1)`. – Some programmer dude Feb 09 '18 at 12:10
  • 1
    That doesn't get the LSB, it gets the lowest set bit. The LSB would be `X & 1`. – Jeroen Mostert Feb 09 '18 at 12:25
  • What do you allow as starting assumptions? There are various proofs starting with the *other* definition of negation, definitions of subtraction, a particular implementation of addition, and the fact that "all bits set" + 1 = 0. But it has to start somewhere; there is no purely algebraic reason for that particular definition of negation (if there were, then negation would only exist for bitvectors — it's nice that it exists for other objects too). – harold Feb 09 '18 at 23:50

0 Answers