Because it makes life easier if you visualize working with the bits
An int is 4 bytes big in memory, so to have it as four individual bytes we can shift it around:
some int: 11111111 10101010 11110000 11001100
To get these blocks of bits as 4 individual bytes we could, sure, divide by powers of 2, but it's harder to visualize than just sliding the whole thing to the right 24 places, leaving 11111111, which we store in a byte (everything that slides off the right-hand side is lost); or sliding to the right 16 places, leaving 11111111 10101010, which then has the 11111111 cut off when we store it in a byte (everything to the left of the rightmost 8 bits is lost when assigning to a byte).
It's somewhat like playing Stacker with bits
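In C# that slide-and-store looks something like this (a minimal sketch; the variable names are just for illustration):

// Pull the four bytes out of an int by right-shifting, highest byte first.
// int is signed, so >> sign-extends from the left, but the cast to byte
// keeps only the rightmost 8 bits anyway.
int someInt = unchecked((int)0b11111111_10101010_11110000_11001100);

byte b0 = (byte)(someInt >> 24); // 11111111
byte b1 = (byte)(someInt >> 16); // 10101010 (the 11111111 above it is cut off by the cast)
byte b2 = (byte)(someInt >> 8);  // 11110000
byte b3 = (byte)someInt;         // 11001100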
As to why you might use left shift to recompose an int from 4 bytes: imagine playing a reverse version of Stacker where you have 32 slots arranged into 4 groups of 8, and you have to put these 8 bits (from the leading byte) into the leftmost group, then the bits of each subsequent byte into the next group along:
some byte: 11111111
next byte: 10101010
some int: ________ ________ ________ ________
The "some byte" is going to need to be slid left 24 places, the next byte 16 places. They acquire 0s (become ints with only some bits set) as a result:
some byte shifted: 11111111 00000000 00000000 00000000
next byte shifted: 00000000 10101010 00000000 00000000
some int result: ________ ________ ________ ________
These are then bitwise ORed together to produce the result:
some byte: 11111111 00000000 00000000 00000000
next byte: 00000000 10101010 00000000 00000000
some int: 11111111 10101010 00000000 00000000
OR is "work column by column, if any value in the column is a 1, the resulting value in the column is 1, otherwise it is 0"
Why are these things always done as bytes?
Because that's how network transmission works, and because everything, ultimately, is bytes (an int is 4 bytes), or bits if you want to look at it that way. Even if you use some abstraction that writes an int to a socket, it'll convert it to bytes under the hood; here you're just gaining an appreciation for how that conversion works out. (It doesn't have to be done this exact way, either; there are other ways of arranging the bits and bytes. So long as you're consistent it doesn't matter how you do it.)
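For instance (a sketch, using BitConverter as just one example of such an abstraction; the byte order it produces depends on the machine):

using System;

int value = 0x11223344;

// Network ("big-endian") order by hand, most significant byte first:
byte[] manual =
{
    (byte)(value >> 24), // 0x11
    (byte)(value >> 16), // 0x22
    (byte)(value >> 8),  // 0x33
    (byte)value          // 0x44
};

// Library abstraction: uses the machine's native order, which on most PCs
// is little-endian, i.e. 0x44, 0x33, 0x22, 0x11 - one of those "other ways
// of arranging" the bytes.
byte[] fromLibrary = BitConverter.GetBytes(value);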
You also see it used for things like flags enums:
[Flags]
enum Perms
{
    None = 0,
    Read = 1 << 0,
    Write = 1 << 1,
    Delete = 1 << 2
}
Which could also be done by writing the powers of 2 out as plain numbers (note that ^ in C# is XOR, not "to the power of", so you can't literally write 2^2 here):

[Flags]
enum Perms
{
    None = 0,
    Read = 1,   // 2 to the power of 0
    Write = 2,  // 2 to the power of 1
    Delete = 4  // 2 to the power of 2
}
No-one will fire you for the bitshift version, though as operations shifts are less commonly encountered than plain powers of 2, so the next person who maintains the code might have to look up what 1 << 2 means, whereas the written-out power-of-2 form is probably already well understood.
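Either way, the point of using powers of 2 is that each permission gets its own bit, so they can be combined and tested with the same bitwise operators (a small sketch reusing the Perms enum above):

// Combine flags with OR; each set bit is one permission.
Perms p = Perms.Read | Perms.Write;        // bits 00000011

// Test a flag with AND (or with Enum.HasFlag).
bool canWrite = (p & Perms.Write) != 0;    // true
bool canDelete = (p & Perms.Delete) != 0;  // false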
As to how they came to be used so much, bitshift operations are/were also typically a lot faster than multiply/divide operations (they're very simple for a CPU to implement, but have limited application), so they were very useful in those limited contexts where they fit.
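That works because shifting left by n is the same as multiplying by 2 to the power of n, and shifting right by n is the same as dividing by it (for non-negative values); these days a compiler will usually make that substitution for you anyway:

int x = 5;
int a = x << 3;   // 40, same as x * 8
int b = x * 8;    // 40
int c = 40 >> 2;  // 10, same as 40 / 4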