I came across this excerpt today:
On most older microprocessors, bitwise operations are slightly faster than addition and subtraction operations and usually significantly faster than multiplication and division operations. On modern architectures, this is not the case: bitwise operations are generally the same speed as addition (though still faster than multiplication).
I'm curious about why bitwise operations were slightly faster than addition/subtraction operations on older microprocessors.
The only source of extra latency I can think of is that the circuits implementing addition/subtraction depend on several levels of logic gates (carry chains in parallel adders and the like), whereas bitwise operations have far simpler, single-level circuit implementations. Is this the reason?
I know arithmetic and bitwise operations both execute within one clock cycle on modern processors, but speaking purely of the circuit's propagation delay, is that latency still theoretically present in modern processors?
Finally, I had a conceptual C question about the execution of the bitwise shift operation:
unsigned x = 1;
x <<= 5;
unsigned y = 0;
y += 32;
Both x and y should hold the value 32, but did it take 5 separate left shifts to get x to that value (as in, are bitwise shifts implemented as a sequence of single-bit shift stages)? To clarify, I'm asking purely about the circuit behavior, not the number of clock cycles.