I'm building a custom hash where I sum all the letters in a string according to the formula:
string[0] * 65536 + string[1] * 32768 + string[2] * 16384 + ...
I've come to the question of whether I should define these numbers as constants in an int array like this:
const int MULTIPLICATION[] = {
65536,
32768,
16384,
8192,
4096,
2048,
1024,
512,
256,
128,
64,
32,
16,
8,
4,
2,
1
};
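Since every weight is a power of two (65536 = 1 << 16, 32768 = 1 << 15, and so on), you may not need the table at all: the multiply can be written directly as a shift by a decreasing amount. A minimal sketch (the function name and signature are my own, not from the question):

```cpp
#include <cstdint>

// Compute the weighted sum string[0] * 65536 + string[1] * 32768 + ...
// without a lookup table: the i-th weight is 1 << (16 - i), so each
// character is just shifted left by a shrinking amount.
uint32_t weighted_hash(const char* s) {
    uint32_t h = 0;
    int shift = 16;                       // first weight is 1 << 16
    for (; *s && shift >= 0; ++s, --shift)
        h += static_cast<uint32_t>(static_cast<unsigned char>(*s)) << shift;
    return h;
}
```

Because the shift amount is a plain loop variable, there is no table lookup and no multiplication in the loop body at all, so the "precomputed vs. generated" trade-off mostly disappears.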
Or should I just generate these numbers while computing the hash itself (probably losing some speed because they aren't precomputed)? I'll need to compute this hash millions of times, and the main thing I want the compiler to understand is that instead of a normal MUL operation
MOV EBX, 8
MUL EBX
it would do
SHL EAX, 3
Does the compiler understand that when I'm multiplying by a power of 2 it should shift bits instead of doing a normal multiplication?
Another question: I'm pretty sure it does shift bits when you write number *= 2; in C++. But just to clarify, does it?
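The identity the optimizer relies on can be checked directly in C++ (this sketch is mine, not from the question): multiplying an unsigned value by 2^k is bit-for-bit the same as shifting it left by k, so the compiler is free to replace the MUL with an SHL whenever the factor is a power of two.

```cpp
#include <cstdint>

// Two ways to write "times 8"; for unsigned operands they are
// mathematically identical, which is what licenses the optimization.
uint32_t times8_mul(uint32_t n) { return n * 8u; }   // candidate for SHL EAX, 3
uint32_t times8_shl(uint32_t n) { return n << 3; }   // what the optimizer emits
```

GCC, Clang, and MSVC all perform this strength reduction at -O1 and above (and often even without optimization when the constant is a literal); inspecting the generated assembly for both functions shows the same shift instruction.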
Thanks, I've found out how to view the disassembly in the debugger. Yes, the compiler does understand to shift bits if you write
number *= 65536;
However, it does a normal multiplication if you do
int number1 = 65536;
number *= number1;
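A hedged caveat on that last observation: it was most likely made in an unoptimized (debug) build. With optimization enabled (-O1/-O2, or /O2 in MSVC), constant propagation usually reduces the variable form to a shift as well. Marking the multiplier const or constexpr makes the constant explicit to the compiler even in debug builds; a small sketch (names are mine):

```cpp
#include <cstdint>

// constexpr tells the compiler the multiplier is a compile-time
// constant, so it can use SHL even when the value lives in a "variable".
constexpr uint32_t kMul = 65536;          // == 1 << 16

uint32_t scale(uint32_t number) {
    return number * kMul;                 // typically compiled as SHL by 16
}
```

Either way the computed value is identical; only the instruction chosen differs.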