I recently wrote a small application in which I needed to divide many, many numbers by 2. At first I simply used the expression i / 2,
but a friend of mine said I could use the operator i >>= 1 instead,
which produces the same numbers. Just for fun, we measured the time both versions took to compute, and we found that the i >>= 1 method was on average 15% faster (about a 30 ms improvement). (i
is an int.)
Why is that so? I only changed one operator, nothing else.