E.g.
for (int i = 0; i < 10; ++i)
...
vs
for (int i = 0; i < 10; i++) ...
I have been told ++i would be more efficient. Are there differences in other languages such as JavaScript, or is it a more fundamental rule?
The difference between i++ and ++i is the value of the expression.
The value of i++ is the value of i before the increment. The value of ++i is the value of i after the increment.
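For example, a minimal C++ sketch (the semantics are identical in C# and Java for an int):

#include <iostream>

int main() {
    int i = 5;
    int a = i++;  // a gets the old value: a == 5, i is now 6
    int b = ++i;  // i is incremented first: i == 7, b == 7
    std::cout << a << " " << b << " " << i << "\n";  // prints: 5 7 7
}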
However, in your loop it makes no difference: the same code is generated for both, as this disassembly shows.
int i = 0;
00000088 xor edx,edx
0000008a mov dword ptr [ebp-40h],edx
i++;
0000008d inc dword ptr [ebp-40h]
++i;
00000090 inc dword ptr [ebp-40h]
for(int i=0;i<10;i++)
{
//some jobs
}
is the same as:
int i=0;
while(i<10)
{
//some jobs
i++;
}
and
for(int i=0;i<10;++i)
{
//some jobs
}
is equivalent to:
int i=0;
while(i<10)
{
//some jobs
++i;
}
So there is no difference here; the performance is the same.
i++ is post-increment and ++i is pre-increment, so there is a difference in other cases, but not in performance. You can read more about pre- and post-increment here: C# Pre- & Post Increment confusions
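For example, the choice matters as soon as the expression's value is consumed. A minimal C++ sketch (the behavior is the same in C#):

#include <iostream>

int main() {
    int values[] = {10, 20, 30};
    int i = 0;
    int a = values[i++];  // reads values[0], then sets i to 1
    int b = values[++i];  // sets i to 2 first, then reads values[2]
    std::cout << a << " " << b << " " << i << "\n";  // prints: 10 30 2
}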
The advantage of using ++i in some languages is that it does not needlessly create a temporary (see e.g. Exceptional C++ for a thorough explanation of the C++ case).
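A rough sketch of why, using a hypothetical C++ Counter type (illustrative only, not taken from the book): the canonical post-increment operator must copy the old state before incrementing, and that copy is the temporary in question.

struct Counter {
    int value = 0;

    // Pre-increment: bump the value and return a reference to *this.
    Counter& operator++() {
        ++value;
        return *this;
    }

    // Post-increment: save a copy of the old state first;
    // this copy is the temporary that ++i avoids.
    Counter operator++(int) {
        Counter old = *this;
        ++value;
        return old;
    }
};

int main() {
    Counter c;
    ++c;  // no temporary created
    c++;  // constructs and discards a Counter temporary
}

For a built-in int the unused value costs nothing, as the identical disassembly earlier in this thread shows; for class types such as iterators, ++i avoids constructing an object the caller never reads.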
++i is no more efficient than i++ in C# if you are not assigning the result in the same expression. The same code is generated in both cases.
Of course, if you assign the result to something in the same expression, there will be a difference in the code generated - but in that event, the semantics are different and you don't have a choice of which form of increment to use.
So use whichever you like best.