Since I am calling an unmanaged DLL from C#, I ran some tests comparing the performance of a for-loop in C# versus C.
The result surprised me: for a small range the C# loop outperformed the C call, but as the upper bound of the loop grows, the C# performance degrades relative to C.
Here is my test code:
[DllImport("Testing.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern int SumLoop(int lowLimit, int highLimit);
public static void Main(string[] args)
{
const int LowerRange = 1;
const int HigherRange = 1000000;
// Test with C# For Loop
var watch1 = new Stopwatch();
watch1.Start();
int sum = 0;
for (int i = LowerRange; i <= HigherRange; i++)
{
sum += i;
}
watch1.Stop();
long elapseTime1 = watch1.ElapsedMilliseconds;
// Test with C-for loop
var watch2 = new Stopwatch();
watch2.Start();
int sumFromC = SumLoop(LowerRange , HigherRange);
long elapseTime2 = watch2.ElapsedMilliseconds;
}
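One variation I am considering (just a sketch, with an arbitrary iteration count of 10): calling SumLoop once before starting the stopwatch, so the one-time cost of creating the P/Invoke marshalling stub on the first call is excluded, and then averaging several timed runs:

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class PInvokeTiming
{
    [DllImport("Testing.dll", CallingConvention = CallingConvention.Cdecl)]
    static extern int SumLoop(int lowLimit, int highLimit);

    static void Main()
    {
        SumLoop(1, 1);                 // warm-up: builds the marshalling stub
        const int Iterations = 10;     // arbitrary repeat count
        var watch = Stopwatch.StartNew();
        for (int run = 0; run < Iterations; run++)
        {
            SumLoop(1, 100000000);
        }
        watch.Stop();
        Console.WriteLine("C loop avg: {0:F1} ms",
            watch.ElapsedMilliseconds / (double)Iterations);
    }
}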
Testing.dll:
__declspec(dllexport) int SumLoop(int lowLimit, int highLimit)
{
    int idx;
    int totalSum = 0;
    for (idx = lowLimit; idx <= highLimit; idx = idx + 1)
    {
        totalSum += idx;
    }
    return totalSum;
}
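One thing that could skew the C side is the DLL's optimization settings; I am assuming an optimized (release) build. With the MSVC command-line tools that would be roughly this, assuming the source file is named Testing.c:

cl /O2 /LD Testing.c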
Testing results:

HigherRange     C# loop    C loop
1,000,000       4 ms       9 ms
10,000,000      53 ms      36 ms
100,000,000     418 ms     343 ms
I started these tests expecting the C for-loop to outperform the C# loop. At first the result was exactly the opposite of my understanding, and I found this question and agreed with it. But as I increase the upper range of the loop, C does perform better than C#.
So: is my testing approach wrong, or is this the expected performance result?