
Last night I was toying a bit with the performance difference between managed and unmanaged code, in this case C# and C++. To put it simply, I was pretty shocked by the difference, and I want to share this experience and perhaps be enlightened if I am doing something wrong.

My test application calculated the first one million prime numbers. My test machine was an Intel i7-920, overclocked to 3.81 GHz.

Disclaimer: I didn't do any algorithm optimization of any sort. The algorithm is taken from a previous Stack Overflow topic. The code is 100% identical in both languages, and I deliberately did not take advantage of any parallelism, to keep the test fair. Please take into account that I am not a C++ programmer; this is my first time playing with C++. I mainly do C# and Java. I wrote the C++ part as a DLL, which I invoke from C# via P/Invoke.

C++:

#include <stdio.h>

extern "C"
{
    // Returns the n-th prime number, found by plain trial division.
    __declspec(dllexport) long FindPrimeNumbers(int n)
    {
        int count = 0;
        long a = 2;
        while(count < n)
        {
            long b = 2;
            int prime = 1;
            while(b * b <= a)
            {
                if(a % b == 0)
                {
                    prime = 0;
                    break;
                }
                b++;
            }
            if(prime > 0)
                count++;
            a++;
        }
        return (--a);
    }
}
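
For reference, a DLL like this can be built with the MSVC command-line compiler roughly like so (I'm assuming the C++ source is saved as myDLL.cpp so that the output name matches the DllImport declaration below; the DLL's bitness also has to match the C# process):

cl /O2 /LD myDLL.cpp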

C#:

// Returns the n-th prime number, found by plain trial division.
static long PrimeNumbers(int n)
{
    int count = 0;
    long a = 2;
    while (count < n)
    {
        long b = 2;
        int prime = 1;
        while (b * b <= a)
        {
            if (a % b == 0)
            {
                prime = 0;
                break;
            }
            b++;
        }
        if (prime > 0)
            count++;
        a++;
    }
    return (--a);
}

Please remember, I declare the C++ function in C# and call it through P/Invoke:

// Note: C++ 'long' is 32 bits on Windows, so the native return value maps to int on the C# side.
[DllImport("myDLL.dll", CallingConvention = CallingConvention.Cdecl)]
public static extern int FindPrimeNumbers(int n);


static void Main(string[] args)
{
    const int numbersToFind = 1000000;
    Stopwatch sw = new Stopwatch();
    sw.Start();
    PrimeNumbers(numbersToFind); // C#
    sw.Stop();
    Console.WriteLine(sw.Elapsed);
    sw.Restart();
    FindPrimeNumbers(numbersToFind); // C++ via P/Invoke
    sw.Stop();
    Console.WriteLine(sw.Elapsed);
    Console.ReadLine();
}

There is a HUGE difference in the results between these two implementations:

C# = 00:00:32.6975885
C++ = 00:00:10.5427109

I know the algorithm could be optimized, for example by using the parallel for loop provided by .NET, but that is not the point. I just want to know: is this a normal performance difference between unmanaged and managed code, or have I missed something? I searched Google yesterday for the performance difference between C# and C++, and there were so many topics where people would swear on their graves that the performance of the two languages is identical.
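
Just to clarify what I mean by the parallel for loop, here is a rough sketch (not part of my benchmark). Since the original loop finds the first n primes sequentially, the sketch instead counts the primes below a fixed bound, which parallelizes naturally; the class and method names are made up for illustration.

using System.Threading;
using System.Threading.Tasks;

static class ParallelSketch
{
    // Counts the primes below 'limit' by trial division, spread across all cores.
    // Each thread keeps its own local count; the local counts are merged at the end.
    static long CountPrimesBelow(int limit)
    {
        long total = 0;
        Parallel.For(2, limit,
            () => 0L,                       // per-thread local count
            (i, state, local) =>
            {
                bool prime = true;
                for (long b = 2; b * b <= i; b++)
                {
                    if (i % b == 0) { prime = false; break; }
                }
                return prime ? local + 1 : local;
            },
            local => Interlocked.Add(ref total, local)); // merge local counts
        return total;
    }
}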

Daniel