I've been seeing rather poor performance from some computational code under Ubuntu on a brand-new headless workstation I'm using for scientific computation. I first noticed the speed difference running some moderately complex code on Ubuntu versus on the old Mac laptop I use for development, but I've managed to distill it down to an incredibly simple example that still shows a less-than-stellar improvement over my old machine:
#include <stdio.h>
#include <math.h>

int main() {
    double res = 0.0;
    for (int i = 1; i < 200000000; i++) {
        res += exp((double) 100.0 / i);
    }
    printf("%lf", res);
    return 0;
}
Now, the Mac is a nearly five-year-old 2.4GHz Core 2 Duo MacBook Pro running OS X 10.5, and it runs this code in about 6.8 secs. However, on a brand-new 3.4GHz Core i7 Dell running Ubuntu 11.10, it takes about 6.1 secs! Can someone enlighten me as to what is going on here? It seems absurd that a nearly five-year-old laptop is within 10% of a brand-new desktop workstation, and all the more so because I can see the Core i7 turbo-boosting to nearly 4GHz with monitoring tools.
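The timings above are whole-process runs, but in case the measurement method is suspect, the loop alone can be timed in-process along these lines. This is just a sketch, assuming POSIX clock_gettime (which needs -lrt on Ubuntu 11.10's glibc and isn't available on OS X 10.5, so there I time the whole process instead):

#include <stdio.h>
#include <math.h>
#include <time.h>

int main() {
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    double res = 0.0;
    for (int i = 1; i < 200000000; i++) {
        res += exp((double) 100.0 / i);
    }

    clock_gettime(CLOCK_MONOTONIC, &t1);
    /* Wall-clock time for the loop only, excluding process startup. */
    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("res = %lf, loop: %.3f secs\n", res, secs);
    return 0;
}

On the Ubuntu box that would be compiled with something like gcc -o timing timing.c -std=gnu99 -O2 -lm -lrt.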
Mac compiled with:
gcc -o test test.c -std=gnu99 -arch x86_64 -O2
Ubuntu compiled with:
gcc -o test test.c -std=gnu99 -m64 -O2 -lm
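For what it's worth, my idea for narrowing this down is to compare against the same loop with the exp() call stripped out: if the plain-division version shows the expected speedup on the i7 while the exp() version doesn't, the gap presumably lies in libm's exp() rather than in the hardware. A sketch of that check (I haven't drawn any conclusions from it yet):

#include <stdio.h>

int main() {
    double res = 0.0;
    /* Same loop without the libm call, to isolate the cost of exp(). */
    for (int i = 1; i < 200000000; i++) {
        res += 100.0 / i;
    }
    printf("%lf\n", res);
    return 0;
}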
Thanks,
Louis