Today I wrote a simple test to compare the speed of Java and C: a loop that increments an integer i from 0 to two billion.
I really expected C to be faster than Java, so I was surprised by the outcome:
time for the Java version: approx. 1.8 seconds
time for the C version: approx. 3.6 seconds
I DO NOT think that Java is the faster language, but I also do not understand why this loop runs twice as fast in Java as in C in my simple programs.
Did I make a crucial mistake in one of the programs? Or is my MinGW compiler badly configured or something?
THE JAVA PROGRAM

public class Jrand {
    public static void main(String[] args) {
        long startTime = System.currentTimeMillis();

        int i;
        for (i = 0; i < 2000000000; i++) {
            // Do nothing!
        }

        long endTime = System.currentTimeMillis();
        float totalTime = (endTime - startTime);          // elapsed milliseconds
        System.out.println("time: " + totalTime / 1000);  // print as seconds
    }
}
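In case timer resolution is part of the problem, here is the same measurement sketched with System.nanoTime(), the JDK's monotonic timer meant for measuring elapsed intervals (the class name JrandNano is just my made-up name for this variant):

public class JrandNano {
    public static void main(String[] args) {
        long startTime = System.nanoTime();   // monotonic, intended for intervals

        for (int i = 0; i < 2000000000; i++) {
            // Do nothing!
        }

        long endTime = System.nanoTime();
        System.out.println("time: " + (endTime - startTime) / 1e9 + " s");
    }
}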
THE C PROGRAM
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    clock_t startTime = clock();

    int i;
    for (i = 0; i < 2000000000; i++) {   // < rather than <=, to match the Java loop
        // Do nothing
    }

    clock_t endTime = clock();
    // clock() counts in CLOCKS_PER_SEC ticks (1000 on Windows/MinGW),
    // so divide by CLOCKS_PER_SEC instead of a hard-coded 1000
    double totalTime = (double)(endTime - startTime) / CLOCKS_PER_SEC;
    printf("time: %f\n", totalTime);
    return 0;
}
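Since I suspect the compiler (or my MinGW setup) might be treating the empty loop differently depending on optimization, here is a variant I could test next, where the counter is volatile so the compiler is forced to perform every increment and cannot drop the loop entirely. That the gap comes from optimization flags or dead-code elimination is only my assumption, not something I have verified:

#include <stdio.h>
#include <time.h>

int main(void) {
    clock_t startTime = clock();

    // volatile makes every read/write of i observable,
    // so the optimizer cannot eliminate the empty loop
    volatile int i;
    for (i = 0; i < 2000000000; i++) {
        // Do nothing
    }

    clock_t endTime = clock();
    double totalTime = (double)(endTime - startTime) / CLOCKS_PER_SEC;
    printf("time: %f\n", totalTime);
    return 0;
}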