I'm attempting to time a function using gettimeofday(). I can get the elapsed time in microseconds, but when I divide it by a long to convert it to seconds, the value is truncated to a whole number. Does anyone have any insight into why this is happening? I had assumed that initializing the divisor as a long with the value 1000000.0 would prevent this kind of truncation.
Timing of the function:
struct timeval t1, t2;
gettimeofday(&t1, NULL);
// Compute the product into C1 (single thread)
for (i = 0; i < n; i++)
    for (j = 0; j < p; j++)
    {
        C1[i][j] = 0;
        for (k = 0; k < m; k++)
            C1[i][j] += A[i][k] * B[k][j];
    }
gettimeofday(&t2, NULL);
Division applied here:
long divider = 1000000.0;
long elapsed = ((t2.tv_sec - t1.tv_sec) * 1000000.0L) + (t2.tv_usec - t1.tv_usec);
elapsed = (elapsed/divider);
printf("Time in seconds: %ld seconds\n", elapsed);
Any help is appreciated.