This program finds the distance between two floating-point points on a plane. What I can't figure out (which might be due to my ignorance) is the problem with computing the dist variable. It is set to a Pythagorean formula that should return the distance, and every other variable in the program holds the correct value.
What I mostly have a problem with, though, is what looks like an obvious bug in the use of sqrt() in this example. For instance, if xdist = 1 with everything else set to 0, the distance should be 1. However, what gets printed for dist is exactly 1072693248. I found only one question related to this number, but its answers said that value only comes from misusing %d in printf, and as far as I can tell there is no problem with printf or its format strings anywhere in my program. What is the solution to get the correct value?
#include <stdio.h>
#include <math.h>

double distance(xdist, ydist)
{
    return sqrtf(pow(xdist, 2.0) + pow(ydist, 2.0));
}

int main()
{
    double x1, y1, x2, y2;
    double xdist, ydist, dist;
    int i;

    for (i = 1; i <= 2; i++)
    {
        printf("%d: Enter an x and y coordinate:\n", i);
        switch (i)
        {
        case 1:
            scanf("%lf", &x1);
            scanf("%lf", &y1);
            continue;
        case 2:
            scanf("%lf", &x2);
            scanf("%lf", &y2);
            continue;
        }
    }

    xdist = fabs(x2 - x1);
    ydist = fabs(y2 - y1);
    dist = distance(xdist, ydist);

    printf("\nxdist = %.2f, ydist = %.2f\n", xdist, ydist);
    printf("The distance between coordinates \n(%.2f, %.2f) to (%.2f, %.2f) \nis %.2f.", x1, y1, x2, y2, dist);
    return 0;
}