Why is the output of the following C code 0.000000?
#include <stdio.h>
void foo(float *);

int main()
{
    int i = 10, *p = &i;
    foo(&i);
}

void foo(float *p)
{
    printf("%f\n", *p);
}
Please explain your answer.
You are making printf() print the bits of an int as if they were the bits of a float; reading an int object through a float * like that is undefined behavior. What result did you expect?
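For what it is worth, here is why 0.000000 in particular tends to come out. Assuming (and this is only an assumption about your platform) that int and float are both 32 bits and floats are IEEE-754, the int 10 has the bit pattern 0x0000000A, and that pattern read as a float is a tiny subnormal number, far too small for %f to show at its default six decimal places. A small sketch that does the reinterpretation with memcpy, which is well defined:
#include <stdio.h>
#include <string.h>

int main(void)
{
    int i = 10;                  /* bit pattern 0x0000000A */
    float f;
    memcpy(&f, &i, sizeof f);    /* reinterpret the bits without undefined behavior */
    printf("%g\n", f);           /* roughly 1.4e-44, a subnormal */
    printf("%f\n", f);           /* displayed as 0.000000 at default precision */
    return 0;
}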
First of all, as your code stands, the pointer declared in int i = 10, *p = &i; is never used afterwards, so that part of the line does nothing at all.
Next, you are passing a pointer to your int variable to a function that expects a float *. Like @unwind mentioned in the comments, there is no way for foo() to know that you lied to it.
Typecasting is also something different from what you seem to think it is:
#include <stdio.h>
void foo(float *);

int main()
{
    int i = 10;  // try this and see it fail (incompatible pointer type),
                 // then change this line to float i = 10; and try again
    foo(&i);
}

void foo(float *p)
{
    printf("%f\n", *p);
}
EDIT> If you insist on having a typecast somewhere...
#include <stdio.h>
void foo(float *);

int main()
{
    int i = 10;
    float p = (float)i;  /* the value is converted to float here, before the call */
    foo(&p);
    return 0;
}

void foo(float *p)
{
    printf("%f\n", *p);
}
Variadic functions in C and C++ take a variable number of arguments whose types they do not know. In the case of printf, the first argument is a format string whose conversion specifiers are supposed to describe the types of the arguments that follow. In your code the specifier says the corresponding argument is a float (%f), but what you pass it is really an int. Since this matching happens only at run time, the compiler knows nothing about it and performs no automatic conversion for you. Your code invokes undefined behavior.
Try this one:
#include <stdio.h>
void foo(int *);

int main()
{
    int i = 10;
    foo(&i);
}

void foo(int *p)
{
    printf("%f\n", (float)*p);  /* convert the value explicitly before printing */
}
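As an aside that goes with the variadic-function point above: in a variadic call a float argument is promoted to double by the default argument promotions, which is why %f works for both float and double. A minimal illustration, independent of your code:
#include <stdio.h>

int main(void)
{
    float f = 10.0f;
    double d = 10.0;
    /* both arguments reach printf as double, so %f prints both correctly */
    printf("%f %f\n", f, d);
    return 0;
}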