Why am I getting different values of i?
#include <stdio.h>
void pri(int,int);
int main()
{
    float a = 3.14;
    int i = 99;
    pri(a, i);
    getch();
}

void pri(int a, int i)
{
    printf("a=%f i=%d\n", a, i);
    printf("i=%d a=%f\n", i, a);
}
You declared a as an int, but you are printing it with %f, so it should be declared as a float (remember to change the prototype to void pri(float, int); as well, so it matches the definition):
void pri(float a, int i)
{
    printf("a=%f i=%d\n", a, i);
    printf("i=%d a=%f\n", i, a);
}
If an argument has the wrong type for its conversion specification, you get undefined behaviour. The C99 specification for printf reads (7.19.6.1 paragraph 9):

If a conversion specification is invalid, the behavior is undefined. If any argument is not the correct type for the corresponding conversion specification, the behavior is undefined.

The second sentence is exactly your case.
To explain the question in your comment:

printf("a=%f i=%d\n", a, i);

conceptually evaluates to something like:

printf("a=");
printf(/* read sizeof(double) bytes of arguments, because %f expects a double */);
printf(" i=");
printf(/* read whatever comes next in the argument area; a and i may already have been consumed */);

Since %f reads a double (8 bytes on typical platforms) but you passed an int (typically 4 bytes), every specifier after it reads from the wrong place, which is why i appears to change between the two lines.
You can rescue the function as written by casting at the call site:

printf("a=%f i=%d\n", (float)a, i);

(The float is promoted to double by the default argument promotions, which is what %f expects.)