I think this question has already been answered in a couple of previous posts. The basic idea is:
A) Consider this program in C:
#include <stdio.h>

#define x 3.0  /* without a suffix, the compiler treats it as a double */
#define y 3.0f /* the 'f' suffix tells the compiler we want a float */

int main(void) {
    printf("%zu\n", sizeof(x)); /* outputs 8 */
    printf("%zu\n", sizeof(y)); /* outputs 4 */
    return 0;
}
Basically, double has more precision than float, so the compiler prefers it when the type is ambiguous. If you write a floating-point constant without the 'f' suffix, it is treated as a double.
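If you want to see the literal's type directly rather than inferring it from its size, C11's _Generic selection can report it. This is just a minimal sketch and assumes a C11-capable compiler:

#include <stdio.h>

int main(void) {
    /* _Generic (C11) picks the branch matching the static type of its operand */
    printf("3.0 is a %s\n",  _Generic(3.0,  float: "float", double: "double", default: "other")); /* double */
    printf("3.0f is a %s\n", _Generic(3.0f, float: "float", double: "double", default: "other")); /* float */
    return 0;
}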
B) Look into this one now:
#include <stdio.h>

#define x 'a'

int main(void) {
    char ch = 'b';
    printf("%zu\n", sizeof(x));  /* outputs 4: 'a' is a character constant, which has type int */
    printf("%zu\n", sizeof(ch)); /* outputs 1: ch was declared as char, so its size is exactly 1 */
    return 0;
}
A character constant like 'a' has type int in C; its value is the character's code (97 in ASCII). That's why sizeof('a') equals sizeof(int), which is 4 on most platforms.
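Here is a minimal sketch to check this yourself (assuming an ASCII system and a typical 4-byte int):

#include <stdio.h>

int main(void) {
    /* In C, a character constant has type int, not char */
    printf("%d\n", 'a');                        /* 97 on ASCII systems */
    printf("%zu\n", sizeof('a'));               /* same as sizeof(int), typically 4 */
    printf("%zu\n", sizeof(int));               /* typically 4 */
    printf("%d\n", sizeof('a') == sizeof(int)); /* 1 (true) in C; in C++ 'a' is a char, so this would be 0 */
    return 0;
}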
Please refer to this link for more details:
https://stackoverflow.com/questions/433895/why-are-c-character-literals-ints-instead-of-chars#:~:text=When%20the%20ANSI%20committee%20first,of%20achieving%20the%20same%20thing.