Just a general question about programming: when you define a value in C (or any language, I suppose), how does the compiler know how to treat that value? For example:
#define CountCycle 100000
I would assume CountCycle is a "long integer" data type, but that's just an assumption. I suppose it could also be a float, a double (not an int, as that maxes out at ~32k), etc.
How does the compiler choose the data type for a #define value? I have no application for the answer to this question; I'm just curious.