
I have gone through the already existing information available in

difference between typedef and define & typedef, #define

All the replies are really informative. I have just come across one more example; can anyone please explain the reason for this behavior?

For declaring an unsigned int variable, this will work:

#define INTEGER int
unsigned INTEGER i = 10;

But it won't work in the case of typedef, e.g.:

typedef int INTEGER;
unsigned INTEGER i = 10;

It throws the error: ‘i’ undeclared (first use in this function)

Thanks in advance!

Arti

3 Answers


The preprocessor just copies and pastes text. When you write:

#define INTEGER int
unsigned INTEGER i = 10;

the preprocessor turns the code into the following before the compiler itself ever sees it:

unsigned int i = 10;

However, typedef is handled by the compiler itself.

C11 (n1570), § 6.7.8 Type definitions

A typedef declaration does not introduce a new type, only a synonym for the type so specified. That is, in the following declarations:

typedef T type_ident;
type_ident D;

type_ident is defined as a typedef name with the type specified by the declaration specifiers in T (known as T), and the identifier in D has the type ‘‘derived-declarator-type-list T’’ where the derived-declarator-type-list is specified by the declarators of D.
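
To make the standard's notation concrete, here is how it maps onto the question's code (a worked example, not taken from the standard itself):

typedef int INTEGER;   /* T = int, type_ident = INTEGER      */
INTEGER i;             /* D declares i, and i has type int   */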

In other words, INTEGER is a synonym for int. But a typedef name is a complete type specifier by itself: it cannot be combined with unsigned, so unsigned INTEGER names a type that doesn't exist.
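
If an unsigned variant is actually wanted, the usual fix (a minimal sketch; the name UINTEGER is chosen here just for illustration) is to put the whole type inside the typedef:

typedef unsigned int UINTEGER;

UINTEGER i = 10;   /* i has type unsigned int */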

md5

typedef creates a type alias for the exact type given in the typedef declaration, and that alias is recognized by the compiler as a true type. In your case there is a proper type INTEGER, but there is no unsigned INTEGER type, which is why it's an error.

A macro (#define) is replaced by the preprocessor before the compiler even sees it. So for the compiler, the first case is simply

unsigned int i = 10;
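
One way to see this for yourself (assuming GCC or Clang) is to run only the preprocessor and inspect its output:

#define INTEGER int

/* Running "gcc -E file.c" prints what the compiler actually receives
 * after preprocessing; the declaration below comes out as:
 *     unsigned int i = 10;
 */
unsigned INTEGER i = 10;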
Some programmer dude

The reason is that typedef is not a 1:1 literal text substitution. Basically, in the first case, unsigned INTEGER is expanded to unsigned int, which is a valid type name (only certain combinations of type specifiers and qualifiers are allowed in C, and unsigned int is one of them).

In the second case, however, unsigned INTEGER is not a valid combination of type specifiers, because a typedef name cannot be combined with unsigned.
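
A minimal sketch of which specifier combinations the compiler accepts (the commented-out line is the one from the question):

typedef int INTEGER;

unsigned int a;             /* OK: unsigned combined with the keyword int */
unsigned b;                 /* OK: unsigned alone means unsigned int      */
/* unsigned INTEGER c; */   /* error: unsigned plus a typedef name is not
                               an allowed combination of type specifiers  */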