I'm creating a set of enum values, but I need each enum value to be 64 bits wide. If I recall correctly, an enum is generally the same size as an int; but I thought I read somewhere that (at least in GCC) the compiler can make an enum any width it needs to be to hold its values. So, is it possible to have an enum that is 64 bits wide?
-
1So if I understand correctly, 2^32 enum values are not enough for you? Or is it an alignment concern? Why do you need those to be 64 bits instead of 32? I'm very curious. – jokoon May 06 '12 at 14:27
-
2@jokoon: I honestly don't remember anymore. I think I wanted the enums to contain values larger than 2^32-1. – mipadi Jan 04 '13 at 17:37
-
One use would be if you needed a union between an enum and a pointer. – Demi Dec 21 '13 at 01:03
-
3An important consideration in the size of `enum` is in fact memory use. Is memory optimization dead or something, or does everyone still think the compiler is the center of the universe and automagically makes everything fast and optimal without any effort on the part of the programmer? It's absurd to use a larger data type than you need, and if I only need 256 values or fewer for my enum, then why do I need 16- or 32-bit words to store them? (Data model isn't an excuse. The values usually are quite easily sign-extended, such as when stored in registers.) – AMDG May 05 '21 at 18:17
7 Answers
Taken from the current C Standard (C99): http://www.open-std.org/JTC1/SC22/WG14/www/docs/n1256.pdf
6.7.2.2 Enumeration specifiers
[...]
Constraints
The expression that defines the value of an enumeration constant shall be an integer constant expression that has a value representable as an int.
[...]
Each enumerated type shall be compatible with char, a signed integer type, or an unsigned integer type. The choice of type is implementation-defined, but shall be capable of representing the values of all the members of the enumeration.
Not that compilers are any good at following the standard, but essentially: if your enum holds anything other than an int, you're in deep "unsupported behavior that may come back to bite you in a year or two" territory.
Update: The latest publicly available draft of the C Standard (C11): http://www.open-std.org/JTC1/SC22/WG14/www/docs/n1570.pdf contains the same clauses. Hence, this answer still holds for C11.
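To make the "enumeration constant vs. enumerated type" distinction concrete, here is a minimal sketch (the enum and its names are made up for illustration; the second printed size is implementation-defined and may differ between compilers):

/* Enumeration constants have type int; the size of the enumerated
   type itself is implementation-defined. */
#include <stdio.h>

enum color { RED, GREEN, BLUE };   /* hypothetical example enum */

int main(void)
{
    printf("sizeof(RED)        = %zu\n", sizeof(RED));        /* always sizeof(int) */
    printf("sizeof(enum color) = %zu\n", sizeof(enum color)); /* implementation-defined */
    return 0;
}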

-
3Having only that, the following is valid, I think: enum { LAST = INT_MAX, LAST1, LAST2 }; so LAST2 is not representable in int, but there wasn't an expression defining it. – Johannes Schaub - litb Dec 14 '08 at 01:33
-
4In the actual PDF it defines that: "The identifiers in an enumerator list are declared as constants that have type int[...]". I've omitted that to make it not too verbose. – Michael Stum Dec 14 '08 at 01:36
-
3Note "*a* signed integer type, or *an* unsigned integer type". Not necessarily `int`. `short` and `long` are integer types too, and whatever the implementation picks, all values must fit ("*shall* be capable of representing the values of all the members of the enumeration"). – Feb 18 '16 at 17:45
-
9Notable: _enumeration constant_ and _enumerated type_ **are not the same thing**. The former are the contents of the enum declaration list, the latter is the actual variable. So while the enumeration constants must be `int`, the actual enumeration variable could be another type. This is a well-known inconsistency in the standard. – Lundin Sep 25 '17 at 11:56
-
5To clarify Lundin's point: For `enum my_enum { my_value }`, `my_value` will have type `int`, but `enum my_enum` can have an implementation defined type which must at least represent all the enumeration values. So `my_value` may have a narrowing conversion to `enum my_enum`, but it's guaranteed not to overflow. – P O'Conbhui Nov 26 '17 at 16:41
An enum is only guaranteed to be large enough to hold int values. The compiler is free to choose the actual type used based on the enumeration constants defined, so it can choose a smaller type if it can represent the values you define. If you need enumeration constants that don't fit into an int, you will need to use compiler-specific extensions to do so.
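Not part of this answer, but as an aside: later language revisions address this directly. C23 added fixed underlying types for enums (C++11 offers the same via enum class), so with a new enough compiler no vendor extension is needed. A hedged sketch, assuming a C23 compiler:

/* Sketch, assuming a C23 compiler; older compilers will reject this syntax. */
#include <stdint.h>

enum big : uint64_t {
    BIG = 0xFFFFFFFFFFFFFFFFu    /* would not fit in an int */
};

static_assert(sizeof(enum big) == sizeof(uint64_t), "expected a 64-bit enum");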

-
14Your first sentence seems to conflict with your last. Is the constraint that an `enum` should be larger than an `int` or smaller? Following @MichaelStum 's answer your first sentence should be "An `enum` is only guaranteed to fit into an `int` value." – HaskellElephant Jun 20 '14 at 09:23
-
As an ugly, implementation-sensitive hack on two's complement platforms (is that all systems these days?), you can force an enum to be as large as an int by making sure it contains negative values. Not a recommended technique though. – persiflage Sep 23 '16 at 19:37
-
7This answer seems to suggest that an enum is as large as an `int`. [Michael Stum's answer](https://stackoverflow.com/a/366033/971090), which references C99, says that an enum may be as small as a `char`. – Frank Kusters Sep 15 '17 at 06:46
-
3The first sentence of this answer is incorrect. The `enum` is only guaranteed to be large enough to hold the value of the largest enumerator in the enum. – M.M Sep 28 '18 at 08:11
-
@M.M, he meant to hold a maximum of `int` values. It is a correct statement. Your statement is actually incorrect. An `enum` will not hold the largest enumerator if that enumerator is larger than an `int`. I learned this by having my 64-bit enumerator truncated to the maximum of `int` (which is 32-bit in my case). – SO_fix_the_vote_sorting_bug Jun 05 '21 at 13:57
-
@jdk1.0 in Standard C all enumerators have type `int`. The behaviour you described is a compiler-specific extension. – M.M Jun 06 '21 at 04:08
While the previous answers are correct, some compilers have options to break the standard and use the smallest type that will contain all values.
Example with GCC (documentation in the GCC Manual):
enum ord {
    FIRST = 1,
    SECOND,
    THIRD
} __attribute__ ((__packed__));

STATIC_ASSERT( sizeof(enum ord) == 1 )
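STATIC_ASSERT is assumed to be a user-supplied macro; the answer does not define it. With a C11 compiler, one hypothetical definition would be:

/* Hypothetical STATIC_ASSERT macro wrapping C11's _Static_assert;
   with this definition the call above needs a trailing semicolon. */
#define STATIC_ASSERT(expr) _Static_assert((expr), #expr)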
-
14Actually, as far as I can see this does not break the standard. As explained in Michael Stum's answer, the standard allows the compiler to choose the actual type of the enums, as long as all values fit. – sleske Sep 07 '15 at 10:27
-
2I've worked with MacOS C++ compilers that do exploit the limited range of values in an enum to store them in smaller types. Can't remember if it was Metrowerks Codewarrior or XCode. This is within the C++ standard. You cannot assume sizeof(MyEnum) == sizeof(int) in general. – persiflage Sep 23 '16 at 19:29
Just set the last value of the enum to a value large enough to make it the size you would like the enum to be; it should then be that size:
enum value{a=0,b,c,d,e,f,g,h,i,j,l,m,n,last=0xFFFFFFFFFFFFFFFF};
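To check this claim with your own compiler, here is a minimal test sketch (GCC accepts enumerators outside the range of int only as an extension, and as the comments below show, the claim does not hold in general):

/* Test sketch: print the sizes involved; results are compiler-dependent. */
#include <stdio.h>

enum value { a = 0, b, c, d, e, f, g, h, i, j, l, m, n,
             last = 0xFFFFFFFFFFFFFFFF };

int main(void)
{
    printf("sizeof(enum value) = %zu\n", sizeof(enum value));
    printf("sizeof(a) = %zu, sizeof(last) = %zu\n", sizeof a, sizeof last);
    return 0;
}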

- 59
- 7
-
3While this code may answer the question, providing additional context regarding how and/or why it solves the problem would improve the answer's long-term value. – leopal Jan 15 '20 at 11:53
-
1This is false. Compiling your example with gcc 8.4.0: `sizeof(a)` is 4 and `sizeof(last)` is 8. – Mateo de Mayo Mar 03 '21 at 16:28
-
2typedef enum {a1=0,b1,c1,d1,f1,last1=0xFFFFFFFFFFFFFFFF} testing1; typedef enum {a2=0,b2,c2,d2,f2,last2=0xFFFFFFFF} testing2; sizeof a variable of type testing1 is 8 and sizeof a variable of type testing2 is 4. – scirdan Mar 04 '21 at 16:45
In the C language, an enum is guaranteed to be the size of an int. There is a compile-time option (-fshort-enums) to make it as short as possible (this is mainly useful in case the values are not more than 64K). There is no compile-time option to increase its size to 64 bits.
Consider this code:
enum value{a,b,c,d,e,f,g,h,i,j,l,m,n};
value s;
cout << sizeof(s) << endl;
It will give 4 as output. So no matter the number of elements an enum contains, its size is always fixed.
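A C version of the same check, which also makes it easy to see the effect of GCC's -fshort-enums (output is compiler- and option-dependent):

/* Compile once with "gcc test.c" and once with "gcc -fshort-enums test.c"
   and compare the printed size (the file name is only an example). */
#include <stdio.h>

enum value { a, b, c, d, e, f, g, h, i, j, l, m, n };

int main(void)
{
    printf("sizeof(enum value) = %zu\n", sizeof(enum value));
    return 0;
}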
-
7Michael Stum's answer is correct. This is compiler specific. You can try it out yourself with IAR EWARM. IAR EWARM shows 1 for your example. If there is up to 255 items it still shows 1. After adding 256th item it goes up to 2. – desowin Nov 25 '15 at 07:43
-
7important things to realise before writing any more C or C++: Just because it compiles doesn't mean it's legal per the Standard. Just because you get a given result doesn't mean the Standard says that you always will or that other users will when they run your code. Questions like this need an answer that references the Standard or *at least* implementation-defined spec for a given compiler/ABI. Simply compiling and running a program and seeing one result on one day conveys no lesson about such questions (and very little about anything else). – underscore_d Sep 16 '18 at 16:20