2

Does this change the way the values are stored or incremented at all within the enum? If they are the same, why do people define it as 0x000?

Justin
  • One's 0 in octal, the other's 0 in hex. Same number. – chris Oct 22 '12 at 19:01
  • @H2CO3 Nope: http://stackoverflow.com/questions/6895522/is-0-a-decimal-literal-or-an-octal-literal – Mysticial Oct 22 '12 at 19:02
  • @Mysticial, Thanks, that's the one I was thinking of. – chris Oct 22 '12 at 19:03
  • Octal indeed. Although, it's moot anyway. – mattjgalloway Oct 22 '12 at 19:03
  • I believe `0` is signed and `0x0000` is unsigned. But that probably doesn't make a difference for an enum. – Mysticial Oct 22 '12 at 19:03
  • @Mysticial bad design of the grammar? –  Oct 22 '12 at 19:05
  • It won't make any difference between 0 and 0x0000 at the start of an enum value, though. Internally they're all the same. – Aniket Inge Oct 22 '12 at 19:05
  • @Mysticial Just your opinion. (IMHO this is *really* flawed.) –  Oct 22 '12 at 19:06
  • @H2CO3 My opinion is, "I really don't give a sh1t". They're both zero. :) – Mysticial Oct 22 '12 at 19:06
  • @Mysticial Well, this was somewhat of a shock for me. :) –  Oct 22 '12 at 19:07
  • I think the reason they included that in the grammar was because if 0 wasn't an octal number, the grammar would be ambiguous here: 0 (decimal) vs. 0 (octal). I mean, how the hell do you represent 0 in octal and decimal? – Aniket Inge Oct 22 '12 at 19:07
  • @PrototypeStark 0 in octal: `00` (would be...) –  Oct 22 '12 at 19:08
  • @H2CO3 and in Decimal? Compiler: "0 identified as octal number oh wait decimal identifier also says 0. AMBIGUOUS and CONFUSED". Computer says "deal with it, noob-compiler!!! HARR!!" – Aniket Inge Oct 22 '12 at 19:13
  • @H2CO3: There is no *real* flaw. It does not really matter whether the token '0' is considered to be a 0 encoded in octal or in decimal, in any case it is a literal of value 0. Try to rewrite the grammar so that '0' is not an octal literal and you will end up with an equivalent grammar that is slightly more complicated. – David Rodríguez - dribeas Oct 22 '12 at 19:13
  • @DavidRodríguez-dribeas Of course I know this doesn't make a difference, I'm just wondering... –  Oct 22 '12 at 19:14
  • @PrototypeStark No. If the parser is hand-written (which it is...), then looking one token ahead would eliminate all the obscurity. –  Oct 22 '12 at 19:15
  • If 0 is identified as decimal, all the following 'number literals' will have to be taken as decimals too. For example 017 will be equal to 17 in decimal. This is wrong. 017 is octal. I am just saying. :-? – Aniket Inge Oct 22 '12 at 19:15
  • @PrototypeStark `0` as a single-character token could be identified as decimal. `0` and consecutive digits could be identified as octal (as a whole). Voila, no ambiguity :) –  Oct 22 '12 at 19:17

3 Answers

7

No difference; it's just a readability thing. For instance, writing the values in hexadecimal indicates that they are used in some sort of bitwise context, such as bit flags:

enum Flags {
    FLAG_NONE   = 0x0000,  // no bits set
    FLAG_READ   = 0x0001,  // bit 0
    FLAG_WRITE  = 0x0002,  // bit 1
    FLAG_APPEND = 0x0004,  // bit 2
    FLAG_TEXT   = 0x0008,  // bit 3
    FLAG_MEMMAP = 0x0010   // bit 4
};
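
For what it's worth, here's a minimal usage sketch of my own (not part of the original answer), assuming C++ and the Flags enum above: flags that each occupy a distinct bit can be combined with | and tested with &.

#include <cstdio>

int main() {
    // Combine flags with bitwise OR; each hexadecimal value above is a distinct bit.
    int mode = FLAG_READ | FLAG_WRITE;

    // Test for individual flags with bitwise AND.
    if (mode & FLAG_WRITE)
        std::printf("write flag is set\n");
    if (!(mode & FLAG_APPEND))
        std::printf("append flag is not set\n");
}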
John Kugelman
2

No.
0x0000 (append as many 0's as you want) is just 0 in hexadecimal.
Sometimes all the numbers in the enum are hexadecimal. Since they are all hexadecimal, you just define the first one in hexadecimal too, because it looks cleaner.
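
As a quick illustration (a sketch I've added, not part of the original answer; static_assert needs C++11), both spellings give exactly the same value:

enum Example {
    ZERO_DEC = 0,       // plain zero
    ZERO_HEX = 0x0000   // the same zero, written in hexadecimal
};

static_assert(ZERO_DEC == ZERO_HEX, "0 and 0x0000 are the same value");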

qwertz
  • don't append beyond 8 zeros, enums are converted to integers internally. Overflow problem if the next number is taken as 0x0000000000000001 – Aniket Inge Oct 22 '12 at 19:09
  • @PrototypeStark not really. Enums are converted to whatever smallest type fits them. So, if you write `0x0000000000001`, and you have a 64-bit integer type, then it's converted to that. –  Oct 22 '12 at 19:10
  • @H2CO3 Depends on your compiler. `gcc` will always use `int` if not modified with `__attribute__((packed))` (if so, the smallest possible type will be used). But `0x0000000000001` is just 1 in decimal, so the size of the enum will be 1 byte. – qwertz Oct 22 '12 at 19:17
  • The number of zeros does not affect the type of an integer literal. The value is what matters. – bames53 Oct 22 '12 at 19:17
  • @bames53 yep, well spotted - of course I meant `0x1000000000000000`. –  Oct 22 '12 at 19:18
  • @Coodey If I recall correctly, I read this statement of mine in the standard. In this case, either I am remembering wrong, or GCC is non-conforming. –  Oct 22 '12 at 19:19
  • @Coodey: GCC does not always use `int`. It uses `int` if the values will fit in an `int`. GCC is compliant with the standard. `sizeof(Big)` is 8 with `enum Big { val = 0x1000000000000000 };`. – David Hammen Oct 22 '12 at 19:51
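
To make that last point concrete, here's a small sketch of my own (not from the comment thread; assumes C++11 for static_assert and typical 8-bit-byte platforms): the number of zeros in the literal is irrelevant, only its value determines whether the enum's underlying type has to grow.

enum Small { small_val = 0x0000000000000001 };  // value 1, despite all the zeros
enum Big   { big_val   = 0x1000000000000000 };  // value 2^60, needs 64 bits

static_assert(sizeof(Small) <= sizeof(int), "small values fit in int (or narrower)");
static_assert(sizeof(Big)   >= 8,           "a large value widens the underlying type");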
0

There is no difference for those specific values, they're exactly the same.

But for other values, remember that prepending 0 makes the literal an octal constant. This means you want to avoid writing values like 000, 001, 002, 010, 044, etc. (in an attempt to keep the constants the same length), because 010 is the value eight, not ten.
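
For example (a sketch I've added, not part of the original answer; static_assert needs C++11), padding with leading zeros silently changes the values:

enum Padded {
    A = 000,   // octal 0  -> 0, harmless
    B = 001,   // octal 1  -> 1, harmless
    C = 010,   // octal 10 -> 8, not 10!
    D = 044    // octal 44 -> 36, not 44!
};

static_assert(C == 8,  "010 is an octal literal, i.e. eight");
static_assert(D == 36, "044 is an octal literal, i.e. thirty-six");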

Mark B