It seems there is no difference between initializing a char with 0 and with '\0' in the generated binary code in C. If so, why does C have '\0' at all, rather than just 0, for terminating strings?
-
`'\0' is char`... how? – Sourav Ghosh Mar 14 '18 at 06:02
-
`'\0'` is not a `char`. It is a _character constant_ of type `int`. – chux - Reinstate Monica Mar 14 '18 at 06:23
-
BTW, in C++ instead there *is* a difference - `0` is an `int`, `'\0'` is a `char`. – Matteo Italia Mar 14 '18 at 06:38
3 Answers
The only thing I can think of is readability: '\0' is a better indication to people reading your code that you intend a char here. Aside from that, I have to admit there's no difference at all. In fact, readability isn't even an issue for experienced C programmers.

Yeah, there's no difference when initializing a single character. But the purpose of \0 is that it allows you to embed a null character in a string literal:
"foo\0bar"
just like any other escape sequence. The fact that it's identical to 0 for a single character is just a happy (designed) coincidence. You are naming an octal value with the escape sequence.

-
@ikegami - I was being ironic. Of course it's *designed* to give the same octal values. – StoryTeller - Unslander Monica Mar 14 '18 at 06:37
-
Take the string "abc123", try embedding a null character after "abc" in it. See what happens. – Art Mar 14 '18 at 07:06
-
@Art - And? String literals have greater utility than merely specifying null terminated strings. That includes having a 0 character in the dead center. "See what happens", right. – StoryTeller - Unslander Monica Mar 14 '18 at 07:08
-
@StoryTeller It should demonstrate two things: 1. it's not a happy coincidence. 2. It's probably a bad idea to tell people that the purpose of '\0' is to embed null characters in the middle of strings, since it has quite terrible failure modes. Like trying to embed a null character before other numbers. – Art Mar 14 '18 at 07:11
-
@Art - (1) See my comment to ikegami. (2) With all due respect, "it's probably a bad idea" only as a matter of your opinion. Telling people they can embed a null character in a literal, or any character with any value for that matter, is meant to introduce them to a wider world than mere null terminated ASCII text. – StoryTeller - Unslander Monica Mar 14 '18 at 07:13
-
Or a wider world embedding a newline if they happen to want to embed a null character between "abc" and "123". – Art Mar 14 '18 at 07:17
-
@Art what is your terrible failure mode for putting a \0 byte before other numbers? – Gerhardh Mar 14 '18 at 07:18
You can use a variety of escape sequences in string literals:
"\n \t \x0a \123"
// symbolic, hex, octal
All of these can also be used in character literals:
'\n', '\t', '\x0a', '\123'
So why exclude '\0' (a particular octal value) specifically? Why wouldn't it be valid in character literals?
While there is no semantic difference between 0 and '\0', they can be used to express the programmer's intent: 0 could be any integer zero, but '\0' is generally only used for NUL bytes.
