If I declare signed char k = 'a', will the value of k not be 97, which is the ASCII value of 'a', given that the range of signed char is -128 to 127? If not, then what is meant by the range of signed char?
Why would the value suddenly change? The ASCII table is a list of constants, and 'a' always maps to 97 in any language/environment that conforms to the ASCII standard (so more or less all of them).
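A minimal check, assuming an ASCII-compatible execution character set (true on virtually every modern platform):

    #include <iostream>

    int main() {
        signed char k = 'a';
        // Cast to int so the stream prints the numeric value
        // instead of the character glyph.
        std::cout << static_cast<int>(k) << '\n';  // prints 97 on ASCII systems
    }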
The range consists of the minimum and maximum values that a variable of a given type can hold.
A signed char has 8 bits, of which 1 is for the sign, leaving 7 bits of actual data that can be positive or negative. Therefore it can contain integer values between -128 and 127.
An unsigned char has 8 bits, all for actual data, so it can contain integer values between 0 and 255.
(The above naively assumes a modern system in which a char contains 8 bits; it could theoretically be more. Also, the lower limit of a signed char could be -127 rather than -128 on a system that uses ones' complement or sign-and-magnitude representation - use limits.h to be sure.)
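You can print the actual limits on your system with the macros from limits.h (<climits> in C++); a quick sketch:

    #include <climits>
    #include <iostream>

    int main() {
        std::cout << "bits per char:  " << CHAR_BIT  << '\n';
        std::cout << "signed char:    " << SCHAR_MIN << " to " << SCHAR_MAX << '\n';
        std::cout << "unsigned char:  0 to " << UCHAR_MAX << '\n';
        // Plain char has the same range as one of the two above;
        // which one is implementation-defined.
        std::cout << "plain char:     " << CHAR_MIN  << " to " << CHAR_MAX << '\n';
    }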
I'm not sure if this is what you're asking, but the C++ standard promises that all of the characters in the basic execution character set will be non-negative. From C++11 §2.3/3:
For each basic execution character set, the values of the members shall be non-negative and distinct from one another.
The basic execution character set is the 26 letters (upper and lower case) a-z and A-Z, the digits 0-9, most of the punctuation marks in the ASCII character set (for example, '$' and '@' are not included), as well as space and a few core control characters.
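A small sketch of that guarantee; the characters below are just a few arbitrary members of the basic set:

    #include <cassert>

    int main() {
        // These all belong to the basic execution character set, so the
        // standard guarantees they are non-negative even when plain char
        // is a signed type.
        for (char c : {'a', 'Z', '0', '+', ' '}) {
            assert(c >= 0);
        }
    }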
However, other characters might have a negative representation when signed char is used. For example, in one possible code page the Euro character, '€', has a value of 128 if char is unsigned, but a value of -128 if char is signed.
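A sketch showing how the same byte pattern, 0x80, reads as 128 or -128 depending on signedness (the -128 result assumes two's complement, which every mainstream platform uses and which C++20 requires):

    #include <iostream>

    int main() {
        unsigned char u = 0x80;                    // bit pattern 1000 0000
        signed char   s = static_cast<signed char>(0x80);

        std::cout << static_cast<int>(u) << '\n';  // 128
        std::cout << static_cast<int>(s) << '\n';  // -128 on two's-complement systems
    }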