
in my C++ program, I have the following line:

text += (char)value;

The value is an ASCII value and whenever this value is greater than 127, the corresponding character (added to the text) has a negative value. Why ? And how can I prevent that ? Note: text is just a string

I found this question on Stack Overflow. There is a related thread, but no solution was provided there.

abdullah celik
  • It's *implementation defined* (i.e. it's up to the compiler) if `char` is signed or unsigned. And as on almost all systems `char` is an 8-bit type, a signed `char` will have a range from `-128` to `127`. – Some programmer dude Sep 21 '21 at 16:01
  • On another note, if you ever feel the need to do a C-style cast (like `(char)` in `(char)value`) then you should take that as a sign that you're doing something wrong. If the compiler gives you a warning about it, then it probably has a good reason, and the right solution is almost never to do a cast to silence the compiler. – Some programmer dude Sep 21 '21 at 16:11
  • 5
    There are no ASCII characters with values greater than 127. – n. m. could be an AI Sep 21 '21 at 16:17
  • 2
    `static_cast(value)` – nurettin Sep 21 '21 at 16:22
  • What is it that you're trying to do? – Joseph Larson Sep 21 '21 at 16:55
  • [ES.48: Avoid casts](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#es48-avoid-casts) [ES.49: If you must use a cast, use a named cast](https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines#es49-if-you-must-use-a-cast-use-a-named-cast) – MatG Sep 21 '21 at 17:26
  • @nurettin: thanks. it worked. – abdullah celik Sep 21 '21 at 17:39
  • Does this answer your question? [Why is 'char' signed by default in C++?](https://stackoverflow.com/questions/17097537/why-is-char-signed-by-default-in-c) – phuclv Sep 22 '21 at 06:41
  • [Why don't the C or C++ standards explicitly define char as signed or unsigned?](https://stackoverflow.com/q/15533115/995714), [Is char signed or unsigned by default?](https://stackoverflow.com/q/2054939/995714) – phuclv Sep 22 '21 at 06:45

1 Answer


`char` is by default signed, and its size is one byte (8 bits). A byte can represent 256 different values. When that byte is signed, it can represent values from -128 to 127. If you want it to represent numbers in the range 0-255, you should cast your variable `value` to `unsigned char`: `text += (unsigned char)value;` would work fine.

In any case, there is no character with an ASCII value greater than 127.

Edit: as mentioned in a comment, `char` is not signed by default; its signedness is implementation-defined. It is, however, signed when using g++ and the Visual Studio compiler on Windows.

alonkh2
  • "`char` is by default signed" is *wrong*. It's *implementation defined* (i.e. up to the compiler) if `char` is signed or unsigned. Even the same compiler can have different configurations depending on the target system. – Some programmer dude Sep 22 '21 at 03:39
  • `However, it is signed when using g++` this is **wrong**. `char` is usually unsigned by default on ARM implementations even with gcc: [Why unsigned types are more efficient in arm cpu?](https://stackoverflow.com/q/3093669/995714), [Does anyone know why "char" is unsigned on ARM/gcc?](https://news.ycombinator.com/item?id=18269886), [Any compiler which takes 'char' as 'unsigned' ?](https://stackoverflow.com/q/3728045/995714) – phuclv Sep 22 '21 at 06:49
  • @phuclv please note that I only mentioned g++ on windows. – alonkh2 Sep 22 '21 at 07:04