
Why is a char that is cast and stored in an int not equal to the original value?

I have looked all over for an explanation, but I can't seem to store a char in an int and then successfully compare it to another int.

#include <iostream>
using namespace std;

int main() {

    char c = 255;
    int x = (int)c;
    cout << (x == 255) << endl;

    return 0;
}

This outputs 0. Why?

  • What do you get as the result of `(c < 0)`? Or if you `#include <limits>`, what do you get for `std::numeric_limits<char>::max()`? – aschepler Dec 23 '20 at 04:18
  • Either `x == c` or `x == (char)255` will evaluate to true, due to the default integer promotions. But `x == 255` translates to `-1 == 255` on machines where the `char` is signed. – dxiv Dec 23 '20 at 04:19
  • `cout << (c < 0) << endl;` outputs `1`. This doesn't make a lot of sense to me. – Dylan Meiners Dec 23 '20 at 04:20
  • I understand that `x == c` and `x == (char)255` will equal true, but I am wondering why after casting the `c` to an `int` and comparing it to another `int` like `255` will return false. – Dylan Meiners Dec 23 '20 at 04:24
  • Because `11111111 != 11111111111111111111111111111111`. When `char` is signed, casting it to `int` **sign-extends** the value to 32 bits (see the sketch after these comments). – David C. Rankin Dec 23 '20 at 04:25
  • @DylanMeiners Because `c = 255` is equivalent to `c = -1` when the `char` is signed. – dxiv Dec 23 '20 at 04:25
  • @dxiv Usually. Conversions to a signed integer type from a value outside its range have implementation-defined behavior. Most implementations define it to wrap modulo `(1 << N)` for an `N`-bit type. – aschepler Dec 23 '20 at 04:30
  • @aschepler That's a good point to keep in mind, indeed. – dxiv Dec 23 '20 at 04:36
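To make the comments above concrete, here is a minimal sketch assuming a platform where plain `char` is a signed 8-bit type (both the signedness and the width are implementation-defined):

#include <iostream>
#include <limits>
using namespace std;

int main() {

    char c = 255;           // out of range if plain char is a signed 8-bit type; typically ends up holding -1
    unsigned char u = 255;  // always fits: unsigned char holds 0..255 when CHAR_BIT == 8

    cout << +numeric_limits<char>::max() << endl;  // 127 on a signed-char platform (the + promotes to int for printing)
    cout << (c < 0) << endl;                       // 1 on a signed-char platform
    cout << (int)c << endl;                        // -1: the value is sign-extended, so (int)c == 255 is false
    cout << (int)u << endl;                        // 255: unsigned char is zero-extended

    return 0;
}

On a target where plain `char` is unsigned (e.g. gcc on ARM, mentioned below), the first three lines would instead print 255, 0 and 255.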

1 Answer


The cast isn't the problem here. Trying to assign 255 to a char, which here is an 8-bit signed type, is. That most likely leaves all bits set, which is then interpreted as -1, but you shouldn't be assigning an out-of-range value in the first place. Try 127 or lower and it will print 1.
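A short sketch of the two usual fixes, in the style of the question's program: the first is the one the question author settled on (see the comments below); the second masks the converted value and assumes an 8-bit, two's-complement `char`:

#include <iostream>
using namespace std;

int main() {

    // Fix 1: store the byte in an unsigned char, so 255 is representable
    // and converts to int unchanged.
    unsigned char c = 255;
    int x = (int)c;
    cout << (x == 255) << endl;  // 1

    // Fix 2: keep plain char, but mask off any sign extension after converting.
    char raw = (char)255;             // implementation-defined if char is signed (typically -1)
    int masked = (int)raw & 0xFF;     // back to 255, assuming CHAR_BIT == 8 and two's complement
    cout << (masked == 255) << endl;  // 1

    return 0;
}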

Boris Lipschitz
  • Note that it is implementation-defined whether `char` is signed or unsigned (a portable check is sketched after these comments). – Eugene Dec 23 '20 at 04:24
  • @Eugene, true in theory. Show me a single actual compiler where that isn't the case, though. I am ready to bet any amount on it being an 8-bit signed value here, and also ready to bet a rather high amount on 255 just being loaded into the bits and becoming -1. – Boris Lipschitz Dec 23 '20 at 04:29
  • Yes, that was the case. Switching the type to an `unsigned char` was the fix. – Dylan Meiners Dec 23 '20 at 04:33
  • @BorisLipschitz You can force a compiler to use unsigned char by default with a command line flag: https://stackoverflow.com/a/20518559/459565. – Eugene Dec 23 '20 at 04:38
  • Okay, one actual compiler where `char` is unsigned is gcc. (If the target architecture is ARM, for one case.) – aschepler Dec 23 '20 at 04:39
  • Really? Gcc (arm target) has unsigned char by default?! I mean without forcing it with command line switch. – Boris Lipschitz Dec 23 '20 at 04:44
  • https://wiki.debian.org/ArchitectureSpecificsMemo#Summary – kakkoko Dec 23 '20 at 04:55
  • @BorisLipschitz -- "I've never seen that happen" is not sound engineering. – Pete Becker Dec 23 '20 at 17:50
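Since the signedness of plain `char` really is implementation-defined, and gcc targeting ARM does default to unsigned (per the Debian wiki link above), a quick way to see what a given toolchain does is sketched below; this is not from the thread itself. On gcc and clang the default can also be flipped with `-fsigned-char` / `-funsigned-char`, as the answer linked in the comments describes.

#include <climits>
#include <iostream>
#include <limits>
using namespace std;

int main() {

    // Two equivalent ways to query the implementation-defined signedness of plain char.
    cout << numeric_limits<char>::is_signed << endl;  // 1 if char is signed, 0 if unsigned
    cout << (CHAR_MIN < 0) << endl;                   // same answer via <climits>

    return 0;
}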