
I have a similar problem to the one described in this question:

C++: Getting random negative values when converting char to int

but I just want to know how to solve the problem without making the code slower. I welcome all suggestions.

I tried changing the char to unsigned char, but that didn't work. Then I tried this code:

const char char_max = (char)(((unsigned char) char(-1)) / 2);

c = (num & char_max);

but the output was different, and I don't know exactly what that code does.

I am still a student.

cout << "\nEnter any string : ";
cin >> s1;

for (char& c : s1)
{
    num = c;
    num = (num + rand()) % 256;
    // const char char_max = (char)(((unsigned char) char(-1)) / 2);
    // c = (num & char_max);
    c = num;
}
cout << "\n" << s1;

I expect c to hold a value in the normal ASCII range so that I can use it later to retrieve the original int value.

Thanks!!

JBK2
  • The ASCII table defines characters in the range 0-127, and those in the range 0-31 are unprintable control characters. You produce characters outside those ranges - what output exactly do you expect to see? How do you define "normal ASCII values"? – Igor Tandetnik Jan 21 '19 at 17:59
  • Without any sign, in the range of 0-255 – JBK2 Jan 22 '19 at 01:53
  • `num` gives you that. `c` cannot be that, as `char` is a signed integer between -128 and 127. However, you expect to take that `[0, 255]` value, stuff it back into a `char` and have something printable - that may or may not work, randomly (literally, in this case). – Igor Tandetnik Jan 22 '19 at 02:18
  • So char has a range of -128 to 127, not 0 to 255? – JBK2 Jan 22 '19 at 11:33
  • Can you show some example input? How do you want to change each character? What is the expected output? – VLL Jan 22 '19 at 11:40
  • To quickly solve your issue, replace `char` with `unsigned char`. But make sure that you understand what you are doing, what a char is, and how ASCII is defined. Furthermore, as Igor Tandetnik pointed out, you should not allow control characters. Also, you use a 0-255 range, but actual ASCII only covers 0-127. The other possible byte values are often used for extended symbols, but make no mistake, those are not standard. Symbol #203 could be Ë, but it could also be ╦, or you could define anything yourself. If you want such symbols, use UTF-8 instead. – Aziuth Jan 22 '19 at 12:00
  • (There are further standards defined. In the ISO 8859-1 standard, #203 would be Ë, ╦ is from the CP437 standard.) – Aziuth Jan 22 '19 at 12:07
  • Thanks guys, it works now. I used unsigned char in a slightly different way than before and it worked. Thank you @IgorTandetnik – JBK2 Jan 22 '19 at 16:35
  • And thank you @Aziuth – JBK2 Jan 22 '19 at 16:36

1 Answer


The language doesn't tell us whether char is signed or unsigned; this depends on your platform. Apparently, on yours (like many others), it is signed.

That means, assuming 8-bit bytes, its range is [-128, 127], not [0, 255].

Use an unsigned char throughout to deal with numbers in the range [0, 255].

(I can't suggest a specific change to your program, because you didn't let us see it.)

Lightness Races in Orbit