
Given the following code (this problem is LeetCode 415, Add Strings):

#include <algorithm> // std::reverse
#include <string>

using std::string;

string addStrings(string num1, string num2) {
    string res;
    int sum = 0;                  // current digit sum, including the carry
    int i = num1.size() - 1;      // start from the least significant digits
    int j = num2.size() - 1;

    while (i >= 0 && j >= 0) {
        sum += (num1[i--] - '0') + (num2[j--] - '0'); // the line I don't understand
        res.push_back(char(sum % 10 + '0'));
        sum = sum / 10;           // keep the carry for the next digit
    }

    while (i >= 0) {              // remaining digits of num1
        sum += (num1[i--] - '0');
        res.push_back(char(sum % 10 + '0'));
        sum = sum / 10;
    }

    while (j >= 0) {              // remaining digits of num2
        sum += (num2[j--] - '0');
        res.push_back(char(sum % 10 + '0'));
        sum = sum / 10;
    }

    if (sum > 0)                  // final carry, if any
        res.push_back(char(sum % 10 + '0'));
    std::reverse(res.begin(), res.end()); // digits were appended in reverse order

    return res;
}

I don't understand the process of converting a string to an int. Why is the result an int when I subtract '0' from a string element? If nothing is converted to an int, how is it possible to do arithmetic on strings?

David Kim
  • `num1[i--]` is (in essence) a `char`. So is `'0'`. Both are integer types. – Lala5th Aug 12 '21 at 09:47
  • [This question](https://stackoverflow.com/questions/25406076/subtract-letters-in-python) might help you understand how you can subtract chars. – Tanque Aug 12 '21 at 09:52
  • '0' = [integer value 48](https://www.asciitable.com/). 0 = integer value 0. Subtracting '0' is subtracting 48, giving the integer value that the character actually represents. – GazTheDestroyer Aug 12 '21 at 09:57
  • Duplicate: [C++ - Adding or subtracting '0' from a value](https://stackoverflow.com/q/37683200/995714), [What does '0' mean in a subtraction?](https://stackoverflow.com/q/59911398/995714) – phuclv Aug 12 '21 at 10:06

2 Answers


The way ASCII works is that the digit characters occupy consecutive code points: in ASCII, '0' = 0x30, '1' = 0x31, and so on. So if a string contains only digits, you can recover each digit by subtracting the value '0' from it.

In your code, `num1[i--] - '0'` just measures "how far" the character is from '0' in the ASCII table, giving the correct digit value if the character is indeed a digit.

Also, you don't convert the string to anything (at least not on that specific line). You access a single element, which is a `char`, and `char` is an integer type in C++.
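
For illustration (my sketch, not part of the original answer), here is a minimal program showing that `char` arithmetic is done in `int` and yields the digit value:

    #include <iostream>
    #include <type_traits>

    int main() {
        char c = '7';      // ASCII code 55
        auto d = c - '0';  // both operands are promoted to int before subtracting

        // The result of char-minus-char arithmetic is int, not char:
        static_assert(std::is_same<decltype(d), int>::value, "result is int");

        std::cout << d << '\n'; // prints 7, the digit the character represents
        return 0;
    }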

Lala5th

In C++, characters can be implicitly cast to integers using their ASCII codes. I don't really want to spoil the fun of solving the given problem, so I'll just provide a hint here:
Given the single-digit characters '2' and '4', with ASCII codes of 50 and 52 (decimal) respectively, subtracting '0' (ASCII code 48) from each gives the actual numerical value of the character (50 - 48 = 2, and so on).
Have fun coding!
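
To make the hint concrete, a quick sketch (my illustration, not the answerer's code) printing the ASCII codes and the subtraction results:

    #include <iostream>

    int main() {
        // ASCII codes of the digit characters:
        std::cout << int('2') << ' ' << int('4') << ' ' << int('0') << '\n'; // 50 52 48
        // Subtracting '0' recovers the numerical values:
        std::cout << ('2' - '0') << ' ' << ('4' - '0') << '\n';              // 2 4
        return 0;
    }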

mrxaxen
  • They aren't implicitly cast, but rather are integer types. See: https://eel.is/c++draft/basic.fundamental#1 – Lala5th Aug 12 '21 at 10:02
  • @Lala5th `char` is not an integer type. It *is* an integral type. And `char` used as an arithmetic operand is indeed implicitly converted to `int` before the operation. "Implicit cast" is self-contradictory though since cast is an *explicit* conversion. – eerorika Aug 12 '21 at 10:06
  • @eerorika I literally linked the standard: "There are five standard signed integer types: “signed char”, “short int”, “int”, “long int”, and “long long int [...] For each of the standard signed integer types, there exists a corresponding (but different) standard unsigned integer type: “unsigned char”, “unsigned short int”, “unsigned int”, “unsigned long int”, and “unsigned long long int" [...] Type char is a distinct type that has an implementation-defined choice of “signed char” or “unsigned char”" – Lala5th Aug 12 '21 at 10:07
  • @Lala5th And that list literally doesn't contain `char`. That said, I'll have to retract my claim though, since integral type is apparently synonym for integer type. – eerorika Aug 12 '21 at 10:09
  • @eerorika I included the part where it does say that `char` is implementation defined for `unsigned char` or `signed char`, both being on the list – Lala5th Aug 12 '21 at 10:10
  • @Lala5th You cut off the rest of the sentence: "... as its underlying type." Just because `char` is implemented with its underlying type being one of those, it isn't either of those types, and those types being signed and unsigned integer types has no effect on `char` being in those categories, and it isn't. But like I said, I've retracted my claim, since I found out that `char` is indeed an integer type because that's a synonym for integral type. – eerorika Aug 12 '21 at 10:12
  • @Lala5th Pointing back to the same link that's provided about the standard, both of you are correct it seems. https://eel.is/c++draft/basic.fundamental#11 Thanks for the information! – mrxaxen Aug 12 '21 at 10:13
  • `In c++ characters can be implicitly cast to integers using their ASCII codes`: C++ doesn't say anything about encoding. EBCDIC or any other encodings are also allowed. – phuclv Aug 13 '21 at 01:41
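
Even with a non-ASCII encoding, `c - '0'` stays portable: the C++ standard guarantees that the decimal digits '0' through '9' are contiguous and increasing in the execution character set. A minimal sketch of portable digit extraction (the helper name `digitValue` is my own, hypothetical choice):

    #include <cctype>
    #include <iostream>

    // Portable digit extraction: valid under ASCII, EBCDIC, or any conforming
    // encoding, because '0'..'9' must be contiguous and increasing.
    int digitValue(char c) {
        if (std::isdigit(static_cast<unsigned char>(c)))
            return c - '0';
        return -1; // not a digit
    }

    int main() {
        std::cout << digitValue('7') << ' ' << digitValue('x') << '\n'; // prints: 7 -1
        return 0;
    }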