I just wrote a program (using ushort for all variables) to manipulate numbers, and this is part of the code:
ushort digitSum = firstDigit + secondDigit + thirdDigit + fourthDigit;
ushort reverseNumber = fourthDigit + thirdDigit + secondDigit + firstDigit;
ushort lastDigitFirst = fourthDigit + firstDigit + secondDigit + thirdDigit;
ushort middleDigitsSwitched = firstDigit + thirdDigit + secondDigit + fourthDigit;
But the compiler reports errors about converting int to ushort. I searched and I think this is the reason:
Why is ushort + ushort equal to int?
Why does C# throw casting errors when attempting math operations on integer types other than int?
It looks like performing operations with types other than int requires casts, because ushort + ushort = int.
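To illustrate what I found: C# promotes both ushort operands to int before adding, so the result is an int and assigning it back to a ushort needs an explicit cast (a minimal sketch with made-up values):

```csharp
using System;

class Promotion
{
    static void Main()
    {
        ushort a = 30000, b = 40000;

        // ushort + ushort is performed as int + int, so the result is int.
        // That also avoids silent overflow: 30000 + 40000 doesn't fit in a ushort.
        int sum = a + b;             // compiles fine
        // ushort bad = a + b;       // compile error: cannot convert int to ushort

        // An explicit cast truncates the int result back to 16 bits.
        ushort wrapped = (ushort)(a + b);

        Console.WriteLine(sum);      // 70000
        Console.WriteLine(wrapped);  // 4464 (70000 mod 65536)
    }
}
```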
My questions: Why do we have shorts and bytes at all? When memory was very limited, we had to worry about which type to use, but do we still have to worry about it now? It's annoying to use types smaller than an int; can I just use int and long all the time instead? Is there still a use case for the smaller types today?
Edit: How do I cast these expressions back to ushort?
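For context, here is what I tried based on my reading: casting the whole right-hand expression (not the individual operands) back to ushort, which does seem to compile (a sketch, assuming the four digit variables are ushorts with placeholder values):

```csharp
using System;

class Digits
{
    static void Main()
    {
        // Placeholder values; in my program these come from splitting a number.
        ushort firstDigit = 1, secondDigit = 2, thirdDigit = 3, fourthDigit = 4;

        // The additions still happen in int; the cast converts the final result.
        ushort digitSum = (ushort)(firstDigit + secondDigit + thirdDigit + fourthDigit);

        Console.WriteLine(digitSum); // 10
    }
}
```

Is wrapping every expression like this really the idiomatic way, or is that another argument for just using int?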