Would it be char, byte, Int16, Int32, or Int64 (maybe the last three unsigned, since I won't have negative numbers)?
I need it for multiplication and addition. The smaller the numbers a type can hold, the more parts a big number has to be divided into.
An example: 12345678987654321
In char: {1, 2, 3, 4, 5, 6, 7, 8, 9, 8, 7, 6, 5, 4, 3, 2, 1}
In Int32: {123456789, 87654321}
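For illustration, here is a minimal sketch of what addition could look like with one possible layout: the number split into base-10^9 limbs (9 decimal digits per element, the most that fit in a uint), stored least significant part first. The class and method names are made up for the example; this is just one way to do it, not a recommendation of uint specifically:

```csharp
using System;
using System.Collections.Generic;

class BigNumDemo
{
    // Base 10^9: the largest power of ten that fits in a uint,
    // so each array element ("limb") holds 9 decimal digits.
    const uint Base = 1_000_000_000;

    // Adds two numbers stored as limb arrays, least significant limb first.
    static uint[] Add(uint[] a, uint[] b)
    {
        var result = new List<uint>();
        ulong carry = 0; // wider accumulator so limb sums never overflow
        for (int i = 0; i < Math.Max(a.Length, b.Length); i++)
        {
            ulong sum = carry;
            if (i < a.Length) sum += a[i];
            if (i < b.Length) sum += b[i];
            result.Add((uint)(sum % Base)); // keep 9 digits in this limb
            carry = sum / Base;             // push the overflow up
        }
        if (carry != 0) result.Add((uint)carry);
        return result.ToArray();
    }

    static void Main()
    {
        // 12345678987654321 split into base-10^9 limbs, least significant first:
        uint[] x = { 987654321, 12345678 };
        uint[] y = { 987654321, 12345678 };
        uint[] sum = Add(x, y); // { 975308642, 24691357 } == 24691357975308642
        Console.WriteLine(string.Join(", ", sum));
    }
}
```

The point of the sketch: the wider the limb type, the fewer loop iterations per operation, but every limb operation needs an accumulator twice as wide (here ulong for uint limbs) to catch the carry.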
So, which type is fastest to use for billions of calculations?