I have just written some code (C#) for a sample exam in my C# basics course. Even though I wrote it correctly and received full points, I am not satisfied with the way I cast the char ASCII value to the desired int value. I am asking for a better way to express the following code:
using System;

namespace MultiplyTable
{
    class Program
    {
        static void Main(string[] args)
        {
            // Input: a three-digit number, read as a string.
            string inputNumber = Console.ReadLine();

            // Logic: convert each digit char to its int value.
            // Subtracting 48 works because '0' has the ASCII/Unicode code 48.
            int firstNumber = inputNumber[0] - 48;
            int secondNumber = inputNumber[1] - 48;
            int thirdNumber = inputNumber[2] - 48;

            // Output: the full multiplication table for the three digits.
            for (int p = 1; p <= thirdNumber; p++)
            {
                for (int j = 1; j <= secondNumber; j++)
                {
                    for (int k = 1; k <= firstNumber; k++)
                    {
                        Console.WriteLine($"{p} * {j} * {k} = {p * j * k};");
                    }
                }
            }
        }
    }
}
The input is a three-digit integer in the range [111…999]. I used a string instead of an int so that I could read and store all of the char values more quickly. The issue is that when I have, say, the char '3', I need the int value 3 and not its ASCII decimal value, 51. As I had limited time to write this code, I resolved it by subtracting 48, as you can see above. What is the correct/more advanced way to do this exercise?
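For context, here is a minimal sketch of the alternatives I have been considering (the class name and the sample input "345" are made up for illustration, not part of the exam):

using System;

class DigitConversionDemo
{
    static void Main()
    {
        string inputNumber = "345"; // hypothetical sample input

        // Option 1: subtract the char literal '0' instead of the magic number 48.
        // '0' is code point 48, so this is the same arithmetic, only self-documenting.
        int viaCharLiteral = inputNumber[0] - '0'; // 3

        // Option 2: char.GetNumericValue returns a digit char's numeric value as a double.
        int viaGetNumericValue = (int)char.GetNumericValue(inputNumber[1]); // 4

        // Option 3: parse the whole string once, then peel the digits off with / and %.
        int number = int.Parse(inputNumber); // 345
        int hundreds = number / 100;         // 3
        int tens = number / 10 % 10;         // 4
        int ones = number % 10;              // 5

        Console.WriteLine($"{viaCharLiteral} {viaGetNumericValue} {hundreds}{tens}{ones}");
    }
}

Subtracting '0' keeps the character arithmetic I already used but removes the magic number, while the int.Parse approach avoids character arithmetic entirely. Is one of these preferred in practice? Thank you in advance!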