Right.
You shouldn't be asking these questions here, because the whole point of going to school is to actually learn it instead of asking for the solution, BUT... been there, done that... so here we go :D
There are a couple of things you're doing wrong.

Integers "don't divide" (they do, but integer division truncates), so you won't get decimal values from them: 1 divided by 2 is 0, even though mathematically it's 0.5.
- If you're gonna use int you need to cast one of the operands to double, not the final result. You're using

  int total = programmingMajors / studentsTotal;

  so this is an integer division. If you have a total of 10 students, with 5 majors, 5/10 = 0.5 with decimals, but 0 as an integer. Change it to (double)programmingMajors / studentsTotal (and make total a double) and you'll have the 0.5 you need.
- You're not really getting the percentage of the students by just dividing the values; that only gives you a fraction. To get the percentage you need the rule of three: if 1000 students are 100% of the students, X majors are Y% of them.

  e.g. 1000 students, 50 majors:

  1000 students = 100 percent
  50 students = x percent

  1000x = 100 * 50
  1000x = 5000
  x = 5000 / 1000
  x = 5

  In one single line that would be var result = (100.0 * programmingMajors) / studentsTotal; (the 100.0 forces the division to happen in double; see the sketch after this list).
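Here's a minimal sketch putting both points together (assuming your variables are named programmingMajors and studentsTotal like in your snippet, with made-up values):

    using System;

    int programmingMajors = 5;
    int studentsTotal = 10;

    // integer division truncates: 5 / 10 == 0
    int truncated = programmingMajors / studentsTotal;

    // casting one operand to double keeps the decimals: 0.5
    double fraction = (double)programmingMajors / studentsTotal;

    // rule of three in one line; 100.0 forces double arithmetic: 50
    double percent = (100.0 * programmingMajors) / studentsTotal;

    Console.WriteLine(truncated); // prints 0
    Console.WriteLine(fraction);  // prints 0.5
    Console.WriteLine(percent);   // prints 50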
When using types such as int, double and decimal, always check their ranges in Microsoft's reference docs. After changing to double and applying the rule of three, here's how it looks in your code:
// just change the last line to this:
Console.WriteLine("The percent is: " + (100.0 * programmingMajors) / studentsTotal);