
I have been programming in JavaScript for nearly a year and a half now, and recently began learning Java syntax. When trying to recreate a simple program that I wrote in JavaScript, I ran into trouble with the double data type. Here is my code:


Scanner input = new Scanner (System.in);
int questionsTotal;
int questionsCorrect;
double grade;
System.out.println("How many questions were there in total?");
questionsTotal = input.nextInt();
System.out.println("How many questions did you get correct?");
questionsCorrect = input.nextInt();
grade = questionsCorrect / questionsTotal * 100;
System.out.println("Your grade is: " + grade);

It isn't in this snippet, but java.util.Scanner is imported. When the user enters values for questionsTotal and questionsCorrect, they are read just fine. However, the grade variable does not behave as planned. When printed, it displays a value of either 0.0 or 100.0, even when the result should be a decimal value. I have tried using printf rather than println, but to no avail. Am I using the wrong data type? Can doubles not hold a decimal value?

I would appreciate feedback on the possible semantic error, and apologize for my lack of experience. Please let me know if there is any more information I can provide. All answers are appreciated.

Sollybird

1 Answer


Replace this line in your code:

grade = ((double) questionsCorrect) / questionsTotal * 100;

In your version, questionsCorrect / questionsTotal divides an int by an int, so Java performs integer division and discards the fractional part before the multiplication by 100 ever happens. That is why you only ever see 0.0 (when correct < total) or 100.0 (when correct == total). Casting one operand to double promotes the division to floating point, so the fraction is kept.
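A minimal, self-contained sketch contrasting the two forms (the sample values 7 correct out of 10 are hypothetical, chosen just to make the truncation visible):

```java
// Demonstrates integer division vs. floating-point division in Java.
public class GradeDemo {
    public static void main(String[] args) {
        int questionsCorrect = 7;
        int questionsTotal = 10;

        // int / int truncates toward zero before the multiplication runs:
        // 7 / 10 evaluates to 0, so the whole expression is 0,
        // which is then widened to 0.0 on assignment.
        double truncated = questionsCorrect / questionsTotal * 100;

        // Casting one operand to double first makes the division
        // floating-point: 7.0 / 10 == 0.7, then 0.7 * 100 == 70.0.
        double kept = (double) questionsCorrect / questionsTotal * 100;

        System.out.println(truncated); // prints 0.0
        System.out.println(kept);      // prints 70.0
    }
}
```

Dividing by questionsTotal * 100.0 or multiplying by 100.0 first would work just as well; the only requirement is that at least one operand of the division be a floating-point value.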
Hicham