first = float(input("Enter first number: "))
second = float(input("Enter second number: "))
avg = (first + second) / 2
print(avg)

Using the numbers 1.1 and 1.3 as inputs, the expected output is 1.2, but the result I'm receiving is 1.2000000000000002. I understand this is related to how Python represents floating-point numbers.

However, I'm unsure how to evaluate this correctly, or why this specific result is produced.
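To make the behaviour concrete, here is a minimal check (an illustration, assuming CPython's usual IEEE-754 doubles): converting a float to Decimal reveals the exact binary value the interpreter stores, which shows that neither 1.1 nor 1.3 is represented exactly.

from decimal import Decimal

# Neither 1.1 nor 1.3 has an exact binary representation; the stored
# doubles are both slightly larger than the decimal values typed in,
# so the sum (and therefore the average) carries a tiny excess.
print(Decimal(1.1))      # 1.100000000000000088817841970012523233890533447265625
print(Decimal(1.3))      # 1.3000000000000000444089209850062616169452667236328125
print((1.1 + 1.3) / 2)   # 1.2000000000000002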

EDIT: Python 3.2

Thirk

1 Answer


Use decimals:

import decimal

first = decimal.Decimal('1.1')
second = decimal.Decimal('1.3')
avg = (first + second) / 2
print(avg)
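
If the values should still come from input() as in the question, one option (a sketch along the same lines, not the answer's exact code) is to pass the raw strings straight to Decimal, so no binary float is ever created:

import decimal

# The strings typed by the user are converted to exact decimal values,
# so the average of 1.1 and 1.3 comes out as exactly 1.2.
first = decimal.Decimal(input("Enter first number: "))
second = decimal.Decimal(input("Enter second number: "))
avg = (first + second) / 2
print(avg)   # prints 1.2 for inputs 1.1 and 1.3

Alternatively, if binary floats are acceptable and only the display matters, round(avg, 1) or string formatting will also show 1.2.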
Noel Evans