
I was working on a project and needed to do some math:

decimal X = (Value / 881) * (item.Type ? 130 : 130 * 2);

The parameter `Value` equals 3000, for example.

If `Value` is of type `int`, the result is 390; if `Value` is of type `decimal`, the result is 442.67.

How is this possible?

.NET Fiddle
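
For reference, here is a minimal, self-contained sketch of the comparison (assuming `Value == 3000` and `item.Type == true`, which reproduces the 390 vs. 442.67 results):

using System;

class Program
{
    static void Main()
    {
        bool type = true; // stands in for item.Type; assumed true here

        // int: 3000 / 881 is integer division and truncates to 3, so 3 * 130 == 390
        int intValue = 3000;
        decimal fromInt = (intValue / 881) * (type ? 130 : 130 * 2);
        Console.WriteLine(fromInt);   // 390

        // decimal: 3000m / 881 keeps the fraction (3.4052...), so the product is 442.678...
        decimal decValue = 3000;
        decimal fromDec = (decValue / 881) * (type ? 130 : 130 * 2);
        Console.WriteLine(fromDec);   // 442.6787741..., i.e. 442.67 when shown with two decimals
    }
}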

Dr.Vision
    when `Value` is of type `int`, you have *integer division*: `Value / 881` which is `3000 / 881 == 3`; when `Value` is `decimal` then `3000m / 881 == 3.405...m` – Dmitry Bychenko Apr 01 '21 at 07:22
  • The more *imprecise* operations you do with `int` (e.g. a division that produces a remainder), the bigger the accumulated error in the result. – Sinatr Apr 01 '21 at 07:22
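
To put a number on that comment: the fraction that integer division throws away (about 0.405) is exactly what gets scaled up by the factor of 130. A small sketch using the values from the question:

decimal exact     = 3000m / 881;   // 3.4052213393870601589103291714
decimal truncated = 3000 / 881;    // integer division happens first, so this is 3
decimal gap = (exact - truncated) * 130;
// gap == 52.678..., which is exactly the difference between 442.67... and 390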

1 Answer


This is because of the `decimal` type. If you work through the formula step by step, you will see that the difference comes from the digits after the decimal point, which are only kept when `Value` is of type `decimal`.

When you divide 3000 (as an `int`) by 881:

int Value = 3000;
// Output is 3, because int divided by int is integer division: the fractional part is discarded.
decimal X = (Value / 881);  // The int result 3 is then implicitly converted to decimal.

When you divide 3000 (as a `decimal`) by 881:

decimal Value = 3000;
// Output is 3.4052213393870601589103291714, because decimal divided by int is decimal division.
decimal X = (Value / 881);  // 881 is implicitly converted to decimal, so the fractional part is kept.
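
If `Value` has to stay an `int`, one way to get the decimal result is to make one operand `decimal` before dividing (a sketch, not taken from the question; the 130 branch of the ternary is used for brevity):

int Value = 3000;
// Casting one operand to decimal turns the division into decimal division.
decimal X = ((decimal)Value / 881) * 130;   // 442.678...
// A decimal literal for the divisor has the same effect.
decimal Y = (Value / 881m) * 130;           // 442.678...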

.NET Fiddle

I hope the .NET Fiddle gives you a better idea of my answer.

Prasad Telkikar