
For representing money I know it's best to use C#'s decimal data type over double or float, but if you are working with amounts of money less than millions, wouldn't int be a better option?

Does decimal have perfect calculation precision, i.e. does it not suffer from "Why computers are bad at numbers"? If there is still some possibility of a calculation mistake, wouldn't it be better to use int and just display the value with a decimal separator?
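(For reference, the binary floating-point problem the linked question describes looks like this in practice; Java shown for illustration, since the issue is the same across languages:)

```java
// The "computers are bad at numbers" problem in one line:
// 0.1 and 0.2 have no exact binary representation, so their
// double sum is not exactly 0.3. Integer cents avoid this.
public class FloatDemo {
    public static void main(String[] args) {
        System.out.println(0.1 + 0.2 == 0.3); // false
        System.out.println(10 + 20 == 30);    // true (cents as int)
    }
}
```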

ProgrammingLlama

user13288253
  • I'm sure that would work here in Japan where we don't use decimals, but using `int` might cause problems with calculations in other countries. Unless you intend to have `int value = 100;` mean $1 or something. – ProgrammingLlama Apr 26 '20 at 14:09
  • @John, yes 100 = $1 is exactly what I meant by "display the value with a decimal separator". – user13288253 Apr 26 '20 at 14:23
  • It ignores the fundamental issue: calculation results are always an approximation for practical currency units. int doesn't help much: pay back a $1 loan in 7 installments, 100 / 7 = 14, 7 * 14 = 98, 2 pennies short. – Hans Passant Apr 26 '20 at 14:37
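One common fix for the 2-pennies-short problem in Hans Passant's comment is to distribute the remainder explicitly rather than rely on plain integer division. A minimal sketch in Java (the class and method names are mine, for illustration only):

```java
// Splitting 100 cents into 7 installments with 100 / 7 = 14 loses
// 2 cents (7 * 14 = 98). Handing the remainder out one cent at a
// time makes the installments sum back to the original total.
public class Installments {
    static int[] split(int totalCents, int parts) {
        int base = totalCents / parts;      // 14 cents each
        int remainder = totalCents % parts; // 2 cents left over
        int[] result = new int[parts];
        for (int i = 0; i < parts; i++) {
            // the first `remainder` installments get one extra cent
            result[i] = base + (i < remainder ? 1 : 0);
        }
        return result;
    }

    public static void main(String[] args) {
        int sum = 0;
        for (int cents : split(100, 7)) sum += cents;
        System.out.println(sum); // 15+15+14+14+14+14+14 = 100
    }
}
```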

3 Answers


The amounts you are working with ("less than millions" in your example) aren't the issue. It's what you want to do with the values and how much precision you need for that. Treating those numbers as integers doesn't really help - it just puts the onus on you to keep track of the precision. If precision is critical, there are libraries to help: BigInteger and BigDecimal packages are available in a variety of languages, and there are libraries that place no limit on the precision (Wikipedia has a list). The important takeaway is to know your problem space and how much precision you need. For many things the built-in precision is fine; when it's not, you have other options.
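The "choose your precision" point can be illustrated with Java's BigDecimal, one of the arbitrary-precision classes mentioned above (Java used here for illustration; C#'s decimal behaves similarly for the division case):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class PrecisionDemo {
    public static void main(String[] args) {
        // Decimal fractions that double gets wrong are exact here:
        BigDecimal sum = new BigDecimal("0.1").add(new BigDecimal("0.2"));
        System.out.println(sum); // 0.3

        // But precision is still a decision you must make:
        // 1/3 has no finite decimal expansion, so division needs
        // an explicit scale and rounding mode.
        BigDecimal third = BigDecimal.ONE
                .divide(new BigDecimal("3"), 8, RoundingMode.HALF_UP);
        System.out.println(third); // 0.33333333
    }
}
```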

MikeJ

Like li223 said, an integer won't let you store values with decimal places - and the majority of currencies allow decimal values.

I would advise picking a fixed number of decimal places to work with, to avoid the problem you referred to ("Why computers are bad at numbers"). I work with invoicing and we use 8 decimal places, which has worked fine with all currencies so far.
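Working at a fixed scale like this might look as follows (a sketch in Java's BigDecimal; the helper name and the example exchange rate are mine):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class InvoiceRounding {
    // Normalize every intermediate result to 8 decimal places,
    // as the answer suggests (the helper name is hypothetical).
    static BigDecimal at8(BigDecimal value) {
        return value.setScale(8, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        // e.g. converting 19.99 at a made-up exchange rate:
        BigDecimal converted = at8(new BigDecimal("19.99")
                .multiply(new BigDecimal("0.00731528")));
        System.out.println(converted); // 0.14623245
    }
}
```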

Afonso

The main reason to use decimal over integer in this case is that decimal, well, allows decimal places: £2.50 can be properly represented as 2.5 with a decimal, whereas with an integer you can't represent decimal points at all. This is fine if, as John mentions in his comment, you're representing a currency like the Japanese yen that doesn't have decimals.

As for decimal's accuracy, it still suffers from "Why are computers bad at numbers" - see the answer to this question for more info.

li223
  • What do you mean `int` can't represent decimal points? If I were to, basically, use ints to calculate money in, say, cents for US what would stop me to display the int 125 as 1.25 on the frontend? – user13288253 Apr 26 '20 at 14:25
  • Nothing, but you would have to divide that by 100 and then output it, instead of just outputting the value as is. You can't have an int with the value 2.5, for instance. – li223 Apr 26 '20 at 14:34
  • I was thinking something more along the lines of a string format, but OK. – user13288253 Apr 26 '20 at 14:40
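The divide-by-100 display approach discussed in these comments can be sketched like this (Java for illustration; the helper name is mine):

```java
// Store US currency as integer cents, format with a decimal
// separator only for display. Negative amounts would need extra
// sign handling and are not covered by this sketch.
public class CentsFormat {
    static String format(int cents) {
        return String.format("%d.%02d", cents / 100, cents % 100);
    }

    public static void main(String[] args) {
        System.out.println(format(125)); // 1.25
        System.out.println(format(5));   // 0.05
    }
}
```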