I know that the consensus is "money = decimal". I understand all the technical fine points of the argument, and I agree with them.
Where I still have an issue is that it seems to assume that all finance applications are simple ledgers...
When you start to play with ratios, interpolation (cubic splines, for example), or move on to stochastic models, differentiation, integration... it seems that decimal becomes meaningless.
So my question is: where is the border (in terms of mathematical complexity?) at which using decimal becomes more of a burden than an advantage?
EDIT: most complex financial products are priced using statistical models. In the end the price is a $$ amount, and as Flydog57 points out, most (if not all) mathematical libraries use double as input/output.
So unless I redevelop all these mathematical functions (which have been developed by people way smarter than me), I need to constantly cast decimal to double and back. That's where the problem is.
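To make the friction concrete, here is a minimal sketch of what I mean. `DiscountFactor` is a hypothetical stand-in for a model function: like the real math libraries, it only speaks double, so the money values have to round-trip through casts.

```csharp
using System;

static class PricingSketch
{
    // Hypothetical model function: all the real math (here just Math.Exp,
    // in practice splines, stochastic models, ...) is written against double.
    static double DiscountFactor(double rate, double timeInYears)
        => Math.Exp(-rate * timeInYears);

    static void Main()
    {
        decimal notional = 1_000_000m;   // money: stored as decimal
        decimal rate = 0.035m;           // quoted rate: also decimal

        // Cast to double to feed the model...
        double df = DiscountFactor((double)rate, 2.0);

        // ...then cast back to decimal because the output is money.
        decimal presentValue = notional * (decimal)df;

        Console.WriteLine(presentValue);
    }
}
```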
Is it useful to use decimal "because it's money" if in the end I constantly need to cast it to double to be able to compute a price?
That's where the argument (money = decimal) may be a simplistic view of a real-world problem. I know the question seems to be a duplicate, but the other answers out there don't deal with this specific case where a price (money) is actually produced by a statistical model.