I'm developing an application that relies heavily on Joda-Money, and have a number of unit tests that verify my business logic. One (admittedly minor) sticking point for me has been what sort of `Money`/`BigMoney` objects to test with; specifically, what `CurrencyUnit` to use.
As I see it, I have a handful of options:
1. Just use `USD`. This is clearly the easiest way to go, and most of my actual application will be working with US Dollars, so it makes a fair bit of sense. On the other hand, it feels rather US-centric, and I'm concerned it would risk letting currency-specific errors go unchecked.
2. Use another real currency, like `CAD`. This would catch erroneous hard-codings of `USD`, but otherwise isn't much better than just using `USD`.
3. Use a dedicated "fake" currency, i.e. `XTS`. This clearly makes sense; after all, `XTS` is "reserved for use in testing". But Joda denotes pseudo-currencies as currencies with -1 decimal places. In practice, the primary difference between currencies in Joda-Money is the number of decimal places, so this risks masking any errors involving decimal-place precision, such as erroneously rounding to an integer value.
4. Register my own custom currency with `CurrencyUnit.registerCurrency()`. This would obviously work (roughly what I have in mind is sketched after this list), but seems a little odd seeing as there are alternatives.
5. Use a `CurrencyUnit` instance created by a mocking library. Pretty much the same as registering a custom currency.
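To make options 3 and 4 concrete, here is a rough sketch of what I have in mind, assuming the `registerCurrency(code, numericCode, decimalPlaces, countryCodes)` overload; the `XXT` code, the `-1` numeric code, and the two decimal places are arbitrary placeholder choices on my part, not anything Joda-Money prescribes:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.Collections;

import org.joda.money.CurrencyUnit;
import org.joda.money.Money;

public class TestCurrencyExperiment {

    public static void main(String[] args) {
        // Option 3: XTS is in Joda-Money's registry, but it is flagged as a
        // pseudo-currency, so it won't exercise decimal-place handling.
        CurrencyUnit xts = CurrencyUnit.of("XTS");
        System.out.println(xts.isPseudoCurrency()); // true

        // Option 4: register a made-up currency that keeps two decimal places,
        // so rounding behaviour still gets exercised like it would with USD.
        CurrencyUnit testCurrency = CurrencyUnit.registerCurrency(
                "XXT",                            // unassigned three-letter code (placeholder)
                -1,                               // no numeric code
                2,                                // two decimal places, like USD
                Collections.<String>emptyList()); // not tied to any country

        // Behaves like a "real" currency: excess scale must be rounded explicitly.
        Money m = Money.of(testCurrency, new BigDecimal("12.345"), RoundingMode.HALF_UP);
        System.out.println(m); // XXT 12.35
    }
}
```

The appeal of the registered currency is that it isn't `USD` (so erroneous hard-codings would still surface) yet it keeps a real decimal-place count, which `XTS` as a pseudo-currency would not give me.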
Again, this is obviously a minor issue, but I'm curious if there's a standard practice for cases like this, or if there's a clear reason to prefer one of these options in particular.