I want to use the NumberFormatter class from PHP's Intl extension to display prices in a human-readable format. What our project needs:
- The CLDR number pattern and the currency and separator symbols need to be configured in our code rather than defaulting to what Intl/ICU knows for the locale.
- Our application takes care of the decimals; NumberFormatter should display whatever decimals we pass to it (roughly the kind of setup shown in the sketch after this list).
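For context, this is roughly what we are aiming for; the pattern, symbols, and digit counts below are placeholders I made up for illustration, not our real configuration:
<?php
// Sketch of the intended setup (placeholder values, not our real configuration).
$fmt = new NumberFormatter('de_DE', NumberFormatter::CURRENCY);
// Use our own CLDR pattern instead of the locale default.
$fmt->setPattern('#,##0.00 ¤');
// Use our own symbols instead of what Intl/ICU knows for the locale.
$fmt->setSymbol(NumberFormatter::CURRENCY_SYMBOL, '€');
$fmt->setSymbol(NumberFormatter::MONETARY_SEPARATOR_SYMBOL, ',');
$fmt->setSymbol(NumberFormatter::MONETARY_GROUPING_SEPARATOR_SYMBOL, '.');
// Our code decides how many decimals are shown.
$fmt->setAttribute(NumberFormatter::MIN_FRACTION_DIGITS, 4);
$fmt->setAttribute(NumberFormatter::MAX_FRACTION_DIGITS, 4);
echo $fmt->formatCurrency(1234.5678, 'EUR')."\n";
?>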
However, while experimenting with different configurations to find the exact combination that works for our project, I noticed some effects that I can't explain. The three formatters in the following code snippet are almost identical: unlike the first one, the second one uses the U.S. dollar instead of the euro, and the third one sets a custom currency symbol. The output of the first formatter is what I expected, but as soon as I change the currency or set a currency symbol, the MIN_FRACTION_DIGITS attribute is ignored and the custom symbol is never applied.
<?php
$fmt = new NumberFormatter('de_DE', NumberFormatter::CURRENCY);
$fmt->setAttribute(NumberFormatter::MIN_FRACTION_DIGITS, 4);
echo $fmt->formatCurrency(1234567890.891234567890000, "EUR")."\n";
// Outputs 1.234.567.890,8912 €
$fmt = new NumberFormatter('de_DE', NumberFormatter::CURRENCY);
$fmt->setAttribute(NumberFormatter::MIN_FRACTION_DIGITS, 4);
echo $fmt->formatCurrency(1234567890.891234567890000, "USD")."\n";
// Outputs 1.234.567.890,89 $
$fmt = new NumberFormatter('de_DE', NumberFormatter::CURRENCY);
$fmt->setAttribute(NumberFormatter::MIN_FRACTION_DIGITS, 4);
$fmt->setSymbol(\NumberFormatter::CURRENCY_SYMBOL, '%');
echo $fmt->formatCurrency(1234567890.891234567890000, "EUR")."\n";
// Outputs 1.234.567.890,89 €
?>
The first table row under General Purpose Numbers in the Unicode CLDR number pattern documentation states that when a currency pattern is parsed, the two zeroes in the decimal part of the pattern are to be replaced by however many digits the application deems appropriate. The application here is ICU (the C library that PHP's Intl extension is built on), and the MIN_FRACTION_DIGITS attribute does its job of letting me override that default in the first example, but not in the second or the third.
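For reference, here is the kind of diagnostic output I can add if it helps; this is only a sketch, and I have not verified the exact getPattern() value across ICU versions:
<?php
// Inspect the pattern and the effective fraction-digit attribute
// before and after formatting a non-default currency.
$fmt = new NumberFormatter('de_DE', NumberFormatter::CURRENCY);
$fmt->setAttribute(NumberFormatter::MIN_FRACTION_DIGITS, 4);
echo $fmt->getPattern()."\n";                                        // presumably something like "#,##0.00 ¤" for de_DE
echo $fmt->getAttribute(NumberFormatter::MIN_FRACTION_DIGITS)."\n";  // expect 4
echo $fmt->formatCurrency(1234567890.891234567890000, "USD")."\n";
echo $fmt->getAttribute(NumberFormatter::MIN_FRACTION_DIGITS)."\n";  // is this still 4 after formatting USD?
?>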
Can someone please explain this seemingly random change in behavior? Let me know if there is any additional information that you need.