
This is what I do now:

extension Decimal {
    var formattedAmount: String {
        let formatter = NumberFormatter()
        formatter.generatesDecimalNumbers = true
        formatter.minimumFractionDigits = 2
        formatter.maximumFractionDigits = 2
        return formatter.string(from: self) // error: cannot convert value of type 'Decimal' to expected argument type 'NSNumber'
    }
}

but `string(from:)` expects an `NSNumber`, and I cannot create an `NSNumber` from a `Decimal` directly.

Rashwan L
Bartłomiej Semańczyk
  • 2
    `string(for: self)` – compare https://stackoverflow.com/a/29999137/1187415 – Martin R Oct 25 '17 at 13:09
  • 3
    Or `string(from: self as NSDecimalNumber)` – compare https://stackoverflow.com/q/41148782/1187415 – Martin R Oct 25 '17 at 13:12
  • 1
    @MartinR I didn't noticed the existence of `string(for: self)` is it documented?, I was aiming to add an answer as mentioned in your second comment, it's late for now :) – Ahmad F Oct 25 '17 at 13:19
  • 1
    It is documented, as a method of the "abstract" superclass [`Formatter`](https://developer.apple.com/documentation/foundation/formatter) of `NumberFormatter`. – Martin R Oct 25 '17 at 13:21

2 Answers


This should work:

extension Decimal {
    var formattedAmount: String? {
        let formatter = NumberFormatter()
        formatter.generatesDecimalNumbers = true
        formatter.minimumFractionDigits = 2
        formatter.maximumFractionDigits = 2
        // Decimal bridges to NSDecimalNumber, which is an NSNumber subclass
        return formatter.string(from: self as NSDecimalNumber)
    }
}
Ladislav
  • No need to use generatesDecimalNumbers property when converting from Decimal to String. Use it in case of String to Decimal – Roman Apr 24 '21 at 15:49
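For reference, a quick usage sketch of the extension above. The explicit `en_US_POSIX` locale is my addition (not part of the answer), to make the output independent of the device's region settings; `generatesDecimalNumbers` is dropped per the comment above, since it only matters when parsing strings:

```swift
import Foundation

extension Decimal {
    var formattedAmount: String? {
        let formatter = NumberFormatter()
        formatter.locale = Locale(identifier: "en_US_POSIX") // deterministic "." separator
        formatter.minimumFractionDigits = 2
        formatter.maximumFractionDigits = 2
        // Decimal bridges to NSDecimalNumber, which is an NSNumber subclass
        return formatter.string(from: self as NSDecimalNumber)
    }
}

let price = Decimal(string: "1234.5")!
print(price.formattedAmount ?? "nil") // "1234.50"
```

Initializing the `Decimal` from a string (rather than a `Double` literal) avoids binary floating-point rounding artifacts in the stored value.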

This:

formatter.string(from: ...) // requires an NSNumber

You can either do this:

formatter.string(for: self) // inherited from Formatter, takes Any? as its argument

Or:

formatter.string(from: self as NSDecimalNumber) // bridge self to NSDecimalNumber
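A minimal sketch contrasting the two calls (the pinned `en_US_POSIX` locale is my assumption, added so the output is deterministic). Both produce the same text; `string(for:)` simply returns `nil` for arguments the formatter cannot handle instead of requiring a cast:

```swift
import Foundation

let formatter = NumberFormatter()
formatter.locale = Locale(identifier: "en_US_POSIX")
formatter.minimumFractionDigits = 2
formatter.maximumFractionDigits = 2

let amount = Decimal(string: "9.5")!

// Formatter.string(for:) accepts Any? -- no explicit cast needed
let a = formatter.string(for: amount)
// NumberFormatter.string(from:) needs an NSNumber -- bridge explicitly
let b = formatter.string(from: amount as NSDecimalNumber)

print(a ?? "nil") // "9.50"
print(a == b)     // true
```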
Rashwan L