So, you want to parse a natural-language string and generate a floating-point number from it?
Well, the extension is the easy part. Just create a failable initializer for it:
let digits = [
    "zero", "one", "two", "three",
    "four", "five", "six", "seven",
    "eight", "nine",
]
extension Double {
    init?(fromEnglishString s: String) {
        // A digit word's position in the array is its numeric value.
        if let digit = digits.firstIndex(of: s) {
            self.init(digit)
        } else {
            return nil
        }
    }
}
let d = Double(fromEnglishString: "one")
// d is Optional(1.0)
The hard part is going to be finding a good parser for all the ways you can express numbers in English (especially floating-point numbers). That's much trickier. You might find this more language-agnostic answer interesting.
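For illustration, here's a minimal sketch of the direction a fuller parser might take, assuming you only need cardinals up to ninety-nine plus "point"-separated fractional digits. The units and tens tables and the parseCardinal/parseEnglishNumber functions are hypothetical names for this sketch, not an existing API:
import Foundation

let units = [
    "zero": 0, "one": 1, "two": 2, "three": 3, "four": 4,
    "five": 5, "six": 6, "seven": 7, "eight": 8, "nine": 9,
    "ten": 10, "eleven": 11, "twelve": 12, "thirteen": 13,
    "fourteen": 14, "fifteen": 15, "sixteen": 16, "seventeen": 17,
    "eighteen": 18, "nineteen": 19,
]
let tens = [
    "twenty": 20, "thirty": 30, "forty": 40, "fifty": 50,
    "sixty": 60, "seventy": 70, "eighty": 80, "ninety": 90,
]

// Parse a single cardinal word (or hyphenated pair) into an Int.
func parseCardinal(_ word: String) -> Int? {
    if let n = units[word] { return n }
    if let n = tens[word] { return n }
    // Handle hyphenated compounds like "twenty-three".
    let parts = word.split(separator: "-").map(String.init)
    if parts.count == 2, let t = tens[parts[0]],
        let u = units[parts[1]], u > 0, u < 10 {
        return t + u
    }
    return nil
}

// Parse e.g. "three point one four" into 3.14.
func parseEnglishNumber(_ s: String) -> Double? {
    let halves = s.components(separatedBy: " point ")
    guard halves.count <= 2, let whole = parseCardinal(halves[0]) else {
        return nil
    }
    var result = Double(whole)
    if halves.count == 2 {
        // After "point", each word names a single digit.
        var scale = 0.1
        for word in halves[1].split(separator: " ").map(String.init) {
            guard let digit = units[word], digit < 10 else { return nil }
            result += Double(digit) * scale
            scale /= 10
        }
    }
    return result
}

parseEnglishNumber("twenty-three")          // Optional(23.0)
parseEnglishNumber("three point one four")  // Optional(3.14), give or take floating-point rounding
Anything beyond that ("hundred", "thousand", "and", negatives) is where a proper grammar, or the linked answer's approach, starts to pay off.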
You could also write an ExpressibleByStringLiteral extension for it (this protocol was called StringLiteralConvertible in early versions of Swift). However, this only applies when you are initializing your value directly from a string literal at compile time – which would be a bit pointless: do you really need word-based number literals in your source code? The other problem is that literal initializers can't be failable, so you'll be stuck with returning a default value (maybe Double.nan?) if the string can't be parsed.
Nevertheless, if you really want one:
extension Double: ExpressibleByStringLiteral {
    // StringLiteralType is inferred as String, and the standard library
    // supplies the unicode-scalar and grapheme-cluster requirements for
    // free, so init(stringLiteral:) is the only thing to implement.
    public init(stringLiteral value: String) {
        // Literal initializers can't fail, so fall back to a default
        // when the string can't be parsed.
        self = Double(fromEnglishString: value) ?? 0.0
    }
}
let doubleFromLiteral: Double = "three"
// doubleFromLiteral is 3.0
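If you'd rather be able to detect a failed parse after the fact, substitute Double.nan for the 0.0 fallback and test the result with isNaN.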