@ahemadabbas-vagh shared a good and reusable approach. But if we talk about computer science, a string's Boolean value should not be determined like that. The approach below is the same as his, except that it asks Bool whether the value is true or false, not String. Bool should know whether a value is true or false, not String.
var str = "1"
extension Bool {
static func from(stringValue str: String) -> Bool {
return str == "1"
}
}
if Bool.from(stringValue: str) {
print("TRUE")
} else {
print("FALSE")
}
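As a side note, Swift's Bool already has a failable initializer from a string (via LosslessStringConvertible), but it only accepts the exact literals "true" and "false", so it cannot handle "1" and "0" on its own:

Bool("true")   // Optional(true)
Bool("false")  // Optional(false)
Bool("1")      // nil; numeric strings are not recognized
Bool("TRUE")   // nil; matching is case-sensitive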
But it is still not correct. In computer science, null/nil and the empty string should be false, and any other string should be true. You can add the string "0" to that falsy set as well.
You can implement it like this; better approaches may still exist. The idea is: a string converts to false if it is "0", "" (the empty string) or null/nil; any other string, including "true", converts to true.
var str = "1" // check for nil, empty string and zero; these should be false; any other string should be true.
extension Bool {
static func from(stringValue str: String) -> Bool {
if str == "true" {
return true
}
guard let intVal = Int(str) else {
return false
}
return Bool(truncating: intVal as NSNumber)
}
}
if Bool.from(stringValue: str) {
print("TRUE")
} else {
print("FALSE")
}
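The rule above also mentions null/nil, but from(stringValue:) takes a non-optional String. Here is a minimal sketch of an overload that covers the nil case; this String? overload is my addition, not part of the original code:

extension Bool {
    // Assumption: a missing (nil) string simply counts as false.
    static func from(stringValue str: String?) -> Bool {
        guard let str = str else { return false }
        return Bool.from(stringValue: str) // falls through to the String overload
    }
}

let missing: String? = nil
print(Bool.from(stringValue: missing)) // prints "false"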
If you use the Decodable protocol, I know you know how to handle it, but here it is:
struct AStruct: Decodable {
    let boolVal: Bool

    enum CodingKeys: String, CodingKey {
        case completed
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        let boolString = try container.decode(String.self, forKey: .completed)
        self.boolVal = Bool.from(stringValue: boolString)
    }
}
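For completeness, a minimal sketch of decoding it with JSONDecoder; the JSON payload here is made up for illustration:

import Foundation

let json = #"{ "completed": "1" }"#.data(using: .utf8)!
let decoded = try JSONDecoder().decode(AStruct.self, from: json)
print(decoded.boolVal) // true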