
I'm trying to make UIFont conform to Decodable, but I'm having a hard time.

I currently have a solution where I'm wrapping the UIFont in a Font struct, like this:

public struct Font: Decodable {
    let font: UIFont

    private enum CodingKeys: String, CodingKey {
        case size
        case font
    }

    public init(from decoder: Decoder) {
        do {
            let values = try decoder.container(keyedBy: CodingKeys.self)
            font = UIFont(name: try values.decode(String.self, forKey: .font),
                          size: try values.decode(CGFloat.self, forKey: .size))!
        } catch {
            fatalError("Font configuration error: \(error)")
        }
    }
}

This works, but it seems clumsy, so I was trying this instead:

final class Font: UIFont, Decodable {
    private enum CodingKeys: String, CodingKey {
        case size
        case font
    }

    convenience init(from decoder: Decoder) {
        do {
            let values = try decoder.container(keyedBy: CodingKeys.self)
            super.init(name: try values.decode(String.self, forKey: .font),
                       size: try values.decode(CGFloat.self, forKey: .size))
        } catch {
            fatalError("Font configuration error: \(error)")
        }
    }
}

This, however, does not work, because init(from decoder: Decoder) cannot be a failable initialiser, UIFont.init(name: String, size: CGFloat) is a failable initialiser, and calling a failable init from a non-failable one is not possible.

Any suggestions on how to make UIFont conform to Decodable without wrapping it would be highly appreciated.

iCediCe
    That is indeed a puzzle. There's a generic answer here https://stackoverflow.com/questions/26440263/swift-cannot-assign-to-self-in-a-method but that requires us to be able to set the properties *after* a generic `init()`, which is no good for `UIFont`, as we can't set size &c. directly. Perhaps `UIFontDescriptor` as `Decodable` might be the answer, rather than `UIFont`? – Grimxn May 29 '19 at 14:03
  • "`init(from decoder: Decoder)` can not be a failable initialiser" — quite opposite, it should be failable (or to be precise: it should be able to throw an error). – user28434'mstep May 29 '19 at 15:35
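As user28434's comment points out, init(from:) is allowed to be declared throws. A minimal sketch of the wrapper struct from the question, rewritten so a failed font lookup surfaces as a DecodingError instead of a fatalError (it still wraps UIFont, so it does not answer the "no wrapper" part of the question):

```swift
import UIKit

// Sketch: the same wrapper, but with a throwing init(from:) so callers
// can handle a missing font name instead of crashing.
public struct Font: Decodable {
    public let font: UIFont

    private enum CodingKeys: String, CodingKey {
        case size
        case font
    }

    public init(from decoder: Decoder) throws {
        let values = try decoder.container(keyedBy: CodingKeys.self)
        let name = try values.decode(String.self, forKey: .font)
        let size = try values.decode(CGFloat.self, forKey: .size)
        // Report the failable UIFont lookup as a decoding error.
        guard let font = UIFont(name: name, size: size) else {
            throw DecodingError.dataCorruptedError(
                forKey: .font,
                in: values,
                debugDescription: "No font named '\(name)' is available"
            )
        }
        self.font = font
    }
}
```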

1 Answer


I'm writing from my cell phone so I can't try this snippet, but I think this can work. Let me know if it helps.

final class Font: Codable {
    let size: CGFloat
    let name: String

    var font: UIFont = UIFont()

    init(s: CGFloat, n: String) {
        size = s
        name = n

        font = UIFont(name: name, size: size) ?? UIFont.systemFont(ofSize: size)
    }

    enum CodingKeys: String, CodingKey {
        case size
        case name
    }

    required init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)

        size = try container.decode(CGFloat.self, forKey: .size)
        name = try container.decode(String.self, forKey: .name)

        font = UIFont(name: name, size: size) ?? UIFont.systemFont(ofSize: size)
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)

        try container.encode(size, forKey: .size)
        try container.encode(name, forKey: .name)
    }
}
Alastar
    That looks like solution #1 from the question, that works but is not desirable. – user28434'mstep May 29 '19 at 15:37
  • As stated above, this will work, but it's still just a wrapper around UIFont, which is what I want to avoid. Mainly because I want to avoid having to write something like Configuration.Fonts.Main.Font when using it.... – iCediCe May 29 '19 at 19:45
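Following up on Grimxn's comment about UIFontDescriptor: a hedged sketch of that direction. Unlike UIFont(name:size:), UIFontDescriptor.init(fontAttributes:) is non-failable, so a final subclass can satisfy Decodable with a throwing convenience initializer. The class name DecodableFontDescriptor and the key names are my own choice, not from the thread:

```swift
import UIKit

// Sketch: subclass UIFontDescriptor (whose designated init is non-failable)
// rather than UIFont. In a final class, a convenience init may satisfy
// the Decodable requirement without being marked `required`.
final class DecodableFontDescriptor: UIFontDescriptor, Decodable {
    private enum CodingKeys: String, CodingKey {
        case font
        case size
    }

    convenience init(from decoder: Decoder) throws {
        let values = try decoder.container(keyedBy: CodingKeys.self)
        let name = try values.decode(String.self, forKey: .font)
        let size = try values.decode(CGFloat.self, forKey: .size)
        // Delegate to the non-failable designated initializer.
        self.init(fontAttributes: [.name: name, .size: size])
    }
}
```

Usage would then be along the lines of `UIFont(descriptor: descriptor, size: 0)`, where a point size of 0 tells UIFont to use the size stored in the descriptor. The trade-off is that call sites still need one extra step to turn the descriptor into a UIFont.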