120

In UIKit we could use an extension to set a hex color on almost anything, like in this tutorial. But when I try to do the same in SwiftUI it doesn't work; it looks like SwiftUI does not accept a UIColor as a parameter.

Text(text)
    .color(UIColor.init(hex: "FFF"))

Error message:

Cannot convert value of type 'UIColor' to expected argument type 'Color?'

I even tried to make an extension for Color instead of UIColor, but I haven't had any luck.

My extension for Color:

import SwiftUI

extension Color {
    init(hex: String) {
        let scanner = Scanner(string: hex)
        scanner.scanLocation = 0
        var rgbValue: UInt64 = 0
        scanner.scanHexInt64(&rgbValue)
        
        let r = (rgbValue & 0xff0000) >> 16
        let g = (rgbValue & 0xff00) >> 8
        let b = rgbValue & 0xff
        
        self.init(
            red: CGFloat(r) / 0xff,
            green: CGFloat(g) / 0xff,
            blue: CGFloat(b) / 0xff, alpha: 1
        )
    }
}

Error message:

Incorrect argument labels in call (have 'red:green:blue:alpha:', expected '_:red:green:blue:opacity:')
McKinley
SinaMN75
  • The init is this one: https://developer.apple.com/documentation/swiftui/color/3265484-init It's missing a parameter, as you can see in your error message: `'red:green:blue:alpha:'` vs `'_:red:green:blue:opacity:'`; note the `_:` at the start, which is for the `_ colorSpace:`, and `opacity` vs `alpha`. – Larme Jul 03 '19 at 16:12
  • @Larme yes I tried that, it fixed the compile error, but there was no result; it does not set the color on the view. Did you solve it yourself? If you did, please add the code. – SinaMN75 Jul 03 '19 at 16:22

13 Answers

176

You're almost there; you were just using the wrong initialiser parameters:

extension Color {
    init(hex: String) {
        let hex = hex.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int: UInt64 = 0
        Scanner(string: hex).scanHexInt64(&int)
        let a, r, g, b: UInt64
        switch hex.count {
        case 3: // RGB (12-bit)
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            (a, r, g, b) = (1, 1, 1, 0)
        }

        self.init(
            .sRGB,
            red: Double(r) / 255,
            green: Double(g) / 255,
            blue:  Double(b) / 255,
            opacity: Double(a) / 255
        )
    }
}
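
With this extension in place, any of the supported lengths should work. A small usage sketch (my addition, for illustration):

Text("Hello")
    .foregroundColor(Color(hex: "#f5bc53"))   // 6-digit, the leading # is trimmed
let shorthand = Color(hex: "FFF")              // 3-digit shorthand
let translucent = Color(hex: "80FF0000")       // 8-digit ARGB, alpha in the high byte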
P1xelfehler
kontiki
  • 2
    it solved the compile error, thanks, but it did not set the color of the views in SwiftUI; no error but no result – SinaMN75 Jul 03 '19 at 16:26
  • 2
    I tried with Color("ff00ff") and it worked fine. What are you passing as hex? – kontiki Jul 03 '19 at 16:27
  • Please also indicate what color you get for a specific hex parameter. – kontiki Jul 03 '19 at 16:37
  • Your solution doesn't work for '#hexColorStr'. Please use mine: https://stackoverflow.com/questions/36341358/how-to-convert-uicolor-to-string-and-string-to-uicolor-using-swift#answer-62192394 – chatlanin Jun 04 '20 at 10:29
  • This fails: `XCTAssertEqual(Color(hex: "0xFFFFFF"), Color(red: 255, green: 255, blue: 255))`, along with "ffffff" and "FFFFFF" – ScottyBlades Apr 04 '21 at 17:12
  • `XCTAssertEqual(Color(hex: 0xFFFFFF), Color(red: 255 / 255, green: 255 / 255, blue: 255 / 255))` or `XCTAssertEqual(Color(hex: 0xFFFFFF), Color(red: 1, green: 1, blue: 1))` will succeed – Bocaxica Aug 28 '23 at 13:00
83

Here is another alternative that uses an Int for the hex value; of course, it can be changed to String if you prefer (see the sketch after the usage examples below).

extension Color {
    init(hex: UInt, alpha: Double = 1) {
        self.init(
            .sRGB,
            red: Double((hex >> 16) & 0xff) / 255,
            green: Double((hex >> 08) & 0xff) / 255,
            blue: Double((hex >> 00) & 0xff) / 255,
            opacity: alpha
        )
    }
}

Usage examples:

Color(hex: 0x000000)
Color(hex: 0x000000, alpha: 0.2)
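
As mentioned above, the Int version can be adapted to take a String. One possible convenience (my own sketch, not part of the original answer) that strips an optional "#", parses the digits, and forwards to the UInt initialiser above:

import Foundation
import SwiftUI

extension Color {
    // Hypothetical helper: accepts "RRGGBB" or "#RRGGBB" and delegates to init(hex:alpha:).
    init(hexString: String, alpha: Double = 1) {
        var value: UInt64 = 0
        Scanner(string: hexString.replacingOccurrences(of: "#", with: "")).scanHexInt64(&value)
        self.init(hex: UInt(value), alpha: alpha)
    }
}

// Usage: Color(hexString: "#969696")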
Sam Soffes
Tolgahan Arıkan
  • 1
    That's a good implementation! How would you use String instead of Int? – Frederic Adda Dec 07 '19 at 07:10
  • 2
    For anyone interested, this approach is explained (in a general context not related to SwiftUI) in The Swift Programming Language book, in [Advanced Operators](https://docs.swift.org/swift-book/LanguageGuide/AdvancedOperators.html#). The whole chapter is worth reading. TIP: The key to understanding is the right shift and bitwise AND, and the simplest examples are 1. halving a number using right shift (number >> 1) and 2. checking whether a number is odd (number & 0x1 == 1). The [Bitwise_operation](https://en.wikipedia.org/wiki/Bitwise_operation) Wikipedia article is worth reading as well. – Shengchalover Aug 10 '20 at 14:15
  • Why did you create a tuple only to extract all of its values in the very next statement? It doesn't make sense. – Peter Schorn Aug 19 '20 at 17:34
  • @PeterSchorn yeah makes sense, I removed the tuple. Thanks! – Tolgahan Arıkan Aug 26 '20 at 16:29
  • @TolgahanArıkan No problem. Glad I could help. – Peter Schorn Aug 27 '20 at 04:32
  • This test fails: `XCTAssertEqual(Color(hex: 0xffffff), Color(red: 255, green: 255, blue: 255))` – ScottyBlades Apr 04 '21 at 17:10
  • @ScottyBlades this is because Color(red: green: blue:) expects values between 0 and 1, where 0 represents 0x00 and 1 represents 0xFF and anything in between is a fraction of that. Change your test to `XCTAssertEqual(Color(hex: 0xFFFFFF), Color(red: 255 / 255, green: 255 / 255, blue: 255 / 255)) ` or `XCTAssertEqual(Color(hex: 0xFFFFFF), Color(red: 1, green: 1, blue: 1))` and your test will succeed. – Bocaxica Aug 28 '23 at 12:58
44

Try this:

extension Color {
    init(hex: Int, opacity: Double = 1.0) {
        let red = Double((hex & 0xff0000) >> 16) / 255.0
        let green = Double((hex & 0xff00) >> 8) / 255.0
        let blue = Double((hex & 0xff) >> 0) / 255.0
        self.init(.sRGB, red: red, green: green, blue: blue, opacity: opacity)
    }
}

Usage:

Text("Hello World!")
    .background(Color(hex: 0xf5bc53))

Text("Hello World!")
    .background(Color(hex: 0xf5bc53, opacity: 0.8))
McKinley
41

Here is a Playground with my solution. It adds fallback after fallback and relies only on the hex string for both color and alpha.

import SwiftUI

extension Color {
    init(hex string: String) {
        var string: String = string.trimmingCharacters(in: CharacterSet.whitespacesAndNewlines)
        if string.hasPrefix("#") {
            _ = string.removeFirst()
        }

        // Double the last value if incomplete hex
        if !string.count.isMultiple(of: 2), let last = string.last {
            string.append(last)
        }

        // Fix invalid values
        if string.count > 8 {
            string = String(string.prefix(8))
        }

        // Scanner creation
        let scanner = Scanner(string: string)

        var color: UInt64 = 0
        scanner.scanHexInt64(&color)

        if string.count == 2 {
            let mask = 0xFF

            let g = Int(color) & mask

            let gray = Double(g) / 255.0

            self.init(.sRGB, red: gray, green: gray, blue: gray, opacity: 1)

        } else if string.count == 4 {
            let mask = 0x00FF

            let g = Int(color >> 8) & mask
            let a = Int(color) & mask

            let gray = Double(g) / 255.0
            let alpha = Double(a) / 255.0

            self.init(.sRGB, red: gray, green: gray, blue: gray, opacity: alpha)

        } else if string.count == 6 {
            let mask = 0x0000FF
            let r = Int(color >> 16) & mask
            let g = Int(color >> 8) & mask
            let b = Int(color) & mask

            let red = Double(r) / 255.0
            let green = Double(g) / 255.0
            let blue = Double(b) / 255.0

            self.init(.sRGB, red: red, green: green, blue: blue, opacity: 1)

        } else if string.count == 8 {
            let mask = 0x000000FF
            let r = Int(color >> 24) & mask
            let g = Int(color >> 16) & mask
            let b = Int(color >> 8) & mask
            let a = Int(color) & mask

            let red = Double(r) / 255.0
            let green = Double(g) / 255.0
            let blue = Double(b) / 255.0
            let alpha = Double(a) / 255.0

            self.init(.sRGB, red: red, green: green, blue: blue, opacity: alpha)

        } else {
            self.init(.sRGB, red: 1, green: 1, blue: 1, opacity: 1)
        }
    }
}

let gray0 = Color(hex: "3f")
let gray1 = Color(hex: "#69")
let gray2 = Color(hex: "#6911")
let gray3 = Color(hex: "fff")
let red = Color(hex: "#FF000044s")
let green = Color(hex: "#00FF00")
let blue0 = Color(hex: "0000FF")
let blue1 = Color(hex: "0000F")

As for getting the hex string back out of a Color: that is not a public API, so we still need to rely on UIColor for it.
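
If you do need to go that way, here is one hedged sketch (my addition, assuming iOS 14+, where a UIColor can be initialised from a Color):

import SwiftUI
import UIKit

extension Color {
    // Sketch only: converts a Color to "RRGGBB" (optionally "RRGGBBAA") by bridging through UIColor.
    func hexString(includeAlpha: Bool = false) -> String? {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard UIColor(self).getRed(&r, green: &g, blue: &b, alpha: &a) else { return nil }
        let rgb = String(format: "%02X%02X%02X", Int(r * 255), Int(g * 255), Int(b * 255))
        return includeAlpha ? rgb + String(format: "%02X", Int(a * 255)) : rgb
    }
}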

PS: I saw the components-based solution below, but if that API changes in the future, my version should be a bit more stable.

Stefan
  • 2
    This is the best answer here. If you add opacity as a param, it will be the most complete one. – FRIDDAY Nov 02 '19 at 07:16
  • 4
    Agreed, I've tried numerous solutions and this is the best answer here. I don't know why it's not upvoted more than that. Props @Stefan ! As for the opacity, just chain it like you would normally do in SwiftUI... Color(hex: "#003366").opacity(0.2) – PsykX Jun 17 '20 at 11:44
14

Best Practice

This method is the intended way to define custom colours in an app. Once set, the colours are accessible across all views and are easy to update. Using one is also a single line of code.

  1. Open your 'Assets' folder from the left panel where your project files are located.

  2. From within the Assets folder, there should be a '+' button on the left side; click it and select 'Color Set'. Once created, rename it to 'Color1'.

  3. Click the color square in the middle of the page, then from the inspector panel on the right click 'Show Color Panel' and select your desired color.

  4. Go back to your code where you want to change the text color:

     Text("Hello")
         .foregroundColor(Color("Color1"))
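
Optionally (my own addition, not part of the steps above), asset-catalog colors can be wrapped in static properties so call sites stay short:

extension Color {
    static let color1 = Color("Color1")   // assumes the "Color1" color set from step 2 exists
}

// Usage: Text("Hello").foregroundColor(.color1)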
    
Hao
3

I also used the solution for UIColor by hackingwithswift. This is an adapted version for Color:

extension Color {
    init?(hex: String) {
        var hexSanitized = hex.trimmingCharacters(in: .whitespacesAndNewlines)
        hexSanitized = hexSanitized.replacingOccurrences(of: "#", with: "")

        var rgb: UInt64 = 0

        var red: Double = 0.0
        var green: Double = 0.0
        var blue: Double = 0.0
        var opacity: Double = 1.0

        let length = hexSanitized.count

        guard Scanner(string: hexSanitized).scanHexInt64(&rgb) else { return nil }

        if length == 6 {
            red = Double((rgb & 0xFF0000) >> 16) / 255.0
            green = Double((rgb & 0x00FF00) >> 8) / 255.0
            blue = Double(rgb & 0x0000FF) / 255.0

        } else if length == 8 {
            red = Double((rgb & 0xFF000000) >> 24) / 255.0
            green = Double((rgb & 0x00FF0000) >> 16) / 255.0
            blue = Double((rgb & 0x0000FF00) >> 8) / 255.0
            opacity = Double(rgb & 0x000000FF) / 255.0

        } else {
            return nil
        }

        self.init(.sRGB, red: red, green: green, blue: blue, opacity: opacity)
    }
}
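
Because this initialiser is failable, call sites have to handle the optional; for example (hypothetical usage):

let accent = Color(hex: "#34C759") ?? .green   // falls back to a system color if parsing fails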
Patrick_K
1

For this task we have to use the bitwise right-shift (>>) operator and the bitwise AND (&) operator. Each color channel in a hexadecimal pattern uses 8 bits, i.e. decimal values from 0 to 255, or hexadecimal values from 0x00 to 0xFF.

Bitwise operators are used here to decompose the hexadecimal value efficiently, masking and shifting the red component of the color back by 2 bytes (16 bits) and the green component back by one byte, respectively. Let's see how it works in Xcode's Live View.


0xFF     == 0x0000FF    // blue
0xFF00   == 0x00FF00    // green
0xFF0077 == 0xFF0077    // pink
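
As a concrete worked example (my own sketch of the arithmetic the initialiser below performs), decomposing 0xFF0077 by hand:

let hexColor: UInt32 = 0xFF0077
let red   = (hexColor >> 16) & 0xFF   // 0xFF → 255
let green = (hexColor >> 8)  & 0xFF   // 0x00 → 0
let blue  =  hexColor        & 0xFF   // 0x77 → 119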

Here's the code:

import SwiftUI

extension Color {
    init(_ hexColor: UInt32) {
        self.init(uiColor: .init(
                      red: CGFloat(0xFF & (hexColor >> 0x10)) / 0xFF,
                    green: CGFloat(0xFF & (hexColor >> 0x08)) / 0xFF,
                     blue: CGFloat(0xFF & (hexColor >> 0x00)) / 0xFF,
                    alpha: 1.0))
    }
}

struct ContentView: View {
    
    @State private var hexColor: UInt32 = 0xFF0077    // Deep Pink HEX color
    
    var body: some View {
        ZStack {
            Color.black.ignoresSafeArea()
            Rectangle()
                .frame(width: 300, height: 300)
                .foregroundColor(.init(hexColor))
        }
    }
}
Andy Jazz
1

You can create an extension like this:

import SwiftUI

extension Color {
    init(hex: UInt, alpha: Double = 1) {
        self.init(
            .sRGB,
            red: Double((hex >> 16) & 0xff) / 255,
            green: Double((hex >> 08) & 0xff) / 255,
            blue: Double((hex >> 00) & 0xff) / 255,
            opacity: alpha
        )
    }
}

How to use it:

Text("In order to write about life first you must live it")
    .foregroundColor(Color(hex: 0x969696))

Use 0x before the 6-digit hex value.

Muju
0

Usage:

UIColor.init(hex: "f2000000")
UIColor.init(hex: "#f2000000")
UIColor.init(hex: "000000")
UIColor.init(hex: "#000000")

extension UIColor {
    public convenience init(hex: String) {
        var cString: String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

        if cString.hasPrefix("#") {
            cString.remove(at: cString.startIndex)
        }

        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 1.0

        var rgbValue: UInt64 = 0
        Scanner(string: cString).scanHexInt64(&rgbValue)

        if cString.count == 8 {
            // 8 characters: AARRGGBB (alpha in the top byte)
            r = CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0
            g = CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0
            b = CGFloat(rgbValue & 0x0000FF) / 255.0
            a = CGFloat((rgbValue & 0xFF000000) >> 24) / 255.0
        } else if cString.count == 6 {
            // 6 characters: RRGGBB (fully opaque)
            r = CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0
            g = CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0
            b = CGFloat(rgbValue & 0x0000FF) / 255.0
            a = CGFloat(1.0)
        }

        self.init(red: r, green: g, blue: b, alpha: a)
    }
}
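
Since the question is about SwiftUI, a minimal bridging sketch (my addition, not part of the answer above):

import SwiftUI

let swatch = Color(UIColor(hex: "#f5bc53"))   // Color(_: UIColor) is available since iOS 13; iOS 15 adds Color(uiColor:)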
MMK
0

Usage:

Color(hex: "#FFFFFF") // hashtag + uppercase value
Color(hex: "#ffffff") // hashtag + lowercase value
Color(hex: "FFFFFF") // without hashtag
Color(hex: "FFFFFF", alpha: 0.2) // color value + alpha value
Color(hex: "#0080FF80") // long color & alpha value

My solution is based on code for UIColor that has performed well in a production environment. The original code is on GitHub.

extension Color {

    init(hex: String?, alpha: CGFloat? = nil) {
        let normalizedHexString: String = Color.normalize(hex)
        var ccc: CUnsignedLongLong = 0
        Scanner(string: normalizedHexString).scanHexInt64(&ccc)
        var resultAlpha: CGFloat {
            switch alpha {
            case nil: return ColorMasks.alphaValue(ccc)
            default: return alpha!
            }
        }
        self.init(CGColor(red: ColorMasks.redValue(ccc),
                          green: ColorMasks.greenValue(ccc),
                          blue: ColorMasks.blueValue(ccc),
                          alpha: resultAlpha))
    }

    func hexDescription(_ includeAlpha: Bool = false) -> String {
        guard let cgColor = self.cgColor else {
            return "Problem with cgColor"
        }
        guard cgColor.numberOfComponents == 4 else {
            return "Color not RGB."
        }
        guard let components = cgColor.components else {
            return "Problem with cgColor.components"
        }
        let aaa = components.map({ Int($0 * CGFloat(255)) })
        let color = String.init(format: "%02x%02x%02x", aaa[0], aaa[1], aaa[2])
        if includeAlpha {
            let alpha = String.init(format: "%02x", aaa[3])
            return "\(color)\(alpha)"
        }
        return color
    }

    fileprivate enum ColorMasks: CUnsignedLongLong {
        case redMask    = 0xff000000
        case greenMask  = 0x00ff0000
        case blueMask   = 0x0000ff00
        case alphaMask  = 0x000000ff

        static func redValue(_ value: CUnsignedLongLong) -> CGFloat {
            return CGFloat((value & redMask.rawValue) >> 24) / 255.0
        }

        static func greenValue(_ value: CUnsignedLongLong) -> CGFloat {
            return CGFloat((value & greenMask.rawValue) >> 16) / 255.0
        }

        static func blueValue(_ value: CUnsignedLongLong) -> CGFloat {
            return CGFloat((value & blueMask.rawValue) >> 8) / 255.0
        }

        static func alphaValue(_ value: CUnsignedLongLong) -> CGFloat {
            return CGFloat(value & alphaMask.rawValue) / 255.0
        }
    }

    fileprivate static func normalize(_ hex: String?) -> String {
        guard var hexString = hex else {
            return "00000000"
        }
        if hexString.hasPrefix("#") {
            hexString = String(hexString.dropFirst())
        }
        if hexString.count == 3 || hexString.count == 4 {
            hexString = hexString.map { "\($0)\($0)" } .joined()
        }
        let hasAlpha = hexString.count > 7
        if !hasAlpha {
            hexString += "ff"
        }
        return hexString
    }
}
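
A small round-trip example of the hexDescription(_:) helper above (the expected output is my own assumption):

let tealish = Color(hex: "#0080FF80")
print(tealish.hexDescription())      // expected "0080ff"
print(tealish.hexDescription(true))  // expected "0080ff80"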
Leon Jakonda
0

Head over to Assets.xcassets and add a new Color Set. Click 'Any Appearance' (for light mode) or 'Dark' (for dark mode), set 'Input Method' to '8-bit Hexadecimal', and enter your hex code. Optionally, give this custom color a name via the Name field (e.g. myCustomColor) and then refer to it in the UI as Color("myCustomColor").
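
For example (assuming the color set is named myCustomColor):

Text("Hello")
    .foregroundColor(Color("myCustomColor"))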

Kyoto
-1

You can use this extension for UIColor

extension UIColor {
    convenience init(hexaString: String, alpha: CGFloat = 1) {
        // Assumes a leading "#" followed by six hex digits ("#RRGGBB").
        let chars = Array(hexaString.dropFirst())
        self.init(red:   .init(strtoul(String(chars[0...1]), nil, 16)) / 255,
                  green: .init(strtoul(String(chars[2...3]), nil, 16)) / 255,
                  blue:  .init(strtoul(String(chars[4...5]), nil, 16)) / 255,
                  alpha: alpha)
    }
}

Usage Example:

let lightGoldColor  = UIColor(hexaString: "#D6CDB2")


Sawsan
-3

SwiftUI Color creation from hex (3, 4, 6, or 8 characters) with support for #, alpha, web constants, and UIColor constants. Usage examples below.

The Swift package (iOS 14+) includes support for Color hex, random colors, CSS colors, and UserDefaults.

[screenshot of usage examples]

Norman
  • 1
    I don't see `Color(hex:` in the docs nor in code completion. – ScottyBlades Apr 04 '21 at 17:17
  • @ScottyBlades Sorry about that. If you're using iOS 14 here's a package that will provide Color support for hex and UserDefaults as well. https://github.com/nbasham/BlackLabsSwiftUIColor – Norman Apr 05 '21 at 18:33