440

I am trying to use hex color values in Swift, instead of the few standard ones that UIColor allows you to use, but I have no idea how to do it.

Example: how would I use #ffffff as a color?

gi097
  • 7,313
  • 3
  • 27
  • 49
Stephen Fox
  • 14,190
  • 19
  • 48
  • 52
  • 1
    `#ffffff` are actually 3 color components in hexadecimal notation - red `ff`, green `ff` and blue `ff`. You can write hexadecimal notation in Swift using `0x` prefix, e.g `0xFF`. – Sulthan Jun 17 '14 at 11:57
  • 1
    The answer here should help you out https://stackoverflow.com/questions/24196528/swift-uicolor-initializer-compiler-error-only-when-targeting-iphone5s. – BooRanger Jun 17 '14 at 11:58
  • 1
    possible duplicate of [How to use UIColorFromRGB value in Swift](http://stackoverflow.com/questions/24074257/how-to-use-uicolorfromrgb-value-in-swift) – Jack Jun 17 '14 at 14:56
  • This is the one I use; there are many websites like this: www.hextoswift.com – amar Aug 14 '21 at 06:11

40 Answers

782

#ffffff are actually 3 color components in hexadecimal notation - red ff, green ff and blue ff. You can write hexadecimal notation in Swift using the 0x prefix, e.g. 0xFF.

To simplify the conversion, let's create an initializer that takes integer (0 - 255) values:

extension UIColor {
   convenience init(red: Int, green: Int, blue: Int) {
       assert(red >= 0 && red <= 255, "Invalid red component")
       assert(green >= 0 && green <= 255, "Invalid green component")
       assert(blue >= 0 && blue <= 255, "Invalid blue component")

       self.init(red: CGFloat(red) / 255.0, green: CGFloat(green) / 255.0, blue: CGFloat(blue) / 255.0, alpha: 1.0)
   }

   convenience init(rgb: Int) {
       self.init(
           red: (rgb >> 16) & 0xFF,
           green: (rgb >> 8) & 0xFF,
           blue: rgb & 0xFF
       )
   }
}

Usage:

let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF)
let color2 = UIColor(rgb: 0xFFFFFF)

How to get alpha?

Depending on your use case, you can simply use the native UIColor.withAlphaComponent method, e.g.

let semitransparentBlack = UIColor(rgb: 0x000000).withAlphaComponent(0.5)

Or you can add an additional (optional) parameter to the above methods:

convenience init(red: Int, green: Int, blue: Int, a: CGFloat = 1.0) {
    self.init(
        red: CGFloat(red) / 255.0,
        green: CGFloat(green) / 255.0,
        blue: CGFloat(blue) / 255.0,
        alpha: a
    )
}

convenience init(rgb: Int, a: CGFloat = 1.0) {
    self.init(
        red: (rgb >> 16) & 0xFF,
        green: (rgb >> 8) & 0xFF,
        blue: rgb & 0xFF,
        a: a
    )
}

(we cannot name the parameter alpha because of a name collision with the existing initializer).

Called as:

let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF, a: 0.5)
let color2 = UIColor(rgb: 0xFFFFFF, a: 0.5)

To get the alpha as an integer 0-255, we can write:

convenience init(red: Int, green: Int, blue: Int, a: Int = 0xFF) {
    self.init(
        red: CGFloat(red) / 255.0,
        green: CGFloat(green) / 255.0,
        blue: CGFloat(blue) / 255.0,
        alpha: CGFloat(a) / 255.0
    )
}

// let's suppose alpha is the first component (ARGB)
convenience init(argb: Int) {
    self.init(
        red: (argb >> 16) & 0xFF,
        green: (argb >> 8) & 0xFF,
        blue: argb & 0xFF,
        a: (argb >> 24) & 0xFF
    )
}

Called as:

let color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF, a: 0xFF)
let color2 = UIColor(argb: 0xFFFFFFFF)

Or a combination of the previous methods. There is absolutely no need to use strings.

Sulthan
  • 128,090
  • 22
  • 218
  • 270
  • @Sulthan Could you please also define an extension with alpha and hex value? – Michael Apr 25 '15 at 12:13
  • 1
    @confile No, because that's not standardized. Alpha can be the first component or the last. If you need alpha, just add one parameter `alpha` – Sulthan Apr 25 '15 at 13:20
  • @Sulthan Can I set alpha after the color has been created? – Michael Apr 25 '15 at 13:24
  • @confile Colors are immutable, so no, you have to create a new color with changed alpha. – Sulthan May 06 '15 at 20:35
  • 4
    Same solution, Swift 1.2 compatible, with alpha support: https://gist.github.com/berikv/ecf1f79c5bc9921c47ef – Berik May 21 '15 at 12:43
  • 3 drops of albino unicorn tears, a dash of meteorite dust tossed into the magma of a millennial eruption... Still easier than setting a color hahah, anyways it works! – Josh May 27 '15 at 16:06
  • assert(red >= 0 ... ) - why not use UInt and let the compiler take care of that for you? – jrc Aug 16 '15 at 19:02
  • 1
    For explanation of how this works, see [this](https://developer.apple.com/library/prerelease/ios/documentation/Swift/Conceptual/Swift_Programming_Language/AdvancedOperators.html) Apple Doc – Jacob R Nov 15 '15 at 15:42
  • 2
    Why not use UInt8 instead of asserting that your ints are in range 0...255? – Richard Venable Mar 21 '16 at 13:59
  • 1
    @jrc @RichardVenable Apple recommends to use `Int` even when only unsigned values are expected. Using one type simplifies operations, especially when we are working in a language without implicit casts. – Sulthan Apr 10 '17 at 12:35
  • Nice, but too lengthy, Here I have written a short method for this purpose https://handyopinion.com/method-to-convert-hexadecimal-color-code-into-uicolor-swift/ – Asad Ali Choudhry Mar 22 '20 at 15:21
  • @AsadAliChoudhry Lengthy? Technically it's a one-liner. You are parsing a string which is rarely needed and the same is written here in 5 answers already. – Sulthan Mar 22 '20 at 15:48
  • Great solution, I added *UIColor.init(rgb: Constants.Color.mainColor).cgColor* personally – ΩlostA May 07 '20 at 15:28
427

This is a function that takes a hex string and returns a UIColor.
(You can enter hex strings with either format: #ffffff or ffffff)

Usage:

let color1 = hexStringToUIColor(hex: "#d3d3d3")

Swift 5: (Swift 4+)

func hexStringToUIColor (hex:String) -> UIColor {
    var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

    if (cString.hasPrefix("#")) {
        cString.remove(at: cString.startIndex)
    }

    if ((cString.count) != 6) {
        return UIColor.gray
    }

    var rgbValue:UInt64 = 0
    Scanner(string: cString).scanHexInt64(&rgbValue)

    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}

Swift 3:

func hexStringToUIColor (hex:String) -> UIColor {
    var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

    if (cString.hasPrefix("#")) {
        cString.remove(at: cString.startIndex)
    }

    if ((cString.characters.count) != 6) {
        return UIColor.gray
    }

    var rgbValue:UInt32 = 0
    Scanner(string: cString).scanHexInt32(&rgbValue)

    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}

Swift 2:

func hexStringToUIColor (hex:String) -> UIColor {
    var cString:String = hex.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet() as NSCharacterSet).uppercaseString

    if (cString.hasPrefix("#")) {
      cString = cString.substringFromIndex(cString.startIndex.advancedBy(1))
    }

    if ((cString.characters.count) != 6) {
      return UIColor.grayColor()
    }

    var rgbValue:UInt32 = 0
    NSScanner(string: cString).scanHexInt(&rgbValue)

    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}



Source: arshad/gist:de147c42d7b3063ef7bc

Edit: Updated the code. Thanks, Hlung, jaytrixz, Ahmad F, Kegham K, and Adam Waite!

Ethan Strider
  • 7,849
  • 3
  • 24
  • 29
  • `countelements` is now just `count` :) – Hlung Apr 06 '15 at 16:24
  • @Hlung and @ethanstrider it looks like they don't even let you do `count` now instead of `countElements`, any idea what they want us to use? – SRMR Aug 30 '15 at 00:49
  • Before Swift 2.0, count was a global function. As of Swift 2.0, it can be called as a member function (i.e. aString.count). – Ethan Strider Sep 01 '15 at 16:22
  • 2
    Changed this line of code `cString = cString.substringFromIndex(advance(cString.startIndex, 1))` to `cString = cString.substringFromIndex(cString.startIndex.advancedBy(1))` for Swift 2.2 Xcode 7.3 – jaytrixz Apr 02 '16 at 07:29
  • And all you have to do for *Swift 4* is to remove `characters` from `cString.characters.count` :) – Ahmad F Dec 18 '17 at 21:23
  • 1
    If you are still supporting iPhone 5 or any 32 bit devices prior to iOS 11 it will crash. You need to change the `UInt32` to `UInt64` – Kegham K. Feb 23 '19 at 23:32
  • 1
    Xcode 11 and the iOS13 SDK deprecates `scanHexInt32`. Use a `UInt64` and `scanHexInt64` instead. – Adam Waite Aug 08 '19 at 05:58
  • This is perfect! – riciloma Feb 03 '20 at 16:05
247

Swift 5 (Swift 4, Swift 3) UIColor extension:

extension UIColor {
    convenience init(hexString: String) {
        let hex = hexString.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int = UInt64()
        Scanner(string: hex).scanHexInt64(&int)
        let a, r, g, b: UInt64
        switch hex.count {
        case 3: // RGB (12-bit); each 4-bit digit d expands to dd = 17 * d
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            (a, r, g, b) = (255, 0, 0, 0)
        }
        self.init(red: CGFloat(r) / 255, green: CGFloat(g) / 255, blue: CGFloat(b) / 255, alpha: CGFloat(a) / 255)
    }
}

Usage:

let darkGrey = UIColor(hexString: "#757575")

Swift 2.x version:

extension UIColor {
    convenience init(hexString: String) {
        let hex = hexString.stringByTrimmingCharactersInSet(NSCharacterSet.alphanumericCharacterSet().invertedSet)
        var int = UInt32()
        NSScanner(string: hex).scanHexInt(&int)
        let a, r, g, b: UInt32
        switch hex.characters.count {
        case 3: // RGB (12-bit)
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            (a, r, g, b) = (255, 0, 0, 0)
        }
        self.init(red: CGFloat(r) / 255, green: CGFloat(g) / 255, blue: CGFloat(b) / 255, alpha: CGFloat(a) / 255)
    }
}
Luca Torella
  • 7,974
  • 4
  • 38
  • 48
  • 4
    This is my favorite implementation because of the way it handles the 3 cases. But I prefer the default: case to return nil, instead of white. – Richard Venable Mar 21 '16 at 13:54
  • by the way, the default case in this implementation seems to be UIColor.yellow() – Gui Moura Apr 15 '16 at 19:33
  • I know how to use it and it works like a charm. But I don't really understand why. Maybe someone can give me an explanation or some good links/words to search? – kuzdu Nov 24 '16 at 09:50
  • 4
    This does not work correctly with alpha values. e.g. Both inputs "ff00ff00" and "#ff00ff00" will output an RGBA of 0 1 0 1. (It should be 1 0 1 0). The input "#ff00ff" results in 1 0 1 1, which is correct. (Xcode 8.2.1, iOS 9.3.) – Womble Jan 06 '17 at 03:39
  • 3
    @Womble the first component is the alpha not the last one. So "#ff00ff00" has alpha 1 because of the "ff" at the beginning. I think you meant "#00ff00ff". Another example: "#ff00ff00" this is green with alpha 1, "#0000ff00" this is green with alpha 0 – Luca Torella Jan 06 '17 at 10:09
  • 'scanHexInt32' was deprecated in iOS 13.0 – Peter Lapisu Nov 13 '19 at 17:57
  • @PeterLapisu true, I updated the snippet, now we use UInt64 – Luca Torella Nov 14 '19 at 16:04
  • Hey, can someone please explain why in the 12-bit case we need to multiply by 17? – Nhat Dinh Jul 06 '21 at 08:46
  • Never mind, I got it: 12-bit hex is the shorthand hexadecimal form. – Nhat Dinh Jul 06 '21 at 08:57
  • https://www.hackingwithswift.com/example-code/uicolor/how-to-convert-a-hex-color-to-a-uicolor – Carmen Jun 04 '22 at 09:41
92

UIColor:

extension UIColor {

    convenience init(hex: Int) {
        let components = (
            R: CGFloat((hex >> 16) & 0xff) / 255,
            G: CGFloat((hex >> 08) & 0xff) / 255,
            B: CGFloat((hex >> 00) & 0xff) / 255
        )
        self.init(red: components.R, green: components.G, blue: components.B, alpha: 1)
    }

}

CGColor:

extension CGColor {

    class func colorWithHex(hex: Int) -> CGColorRef {

        return UIColor(hex: hex).CGColor

    }

}

Usage

let purple = UIColor(hex: 0xAB47BC)
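
Note: the CGColor helper above uses Swift 2-era naming (CGColorRef, .CGColor). Under Swift 3+ naming, the same result is simply (a minimal sketch reusing the extension above):

// Swift 3+: CGColorRef is just CGColor, and the property is lowercase cgColor
let purpleCGColor = UIColor(hex: 0xAB47BC).cgColor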
Michael
  • 9,639
  • 3
  • 64
  • 69
Rudolf Adamkovič
  • 31,030
  • 13
  • 103
  • 118
47

I've merged a few ideas from this thread of answers and updated it for iOS 13 & Swift 5.

extension UIColor {
  
  convenience init(_ hex: String, alpha: CGFloat = 1.0) {
    var cString = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
    
    if cString.hasPrefix("#") { cString.removeFirst() }
    
    if cString.count != 6 {
      self.init("ff0000") // return red color for wrong hex input
      return
    }
    
    var rgbValue: UInt64 = 0
    Scanner(string: cString).scanHexInt64(&rgbValue)
    
    self.init(red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
              green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
              blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
              alpha: alpha)
  }

}

You can then use it like this:

UIColor("#ff0000") // with #
UIColor("ff0000")  // without #
UIColor("ff0000", alpha: 0.5) // using optional alpha value
budiDino
  • 13,044
  • 8
  • 95
  • 91
46

Swift 4: Combining the answers of Sulthan and Luca Torella:

extension UIColor {
    convenience init(hexFromString:String, alpha:CGFloat = 1.0) {
        var cString:String = hexFromString.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
        var rgbValue:UInt32 = 10066329 //color #999999 if string has wrong format

        if (cString.hasPrefix("#")) {
            cString.remove(at: cString.startIndex)
        }

        if ((cString.count) == 6) {
            Scanner(string: cString).scanHexInt32(&rgbValue)
        }

        self.init(
            red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
            alpha: alpha
        )
    }
}

Usage examples:

let myColor = UIColor(hexFromString: "4F9BF5")

let myColor = UIColor(hexFromString: "#4F9BF5")

let myColor = UIColor(hexFromString: "#4F9BF5", alpha: 0.5)
EduardoInTech
  • 593
  • 1
  • 6
  • 9
35

Swift 5.3 & SwiftUI: Hex and CSS color name support via a UIColor

Gist code

Swift Package

SwiftUI Package

Example strings:

  • Orange, Lime, Tomato, etc.
  • Clear, Transparent, nil, and empty string yield [UIColor clearColor]
  • abc
  • abc7
  • #abc7
  • 00FFFF
  • #00FFFF
  • 00FFFF77

Playground output: [image]

Norman
  • 3,020
  • 22
  • 21
27

The simplest way to add a color programmatically is by using ColorLiteral.

Just add the property ColorLiteral as shown in the example, and Xcode will prompt you with a whole list of colors from which you can choose. The advantage of doing so is less code, plus the ability to add HEX values or RGB. You will also get the recently used colors from the storyboard.

Example: self.view.backgroundColor = ColorLiteral

[screenshot]
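
For reference, a color literal is written as #colorLiteral(...) in source, and Xcode renders it as a swatch. A minimal sketch (the component values are just an example for white):

// Xcode displays this literal as a clickable color swatch
view.backgroundColor = #colorLiteral(red: 1.0, green: 1.0, blue: 1.0, alpha: 1.0)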

puneeth
  • 743
  • 7
  • 8
24

With Swift 2.0 and Xcode 7.0.1 you can create this function:

    // Creates a UIColor from a Hex string.
    func colorWithHexString (hex:String) -> UIColor {
        var cString:String = hex.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).uppercaseString

        if (cString.hasPrefix("#")) {
            cString = (cString as NSString).substringFromIndex(1)
        }

        if (cString.characters.count != 6) {
            return UIColor.grayColor()
        }

        let rString = (cString as NSString).substringToIndex(2)
        let gString = ((cString as NSString).substringFromIndex(2) as NSString).substringToIndex(2)
        let bString = ((cString as NSString).substringFromIndex(4) as NSString).substringToIndex(2)

        var r:CUnsignedInt = 0, g:CUnsignedInt = 0, b:CUnsignedInt = 0;
        NSScanner(string: rString).scanHexInt(&r)
        NSScanner(string: gString).scanHexInt(&g)
        NSScanner(string: bString).scanHexInt(&b)


        return UIColor(red: CGFloat(r) / 255.0, green: CGFloat(g) / 255.0, blue: CGFloat(b) / 255.0, alpha: CGFloat(1))
    }

and then use it in this way:

let color1 = colorWithHexString("#1F437C")

Updated For Swift 4

func colorWithHexString (hex:String) -> UIColor {

    var cString = hex.trimmingCharacters(in: CharacterSet.whitespacesAndNewlines).uppercased()

    if (cString.hasPrefix("#")) {
        cString = (cString as NSString).substring(from: 1)
    }

    if (cString.count != 6) {
        return UIColor.gray
    }

    let rString = (cString as NSString).substring(to: 2)
    let gString = ((cString as NSString).substring(from: 2) as NSString).substring(to: 2)
    let bString = ((cString as NSString).substring(from: 4) as NSString).substring(to: 2)

    var r:CUnsignedInt = 0, g:CUnsignedInt = 0, b:CUnsignedInt = 0;
    Scanner(string: rString).scanHexInt32(&r)
    Scanner(string: gString).scanHexInt32(&g)
    Scanner(string: bString).scanHexInt32(&b)


    return UIColor(red: CGFloat(r) / 255.0, green: CGFloat(g) / 255.0, blue: CGFloat(b) / 255.0, alpha: CGFloat(1))
}
Muneeb Ali
  • 472
  • 6
  • 14
Jorge Casariego
  • 21,948
  • 6
  • 90
  • 97
18

Here's what I'm using. Works with 6 and 8 character color strings, with or without the # symbol. Defaults to black in release and crashes in debug when initialized with an invalid string.

extension UIColor {
    public convenience init(hex: String) {
        var r: CGFloat = 0
        var g: CGFloat = 0
        var b: CGFloat = 0
        var a: CGFloat = 1

        let hexColor = hex.replacingOccurrences(of: "#", with: "")
        let scanner = Scanner(string: hexColor)
        var hexNumber: UInt64 = 0
        var valid = false

        if scanner.scanHexInt64(&hexNumber) {
            if hexColor.count == 8 {
                r = CGFloat((hexNumber & 0xff000000) >> 24) / 255
                g = CGFloat((hexNumber & 0x00ff0000) >> 16) / 255
                b = CGFloat((hexNumber & 0x0000ff00) >> 8) / 255
                a = CGFloat(hexNumber & 0x000000ff) / 255
                valid = true
            }
            else if hexColor.count == 6 {
                r = CGFloat((hexNumber & 0xff0000) >> 16) / 255
                g = CGFloat((hexNumber & 0x00ff00) >> 8) / 255
                b = CGFloat(hexNumber & 0x0000ff) / 255
                valid = true
            }
        }

        #if DEBUG
            assert(valid, "UIColor initialized with invalid hex string")
        #endif

        self.init(red: r, green: g, blue: b, alpha: a)
    }
}

Usage:

UIColor(hex: "#75CC83FF")
UIColor(hex: "75CC83FF")
UIColor(hex: "#75CC83")
UIColor(hex: "75CC83")
Erik
  • 2,138
  • 18
  • 18
17

Warning "'scanHexInt32' was deprecated in iOS 13.0" was fixed.

The sample should work on Swift2.2 and above(Swift2.x, Swift3.x, Swift4.x, Swift5.x):

extension UIColor {

    // hex sample: 0xf43737
    convenience init(_ hex: Int, alpha: Double = 1.0) {
        self.init(red: CGFloat((hex >> 16) & 0xFF) / 255.0, green: CGFloat((hex >> 8) & 0xFF) / 255.0, blue: CGFloat((hex) & 0xFF) / 255.0, alpha: CGFloat(255 * alpha) / 255)
    }

    convenience init(_ hexString: String, alpha: Double = 1.0) {
        let hex = hexString.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int = UInt64()
        Scanner(string: hex).scanHexInt64(&int)

        let r, g, b: UInt64
        switch hex.count {
        case 3: // RGB (12-bit)
            (r, g, b) = ((int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (r, g, b) = (int >> 16, int >> 8 & 0xFF, int & 0xFF)
        default:
            (r, g, b) = (1, 1, 0)
        }

        self.init(red: CGFloat(r) / 255, green: CGFloat(g) / 255, blue: CGFloat(b) / 255, alpha: CGFloat(255 * alpha) / 255)
    }

    convenience init(r: CGFloat, g: CGFloat, b: CGFloat, a: CGFloat = 1) {
        self.init(red: (r / 255), green: (g / 255), blue: (b / 255), alpha: a)
    }
}

Use them like below:

UIColor(0xF54A45)
UIColor(0xF54A45, alpha: 0.7)
UIColor("#f44")
UIColor("#f44", alpha: 0.7)
UIColor("#F54A45")
UIColor("#F54A45", alpha: 0.7)
UIColor("F54A45")
UIColor("F54A45", alpha: 0.7)
UIColor(r: 245.0, g: 73, b: 69)
UIColor(r: 245.0, g: 73, b: 69, a: 0.7)

[screenshot]

Raymond Liao
  • 1,799
  • 3
  • 20
  • 32
16

This answer shows how to do it in Obj-C. The Swift bridge is to use:

let rgbValue = 0xFFEEDD
let r = CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0
let g = CGFloat((rgbValue & 0xFF00) >> 8) / 255.0
let b = CGFloat(rgbValue & 0xFF) / 255.0
self.backgroundColor = UIColor(red: r, green: g, blue: b, alpha: 1.0)
Community
  • 1
  • 1
Grimxn
  • 22,115
  • 10
  • 72
  • 85
14

Swift 5: You can create colors in Xcode as explained in the following two images:

[image 1]

You should name the color, because you reference the color by that name, as shown in image 2:

[image 2]
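
Once the color set is named, you can load it by that name in code (iOS 11+). "myColor" below is a placeholder for whatever name you gave the color set:

// Returns nil if no color with that name exists in the asset catalog
let color = UIColor(named: "myColor")
view.backgroundColor = color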

simibac
  • 7,672
  • 3
  • 36
  • 48
6

Another method

Swift 3.0

Write an extension for UIColor:

// To change a hexadecimal value to the corresponding color
extension UIColor {
    class func uicolorFromHex(_ rgbValue: UInt32, alpha: CGFloat) -> UIColor {
        let red = CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0
        let green = CGFloat((rgbValue & 0xFF00) >> 8) / 255.0
        let blue = CGFloat(rgbValue & 0xFF) / 255.0
        return UIColor(red: red, green: green, blue: blue, alpha: alpha)
    }
}

You can directly create a UIColor from a hex value like this:

let carrot = UIColor.uicolorFromHex(0xe67e22, alpha: 1)
6

Xcode 13.2.1, M1, Swift 5.5

We can use Hex in ColorLiterals

Type #colorLiteral( in Xcode, and it will insert a color literal and open the color picker.

Then click "Other..."

[screenshot]

Then select the RGB Sliders tab, where you can now see the Hex Color # field.

[screenshot]

Marlhex
  • 1,870
  • 19
  • 27
5

Here's a Swift extension on UIColor that takes a hex string:

import UIKit

extension UIColor {

    convenience init(hexString: String) {
        // Trim leading '#' if needed
        var cleanedHexString = hexString
        if hexString.hasPrefix("#") {
//            cleanedHexString = dropFirst(hexString) // Swift 1.2
            cleanedHexString = String(hexString.characters.dropFirst()) // Swift 2
        }

        // String -> UInt32
        var rgbValue: UInt32 = 0
        NSScanner(string: cleanedHexString).scanHexInt(&rgbValue)

        // UInt32 -> R,G,B
        let red = CGFloat((rgbValue >> 16) & 0xff) / 255.0
        let green = CGFloat((rgbValue >> 08) & 0xff) / 255.0
        let blue = CGFloat((rgbValue >> 00) & 0xff) / 255.0

        self.init(red: red, green: green, blue: blue, alpha: 1.0)
    }

}
brycejl
  • 1,411
  • 17
  • 22
jrc
  • 20,354
  • 10
  • 69
  • 64
5

Latest Swift 3 version:

extension UIColor {
    convenience init(hexString: String) {
        let hex = hexString.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int = UInt32()
        Scanner(string: hex).scanHexInt32(&int)
        let a, r, g, b: UInt32
        switch hex.characters.count {
        case 3: // RGB (12-bit)
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            (a, r, g, b) = (255, 0, 0, 0)
        }
        self.init(red: CGFloat(r) / 255, green: CGFloat(g) / 255, blue: CGFloat(b) / 255, alpha: CGFloat(a) / 255)
    }
}

Use it in your class, or wherever you need to convert a hex color into a UIColor, like this:

let color1 = UIColor(hexString: "#FF323232")
5
public static func hexStringToUIColor (hex:String) -> UIColor {
    var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

    if (cString.hasPrefix("#")) {
        cString.remove(at: cString.startIndex)
    }

    if ((cString.characters.count) == 6) {

        var rgbValue:UInt32 = 0
        Scanner(string: cString).scanHexInt32(&rgbValue)

        return UIColor(
            red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
            alpha: CGFloat(1.0)
        )
    }else if ((cString.characters.count) == 8) {

        var rgbValue:UInt32 = 0
        Scanner(string: cString).scanHexInt32(&rgbValue)

        return UIColor(
            red: CGFloat((rgbValue & 0x00FF0000) >> 16) / 255.0,
            green: CGFloat((rgbValue & 0x0000FF00) >> 8) / 255.0,
            blue: CGFloat(rgbValue & 0x000000FF) / 255.0,
            alpha: CGFloat((rgbValue & 0xFF000000) >> 24) / 255.0
        )
    }else{
        return UIColor.gray
    }
}

How to use

var color: UIColor = hexStringToUIColor(hex: "#00ff00"); // Without transparency
var colorWithTransparency: UIColor = hexStringToUIColor(hex: "#dd00ff00"); // With transparency
Atif Mahmood
  • 8,882
  • 2
  • 41
  • 44
5

Simple Color Extension for Swift 5/SwiftUI

Example:

let myColor = Color(hex:0xF2C94C)

Code:

import Foundation
import SwiftUI

extension UIColor {
    convenience init(hex: Int) {
        let components = (
            R: CGFloat((hex >> 16) & 0xff) / 255,
            G: CGFloat((hex >> 08) & 0xff) / 255,
            B: CGFloat((hex >> 00) & 0xff) / 255
        )
        self.init(red: components.R, green: components.G, blue: components.B, alpha: 1)
    }
}

extension Color {
    public init(hex: Int) {
        self.init(UIColor(hex: hex))
   }
}
Zorayr
  • 23,770
  • 8
  • 136
  • 129
4

Hex with validation

Based on Eduardo's answer.

Details

  • Xcode 10.0, Swift 4.2
  • Xcode 10.2.1 (10E1001), Swift 5

Solution

import UIKit

extension UIColor {

    convenience init(r: UInt8, g: UInt8, b: UInt8, alpha: CGFloat = 1.0) {
        let divider: CGFloat = 255.0
        self.init(red: CGFloat(r)/divider, green: CGFloat(g)/divider, blue: CGFloat(b)/divider, alpha: alpha)
    }

    private convenience init(rgbWithoutValidation value: Int32, alpha: CGFloat = 1.0) {
        self.init(
            r: UInt8((value & 0xFF0000) >> 16),
            g: UInt8((value & 0x00FF00) >> 8),
            b: UInt8(value & 0x0000FF),
            alpha: alpha
        )
    }

    convenience init?(rgb: Int32, alpha: CGFloat = 1.0) {
        if rgb > 0xFFFFFF || rgb < 0 { return nil }
        self.init(rgbWithoutValidation: rgb, alpha: alpha)
    }

    convenience init?(hex: String, alpha: CGFloat = 1.0) {
        var charSet = CharacterSet.whitespacesAndNewlines
        charSet.insert("#")
        let _hex = hex.trimmingCharacters(in: charSet)
        guard _hex.range(of: "^[0-9A-Fa-f]{6}$", options: .regularExpression) != nil else { return nil }
        var rgb: UInt32 = 0
        Scanner(string: _hex).scanHexInt32(&rgb)
        self.init(rgbWithoutValidation: Int32(rgb), alpha: alpha)
    }
}

Usage

let alpha: CGFloat = 1.0

// Hex
print(UIColor(rgb: 0x4F9BF5) ?? "nil")
print(UIColor(rgb: 0x4F9BF5, alpha: alpha) ?? "nil")
print(UIColor(rgb: 5217269) ?? "nil")
print(UIColor(rgb: -5217269) ?? "nil")                  // = nil
print(UIColor(rgb: 0xFFFFFF1) ?? "nil")                 // = nil

// String
print(UIColor(hex: "4F9BF5") ?? "nil")
print(UIColor(hex: "4F9BF5", alpha: alpha) ?? "nil")
print(UIColor(hex: "#4F9BF5") ?? "nil")
print(UIColor(hex: "#4F9BF5", alpha: alpha) ?? "nil")
print(UIColor(hex: "#4F9BF56") ?? "nil")                // = nil
print(UIColor(hex: "#blabla") ?? "nil")                 // = nil

// RGB
print(UIColor(r: 79, g: 155, b: 245))
print(UIColor(r: 79, g: 155, b: 245, alpha: alpha))
//print(UIColor(r: 792, g: 155, b: 245, alpha: alpha))  // Compiler will throw an error, r,g,b = [0...255]
Vasily Bodnarchuk
  • 24,482
  • 9
  • 132
  • 127
  • 2
    You don't need `NSPredicate` just to test regular expressions. `string.range(of: pattern, options: .regularExpression)` works too. – Sulthan Oct 07 '18 at 12:57
2

You can use it in Swift 5:

import UIKit

extension UIColor {
    static func hexStringToUIColor (hex:String) -> UIColor {
        var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

        if (cString.hasPrefix("#")) {
            cString.remove(at: cString.startIndex)
        }

        if ((cString.count) != 6) {
            return UIColor.gray
        }

        var rgbValue:UInt32 = 0
        Scanner(string: cString).scanHexInt32(&rgbValue)

        return UIColor(
            red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
            alpha: CGFloat(1.0)
        )
    }
}
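
A minimal usage sketch (the hex value is an arbitrary example):

// e.g. inside a view controller
let gray = UIColor.hexStringToUIColor(hex: "#d3d3d3")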
mohsen
  • 4,698
  • 1
  • 33
  • 54
1

Swift 2.0:

In viewDidLoad()

let colorInt: UInt = 0x000000
let viewColor = UIColorFromRGB(colorInt)
self.view.backgroundColor = viewColor



func UIColorFromRGB(rgbValue: UInt) -> UIColor {
    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}
The King
  • 29
  • 2
1
extension UIColor {
    public convenience init?(hex: String) {
        let r, g, b, a: CGFloat

        if hex.hasPrefix("#") {
            let start = hex.index(hex.startIndex, offsetBy: 1)
            let hexColor = String(hex[start...])

            if hexColor.count == 8 {
                let scanner = Scanner(string: hexColor)
                var hexNumber: UInt64 = 0

                if scanner.scanHexInt64(&hexNumber) {
                    r = CGFloat((hexNumber & 0xff000000) >> 24) / 255
                    g = CGFloat((hexNumber & 0x00ff0000) >> 16) / 255
                    b = CGFloat((hexNumber & 0x0000ff00) >> 8) / 255
                    a = CGFloat(hexNumber & 0x000000ff) / 255

                    self.init(red: r, green: g, blue: b, alpha: a)
                    return
                }
            }
        }

        return nil
    }
}

Usage:

let white = UIColor(hex: "#ffffff")
developer1996
  • 827
  • 2
  • 14
  • 26
1

Swift 5

extension UIColor {

    /// Converting hex string to UIColor
    ///
    /// - Parameter hexString: input hex string
    convenience init(hexString: String) {
        let hex = hexString.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int = UInt64()
        Scanner(string: hex).scanHexInt64(&int)
        let a, r, g, b: UInt64
        switch hex.count {
        case 3: // RGB (12-bit)
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            (a, r, g, b) = (255, 0, 0, 0)
        }
        self.init(red: CGFloat(r) / 255, green: CGFloat(g) / 255, blue: CGFloat(b) / 255, alpha: CGFloat(a) / 255)
    }
}

Call using UIColor(hexString: "your hex string")

Thomas Paul
  • 355
  • 5
  • 13
1

iOS 14, SwiftUI 2.0, Swift 5.1, Xcode 12 beta

extension Color {
    static func hexColour(hexValue: UInt32) -> Color {
        let red = Double((hexValue & 0xFF0000) >> 16) / 255.0
        let green = Double((hexValue & 0xFF00) >> 8) / 255.0
        let blue = Double(hexValue & 0xFF) / 255.0
        return Color(red: red, green: green, blue: blue)
    }
}

You call it with a hex number

let red = Color.hexColour(hexValue: 0xFF0000)
user3069232
  • 8,587
  • 7
  • 46
  • 87
1

Swift 5.0

You cannot use the #ffffff syntax directly in Swift. Here is the code I use for web-related projects. It supports alpha and three-digit codes.

Usage example (uppercase values are fine too):

    let hex = "#FADE2B"  // yellow
    let color = NSColor(fromHex: hex)

Supported string formats:

  • "fff" // RGB
  • "#fff" // #RGB
  • "ffff" // RGBA
  • "#ffff" // #RGBA
  • "ffffff" // RRGGBB
  • "#ffffff" // #RRGGBB
  • "ffffffff" // RRGGBBAA
  • "#ffffffff" // #RRGGBBAA

The digits represent Red, Green, Blue and Alpha (like transparency). For iOS, replace NSColor with UIColor.

Code:


    extension NSColor {
        /// Initialises NSColor from a hexadecimal string. Color is clear if string is invalid.
        /// - Parameter fromHex: supported formats are "#RGB", "#RGBA", "#RRGGBB", "#RRGGBBAA", with or without the # character
        public convenience init(fromHex:String) {
            var r = 0, g = 0, b = 0, a = 255
            let offset = fromHex.hasPrefix("#") ? 1 : 0
            let ch = fromHex.map{$0}
            switch(ch.count - offset) {
            case 8:
                a = 16 * (ch[offset+6].hexDigitValue ?? 0) + (ch[offset+7].hexDigitValue ?? 0)
                fallthrough
            case 6:
                r = 16 * (ch[offset+0].hexDigitValue ?? 0) + (ch[offset+1].hexDigitValue ?? 0)
                g = 16 * (ch[offset+2].hexDigitValue ?? 0) + (ch[offset+3].hexDigitValue ?? 0)
                b = 16 * (ch[offset+4].hexDigitValue ?? 0) + (ch[offset+5].hexDigitValue ?? 0)
                break
            case 4:
                a = 16 * (ch[offset+3].hexDigitValue ?? 0) + (ch[offset+3].hexDigitValue ?? 0)
                fallthrough
            case 3:  // Three digit #0D3 is the same as six digit #00DD33
                r = 16 * (ch[offset+0].hexDigitValue ?? 0) + (ch[offset+0].hexDigitValue ?? 0)
                g = 16 * (ch[offset+1].hexDigitValue ?? 0) + (ch[offset+1].hexDigitValue ?? 0)
                b = 16 * (ch[offset+2].hexDigitValue ?? 0) + (ch[offset+2].hexDigitValue ?? 0)
                break
            default:
                a = 0
                break
            }
            self.init(red: CGFloat(r)/255, green: CGFloat(g)/255, blue: CGFloat(b)/255, alpha: CGFloat(a)/255)
            
        }
    }
    // Author: Andrew Kingdom

License: CC BY

I find this less messy for copy/pasting than the following alternatives:

You can remove the # and store it as a 32-bit unsigned integer literal, denoted by a 0x prefix thus: 0xffffff. You still need code to convert this to a colour though.
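
For example, a minimal sketch of that conversion, mirroring the bit-shift initializers elsewhere in this thread (NSColor as in this answer; use UIColor on iOS):

let rgb = 0xffffff
let color = NSColor(red: CGFloat((rgb >> 16) & 0xFF) / 255,
                    green: CGFloat((rgb >> 8) & 0xFF) / 255,
                    blue: CGFloat(rgb & 0xFF) / 255,
                    alpha: 1)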

If you want a non-programmatic way to get the color: open a colour selector dialog, switch to Colour Sliders > RGB Sliders, and paste/enter the value into the 'Hex Color #' box. (Don't paste the # hash symbol.)

Andrew Kingdom
  • 188
  • 1
  • 9
0

Supporting 7 Hex color types

There are 7 hex color formats: "#FF0000", "0xFF0000", "FF0000", "F00", "red", 0x00FF00, and 16711935

NSColorParser.nsColor("#FF0000",1)//red nsColor
NSColorParser.nsColor("FF0",1)//red nsColor
NSColorParser.nsColor("0xFF0000",1)//red nsColor
NSColorParser.nsColor("#FF0000",1)//red nsColor
NSColorParser.nsColor("FF0000",1)//red nsColor
NSColorParser.nsColor(0xFF0000,1)//red nsColor
NSColorParser.nsColor(16711935,1)//red nsColor

CAUTION: This isn't a "one-file-solution", there are some dependencies, but hunting them down may be faster than researching this from scratch.

https://github.com/eonist/swift-utils/blob/2882002682c4d2a3dc7cb3045c45f66ed59d566d/geom/color/NSColorParser.swift

Permalink:
https://github.com/eonist/Element/wiki/Progress#supporting-7-hex-color-types

Sentry.co
  • 5,355
  • 43
  • 38
0

Swift 2.0

The code below was tested on Xcode 7.2.

import UIKit
extension UIColor{

    public convenience init?(colorCodeInHex: String, alpha: Float = 1.0){

        var filterColorCode:String =  colorCodeInHex.stringByReplacingOccurrencesOfString("#", withString: "")

        if  filterColorCode.characters.count != 6 {
            self.init(red: 0.0, green: 0.0, blue: 0.0, alpha: CGFloat(alpha))
            return
        }

        filterColorCode = filterColorCode.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).uppercaseString

        var range = Range(start: filterColorCode.startIndex.advancedBy(0), end: filterColorCode.startIndex.advancedBy(2))
        let rString = filterColorCode.substringWithRange(range)

        range = Range(start: filterColorCode.startIndex.advancedBy(2), end: filterColorCode.startIndex.advancedBy(4))
        let gString = filterColorCode.substringWithRange(range)


        range = Range(start: filterColorCode.startIndex.advancedBy(4), end: filterColorCode.startIndex.advancedBy(6))
        let bString = filterColorCode.substringWithRange(range)

        var r:CUnsignedInt = 0, g:CUnsignedInt = 0, b:CUnsignedInt = 0;
        NSScanner(string: rString).scanHexInt(&r)
        NSScanner(string: gString).scanHexInt(&g)
        NSScanner(string: bString).scanHexInt(&b)


        self.init(red: CGFloat(r) / 255.0, green: CGFloat(g) / 255.0, blue: CGFloat(b) / 255.0, alpha: CGFloat(alpha))
        return
    }
}
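
A minimal usage sketch (the initializer is failable, so the result is optional):

let color = UIColor(colorCodeInHex: "#E43038")
let faded = UIColor(colorCodeInHex: "#E43038", alpha: 0.5)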
Kev
  • 118,037
  • 53
  • 300
  • 385
Chamath Jeevan
  • 5,072
  • 1
  • 24
  • 27
0

Swift 2.0:

Make an extension of UIColor.

extension UIColor {
    convenience init(hexString:String) {
        let hexString:NSString = hexString.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet())
        let scanner            = NSScanner(string: hexString as String)
        if (hexString.hasPrefix("#")) {
            scanner.scanLocation = 1
        }

        var color:UInt32 = 0
        scanner.scanHexInt(&color)

        let mask = 0x000000FF
        let r = Int(color >> 16) & mask
        let g = Int(color >> 8) & mask
        let b = Int(color) & mask

        let red   = CGFloat(r) / 255.0
        let green = CGFloat(g) / 255.0
        let blue  = CGFloat(b) / 255.0
        self.init(red:red, green:green, blue:blue, alpha:1)
    }

    func toHexString() -> String {
        var r:CGFloat = 0
        var g:CGFloat = 0
        var b:CGFloat = 0
        var a:CGFloat = 0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        let rgb:Int = (Int)(r*255)<<16 | (Int)(g*255)<<8 | (Int)(b*255)<<0
        return NSString(format:"#%06x", rgb) as String
    }

}

Usage:

// Hex to color
let countPartColor = UIColor(hexString: "E43038")

// Color to hex (note: UIColor's designated initializer takes components in 0...1)
let colorHexString = UIColor(red: 228/255, green: 48/255, blue: 56/255, alpha: 1.0).toHexString()
Alvin George
  • 14,148
  • 92
  • 64
  • 1
    Please avoid posting [duplicate answers](http://stackoverflow.com/a/35428135/2227743). If a question is a duplicate, flag it as such instead of answering. Thank you. – Eric Aya Apr 15 '16 at 10:48
0

For Swift 3:

extension String {
    var hexColor: UIColor {        
        let hex = trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int = UInt32()       
        Scanner(string: hex).scanHexInt32(&int)
        let a, r, g, b: UInt32
        switch hex.characters.count {
        case 3: // RGB (12-bit)
            (a, r, g, b) = (255, (int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (a, r, g, b) = (255, int >> 16, int >> 8 & 0xFF, int & 0xFF)
        case 8: // ARGB (32-bit)
            (a, r, g, b) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            return .clear
        }
        return UIColor(red: CGFloat(r) / 255, green: CGFloat(g) / 255, blue: CGFloat(b) / 255, alpha: CGFloat(a) / 255)
    }
}
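
A minimal usage sketch (works with or without the leading '#', since the extension trims non-alphanumerics):

let green = "#00ff00".hexColor          // 6-digit RGB
let shorthand = "0f0".hexColor          // 3-digit RGB
let withAlpha = "80ff0000".hexColor     // 8-digit ARGB (semi-transparent red)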
Museer Ahamad Ansari
  • 5,414
  • 3
  • 39
  • 45
Samira
  • 844
  • 13
  • 17
0

You can use this extension on UIColor, which converts your String (hexadecimal or RGBA) to a UIColor, and vice versa.

extension UIColor {

  //Convert RGBA String to UIColor object
  //"rgbaString" must be separated by space "0.5 0.6 0.7 1.0" 50% of Red 60% of Green 70% of Blue Alpha 100%
  public convenience init?(rgbaString : String){
      self.init(ciColor: CIColor(string: rgbaString))
  }

  //Convert UIColor to RGBA String
  func toRGBAString()-> String {
    var r: CGFloat = 0
    var g: CGFloat = 0
    var b: CGFloat = 0
    var a: CGFloat = 0
    self.getRed(&r, green: &g, blue: &b, alpha: &a)
    return "\(r) \(g) \(b) \(a)"
  }

  //return UIColor from Hexadecimal Color string
  public convenience init?(hexString: String) {  
    let r, g, b, a: CGFloat

    if hexString.hasPrefix("#") {
      let start = hexString.index(hexString.startIndex, offsetBy: 1)
      let hexColor = hexString.substring(from: start)

      if hexColor.characters.count == 8 {
        let scanner = Scanner(string: hexColor)
        var hexNumber: UInt64 = 0

        if scanner.scanHexInt64(&hexNumber) {
          r = CGFloat((hexNumber & 0xff000000) >> 24) / 255
          g = CGFloat((hexNumber & 0x00ff0000) >> 16) / 255
          b = CGFloat((hexNumber & 0x0000ff00) >> 8) / 255
          a = CGFloat(hexNumber & 0x000000ff) / 255
          self.init(red: r, green: g, blue: b, alpha: a)
          return
        }
      }
    }

    return nil
  }

  // Convert UIColor to Hexadecimal String
  func toHexString() -> String {
    var r: CGFloat = 0
    var g: CGFloat = 0
    var b: CGFloat = 0
    var a: CGFloat = 0
    self.getRed(&r, green: &g, blue: &b, alpha: &a)
    return String(
        format: "%02X%02X%02X",
        Int(r * 0xff),
        Int(g * 0xff),
        Int(b * 0xff))
  }
}
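
A minimal usage sketch; note that, as written, the hex initializer requires the leading '#' plus exactly 8 digits (RGBA):

let rgbaColor = UIColor(rgbaString: "0.5 0.6 0.7 1.0")
let hexColor = UIColor(hexString: "#75CC83FF")
let backToHex = hexColor?.toHexString()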
Rags93
  • 1,424
  • 1
  • 14
  • 14
Sudhir kumar
  • 659
  • 5
  • 21
0

UIColor extension, this will greatly help you! (Version: Swift 4.0)

import UIKit
extension UIColor {
/// RGB color
convenience init(r: CGFloat, g: CGFloat, b: CGFloat) {
    self.init(red: r/255.0 ,green: g/255.0 ,blue: b/255.0 ,alpha:1.0)
}

/// Solid color (used for grays)
convenience init(gray: CGFloat) {
    self.init(red: gray/255.0 ,green: gray/255.0 ,blue: gray/255.0 ,alpha:1.0)
}
/// Random color
class func randomCGColor() -> UIColor {
    return UIColor(r: CGFloat(arc4random_uniform(256)), g: CGFloat(arc4random_uniform(256)), b: CGFloat(arc4random_uniform(256)))
}

/// Hex color from an Int
convenience init(hex:Int, alpha:CGFloat = 1.0) {
    self.init(
        red:   CGFloat((hex & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((hex & 0x00FF00) >> 8)  / 255.0,
        blue:  CGFloat((hex & 0x0000FF) >> 0)  / 255.0,
        alpha: alpha
    )
}
/// Hex color from a String
convenience init(hexString: String){
    var red:   CGFloat = 0.0
    var green: CGFloat = 0.0
    var blue:  CGFloat = 0.0
    var alpha: CGFloat = 1.0
    let scanner = Scanner(string: hexString)
    var hexValue: CUnsignedLongLong = 0
    if scanner.scanHexInt64(&hexValue) {
        switch hexString.count {
        case 3:
            red   = CGFloat((hexValue & 0xF00) >> 8)       / 15.0
            green = CGFloat((hexValue & 0x0F0) >> 4)       / 15.0
            blue  = CGFloat(hexValue & 0x00F)              / 15.0
        case 4:
            red   = CGFloat((hexValue & 0xF000) >> 12)     / 15.0
            green = CGFloat((hexValue & 0x0F00) >> 8)      / 15.0
            blue  = CGFloat((hexValue & 0x00F0) >> 4)      / 15.0
            alpha = CGFloat(hexValue & 0x000F)             / 15.0
        case 6:
            red   = CGFloat((hexValue & 0xFF0000) >> 16)   / 255.0
            green = CGFloat((hexValue & 0x00FF00) >> 8)    / 255.0
            blue  = CGFloat(hexValue & 0x0000FF)           / 255.0
        case 8:
            alpha = CGFloat((hexValue & 0xFF000000) >> 24) / 255.0
            red   = CGFloat((hexValue & 0x00FF0000) >> 16) / 255.0
            green = CGFloat((hexValue & 0x0000FF00) >> 8)  / 255.0
            blue  = CGFloat(hexValue & 0x000000FF)         / 255.0
        default:
            log.info("Invalid RGB string, number of characters after '#' should be either 3, 4, 6 or 8")
        }
    } else {
        log.error("Scan hex error")
    }
    self.init(red:red, green:green, blue:blue, alpha:alpha)
}
}
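
A minimal usage sketch; note that the string initializer, as written, expects the digits without a leading '#':

let fromInt = UIColor(hex: 0xF54A45)
let fromString = UIColor(hexString: "F54A45")
let gray = UIColor(gray: 128)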
Tim
  • 1,528
  • 1
  • 11
  • 8
0

RGBA Version Swift 3/4

I like @Luca's answer, as I think it's the most elegant.

However, I don't want my colours specified as ARGB; I'd rather use RGBA. I also needed to handle strings that specify one character per channel, e.g. "#FFFA".

This version also adds error throwing and strips the '#' character if it's included in the string. Here is my modified form for Swift:

public enum ColourParsingError: Error {
    case invalidInput(String)
}
extension UIColor {
    public convenience init(hexString: String) throws
    {
        let hexString = hexString.replacingOccurrences(of: "#", with: "")
        let hex = hexString.trimmingCharacters(in:NSCharacterSet.alphanumerics.inverted)
        var int = UInt32()
        Scanner(string: hex).scanHexInt32(&int)
        let a, r, g, b: UInt32
        switch hex.count 
        {
        case 3: // RGB (12-bit)
            (r, g, b, a) = ((int >> 8) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17, 255)
        // CSS specification in the form of #F0FA
        case 4: // RGBA (16-bit)
            (r, g, b, a) = ((int >> 12) * 17, (int >> 8 & 0xF) * 17, (int >> 4 & 0xF) * 17, (int & 0xF) * 17)
        case 6: // RGB (24-bit)
            (r, g, b, a) = (int >> 16, int >> 8 & 0xFF, int & 0xFF, 255)
        case 8: // RGBA (32-bit)
            (r, g, b, a) = (int >> 24, int >> 16 & 0xFF, int >> 8 & 0xFF, int & 0xFF)
        default:
            throw ColourParsingError.invalidInput("String is not a valid hex colour string: \(hexString)")
        }
        self.init(red: CGFloat(r) / 255, green: CGFloat(g) / 255, blue: CGFloat(b) / 255, alpha: CGFloat(a) / 255)
    }
}
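
A minimal usage sketch (the initializer throws, so call it with try):

let opaqueMagenta = try? UIColor(hexString: "#FF00FF")
let translucentMagenta = try? UIColor(hexString: "#FF00FF88")   // RGBA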
Community
  • 1
  • 1
Chris Birch
  • 2,041
  • 2
  • 19
  • 22
0

Try this:

UIColor(hexString: "#ffffff")


extension UIColor {
  convenience init?(hexString hex: String) {
    var cString: String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

    if cString.hasPrefix("#") {
      cString.remove(at: cString.startIndex)
    }

    guard cString.count == 6 else {
      return nil
    }

    var rgbValue: UInt64 = 0
    Scanner(string: cString).scanHexInt64(&rgbValue)

    self.init(
      red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
      green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
      blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
      alpha: CGFloat(1.0)
    )
  }
}
Neenu
  • 6,848
  • 2
  • 28
  • 54
-1

Just some addition to the first answer

(haven't checked the alpha; you may need to add an if netHex > 0xffffff):

extension UIColor {

    struct COLORS_HEX {
        static let Primary = 0xffffff
        static let PrimaryDark = 0x000000
        static let Accent = 0xe89549
        static let AccentDark = 0xe27b2a
        static let TextWhiteSemiTransparent = 0x80ffffff
    }

    convenience init(red: Int, green: Int, blue: Int, alphaH: Int) {
        assert(red >= 0 && red <= 255, "Invalid red component")
        assert(green >= 0 && green <= 255, "Invalid green component")
        assert(blue >= 0 && blue <= 255, "Invalid blue component")
        assert(alphaH >= 0 && alphaH <= 255, "Invalid alpha component")

        self.init(red: CGFloat(red) / 255.0, green: CGFloat(green) / 255.0, blue: CGFloat(blue) / 255.0, alpha: CGFloat(alphaH) / 255.0)
    }

    convenience init(netHex: Int) {
        self.init(red: (netHex >> 16) & 0xff, green: (netHex >> 8) & 0xff, blue: netHex & 0xff, alphaH: (netHex >> 24) & 0xff)
    }
}
-1

I made a small function and placed it where I can use it globally; it works fine with Swift 2.1:

func getColorFromHex(rgbValue:UInt32)->UIColor{
   let red = CGFloat((rgbValue & 0xFF0000) >> 16)/255.0
   let green = CGFloat((rgbValue & 0xFF00) >> 8)/255.0
   let blue = CGFloat(rgbValue & 0xFF)/255.0

   return UIColor(red:red, green:green, blue:blue, alpha:1.0)
}

usage:

getColorFromHex(0xffffff)
Irshad Qureshi
  • 807
  • 18
  • 41
  • 1
    You should be dividing by `255` not `256`. There is no way to get white using your code. Only "almost white". – Sulthan Mar 26 '17 at 10:15
-1

Swift 2.3: UIColor extension. I think it's simpler.

extension UIColor {
    static func colorFromHex(hexString: String, alpha: CGFloat = 1) -> UIColor {
        //checking if hex has 7 characters or not including '#'
        if hexString.characters.count < 7 {
            return UIColor.whiteColor()
        }
        //string by removing hash
        let hexStringWithoutHash = hexString.substringFromIndex(hexString.startIndex.advancedBy(1))

        //I am extracting three parts of hex color Red (first 2 characters), Green (middle 2 characters), Blue (last two characters)
        let eachColor = [
            hexStringWithoutHash.substringWithRange(hexStringWithoutHash.startIndex...hexStringWithoutHash.startIndex.advancedBy(1)),
            hexStringWithoutHash.substringWithRange(hexStringWithoutHash.startIndex.advancedBy(2)...hexStringWithoutHash.startIndex.advancedBy(3)),
            hexStringWithoutHash.substringWithRange(hexStringWithoutHash.startIndex.advancedBy(4)...hexStringWithoutHash.startIndex.advancedBy(5))]

        let hexForEach = eachColor.map {CGFloat(Int($0, radix: 16) ?? 0)} //radix is base of numeric system you want to convert to, Hexadecimal has base 16

        //return the color by making color
        return UIColor(red: hexForEach[0] / 255, green: hexForEach[1] / 255, blue: hexForEach[2] / 255, alpha: alpha)
    }
}

Usage:

let color = UIColor.colorFromHex("#25ac09")
Bibek
  • 3,689
  • 3
  • 19
  • 28
-1

Swift 4.0

Use this single-line method:

override func viewDidLoad() {
    super.viewDidLoad()

   let color = UIColor(hexColor: "FF00A0")
   self.view.backgroundColor = color

}

You have to create a new class, or use it in any controller where you need a hex color. This extension on UIColor converts a hex string into an RGB color.

extension UIColor {
    convenience init(hexColor: String) {
        let scannHex = Scanner(string: hexColor)
        var rgbValue: UInt64 = 0
        scannHex.scanLocation = 0
        scannHex.scanHexInt64(&rgbValue)
        let r = (rgbValue & 0xff0000) >> 16
        let g = (rgbValue & 0xff00) >> 8
        let b = rgbValue & 0xff
        self.init(
            red: CGFloat(r) / 0xff,
            green: CGFloat(g) / 0xff,
            blue: CGFloat(b) / 0xff, alpha: 1
        )
    }
}
Sandeep Sachan
  • 373
  • 3
  • 15
-2
extension UIColor {

    convenience init(hex: Int, alpha: Double = 1.0) {
        self.init(red: CGFloat((hex >> 16) & 0xFF) / 255.0, green: CGFloat((hex >> 8) & 0xFF) / 255.0, blue: CGFloat(hex & 0xFF) / 255.0, alpha: CGFloat(255 * alpha) / 255)
    }
}

Use this extension like:

let selectedColor = UIColor(hex: 0xFFFFFF)
let selectedColor = UIColor(hex: 0xFFFFFF, alpha: 0.5)
Atulkumar V. Jain
  • 5,102
  • 9
  • 44
  • 61
Himanshu Mahajan
  • 4,779
  • 2
  • 36
  • 29
-2
extension UIColor {

    convenience init(r: CGFloat, g: CGFloat, b: CGFloat, a: CGFloat = 1) {
        self.init(red: r/255, green: g/255, blue: b/255, alpha: a)
    }

    convenience init(hex: Int, alpha: CGFloat = 1) {
        self.init(r: CGFloat((hex >> 16) & 0xff), g: CGFloat((hex >> 08) & 0xff), b: CGFloat((hex >> 00) & 0xff), a: alpha)
    }
}
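
A minimal usage sketch:

let solid = UIColor(hex: 0xFF00A0)
let translucent = UIColor(hex: 0xFF00A0, alpha: 0.5)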
yakovlevvl
  • 524
  • 3
  • 12