
I would like to get the RGB value of a UIColor in Swift:

let swiftColor = UIColor(red: 1, green: 165/255, blue: 0, alpha: 1)
println("RGB Value is:")
println(swiftColor.getRGB()) // <<<<<< How can I do that?

In Java I would do it as follows:

Color cnew = new Color(255, 165, 0);
int iColor = cnew.getRGB();
System.out.println(iColor);

How should I get this value?

mcfly soft

7 Answers


The Java getRGB() returns an integer representing the color in the default sRGB color space (bits 24-31 are alpha, 16-23 are red, 8-15 are green, 0-7 are blue).

UIColor does not have such a method, but you can define your own:

extension UIColor {

    func rgb() -> Int? {
        var fRed : CGFloat = 0
        var fGreen : CGFloat = 0
        var fBlue : CGFloat = 0
        var fAlpha: CGFloat = 0
        if self.getRed(&fRed, green: &fGreen, blue: &fBlue, alpha: &fAlpha) {
            let iRed = Int(fRed * 255.0)
            let iGreen = Int(fGreen * 255.0)
            let iBlue = Int(fBlue * 255.0)
            let iAlpha = Int(fAlpha * 255.0)

            //  (Bits 24-31 are alpha, 16-23 are red, 8-15 are green, 0-7 are blue).
            let rgb = (iAlpha << 24) + (iRed << 16) + (iGreen << 8) + iBlue
            return rgb
        } else {
            // Could not extract RGBA components:
            return nil
        }
    }
}

Usage:

let swiftColor = UIColor(red: 1, green: 165/255, blue: 0, alpha: 1)
if let rgb = swiftColor.rgb() {
    print(rgb)
} else {
    print("conversion failed")
}

Note that this will only work if the UIColor has been defined in an "RGB-compatible" color space (such as RGB, HSB, or grayscale). It may fail if the color has been created from a CIColor or a pattern image; in that case, nil is returned.
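A quick sketch of my own to illustrate the grayscale case:

let gray = UIColor(white: 0.5, alpha: 1)
if let value = gray.rgb() {
    // Grayscale is "RGB-compatible", so the conversion succeeds:
    print(String(value, radix: 16))   // "ff7f7f7f" on a 64-bit platform
}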

Remark: As @vonox7 noticed, the returned value can be negative on 32-bit platforms (which is also the case with the Java getRGB() method). If that is not wanted, replace Int with UInt or Int64.
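A minimal sketch of such an unsigned variant, reusing the rgb() method above (rgbUInt() is just an illustrative name, not an existing API):

extension UIColor {
    // Reinterpret the packed ARGB value as an unsigned 32-bit number,
    // so the result can never be negative:
    func rgbUInt() -> UInt32? {
        guard let rgb = self.rgb() else { return nil }
        return UInt32(bitPattern: Int32(truncatingIfNeeded: rgb))
    }
}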

The reverse conversion is

extension UIColor {
    convenience init(rgb: Int) {
        let iBlue = rgb & 0xFF
        let iGreen =  (rgb >> 8) & 0xFF
        let iRed =  (rgb >> 16) & 0xFF
        let iAlpha =  (rgb >> 24) & 0xFF
        self.init(red: CGFloat(iRed)/255, green: CGFloat(iGreen)/255,
                  blue: CGFloat(iBlue)/255, alpha: CGFloat(iAlpha)/255)
    }
}
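Usage (a sketch of my own, round-tripping the orange from the question; note that a literal with the high alpha byte set does not fit into a 32-bit Int, per the remark above):

// alpha 0xFF, red 0xFF, green 0xA5, blue 0x00 – an opaque orange
let orange = UIColor(rgb: 0xFFFFA500)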
Martin R
  • Thanks a lot. That's what I searched for. Do you perhaps have the reverse as well? (I am very new to Swift) -> getRed(iRGBValue: Int) -> Int? – mcfly soft Feb 21 '15 at 11:51
  • @user1344545: I am not sure if I understand your question correctly. You can extract the components from the rgb value with simple shift+mask operations, for example `let red = (rgb >> 16) & 0xFF`. But if you want the values separately, then it would be easier to just compute iRed, iGreen, iBlue, iAlpha as in the above method and not combine them into a single number. – Martin R Feb 21 '15 at 12:05
  • You understood correctly. `let red = (rgb >> 16) & 0xFF` is what I was looking for. Thank you so much! – mcfly soft Feb 21 '15 at 12:16
  • Note that this code returns wrong values on 32-bit devices! Every `Int` must be replaced with `Int64`. – vonox7 May 21 '17 at 16:15
  • @vonox7: The return value can indeed be negative. (I *assume* that the Java `getRGB` method, which is referred to in the question, behaves the same.) But I have added a remark to clarify that, thanks for the feedback! – Martin R May 21 '17 at 17:12
  • @MartinR I've replaced all Int with Int64 but I'm getting too big values. For example, I've got 4288243251 which converted to hex value is FF996633, and is not a valid hex color code (should be 6 digits). Any ideas? – Centurion Aug 16 '17 at 07:03
  • @Centurion: The most significant 8 bits hold the alpha value, in your case 0xFF = 255, corresponding to alpha = 1.0. That is a normal value for an opaque color. – Martin R Aug 16 '17 at 07:31
  • @MartinR is there a way to do the conversion backwards, from int to UIColor? – Andrey Solera Aug 25 '19 at 19:25
  • @Centurion you might get bigger/invalid values when operating on extended color spaces. See https://github.com/onevcat/Kingfisher/issues/1798 for a more detailed explanation of why the above code has issues. – vonox7 Sep 06 '21 at 16:30

Building on Martin R's answer: the method could also return a named tuple (a Swift 2 feature):

extension UIColor {

    func rgb() -> (red: Int, green: Int, blue: Int, alpha: Int)? {
        var fRed : CGFloat = 0
        var fGreen : CGFloat = 0
        var fBlue : CGFloat = 0
        var fAlpha: CGFloat = 0
        if self.getRed(&fRed, green: &fGreen, blue: &fBlue, alpha: &fAlpha) {
            let iRed = Int(fRed * 255.0)
            let iGreen = Int(fGreen * 255.0)
            let iBlue = Int(fBlue * 255.0)
            let iAlpha = Int(fAlpha * 255.0)

            return (red: iRed, green: iGreen, blue: iBlue, alpha: iAlpha)
        } else {
            // Could not extract RGBA components:
            return nil
        }
    }
}
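Usage (a small sketch of my own, assuming the extension above is in scope; the named tuple can be destructured directly):

let swiftColor = UIColor(red: 1, green: 165/255, blue: 0, alpha: 1)
if let (red, green, blue, alpha) = swiftColor.rgb() {
    print("R \(red) G \(green) B \(blue) A \(alpha)")   // e.g. "R 255 G 165 B 0 A 255"
}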
Matthieu Riegler
  • I like this answer better. It's more informative and more ready-to-use. :) Thanks! – Chen Li Yong Aug 25 '16 at 07:03
  • It seems that the alpha component shouldn't be multiplied by `255.0`, and the last tuple component should be of type `Double`. – Yevhen Dubinin Jan 11 '17 at 17:39
  • Why would you not multiply the alpha by 255? If you're returning `Int`, and you don't multiply by 255, you'll only ever get 0 or 1, both of which are practically invisible. – user1118321 May 21 '17 at 17:18

Swift 3.0, iOS 10:

let colour = UIColor.red
let rgbColour = colour.cgColor
let rgbColours = rgbColour.components
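For example (a sketch of my own; note that components is optional and its count depends on the color's color space, so grayscale colors such as UIColor.white yield only two values):

let colour = UIColor.red
if let components = colour.cgColor.components {
    print(components)   // [1.0, 0.0, 0.0, 1.0]
}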
user3069232

Swift 4. Getting hex code (UInt) from UIColor:

extension UIColor {
    var coreImageColor: CIColor {
        return CIColor(color: self)
    }
    var hex: UInt {
        let red = UInt(coreImageColor.red * 255 + 0.5)
        let green = UInt(coreImageColor.green * 255 + 0.5)
        let blue = UInt(coreImageColor.blue * 255 + 0.5)
        return (red << 16) | (green << 8) | blue
    }
}
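Usage (a quick sketch of my own):

let swiftColor = UIColor(red: 1, green: 165/255, blue: 0, alpha: 1)
print(String(swiftColor.hex, radix: 16, uppercase: true))   // "FFA500"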
birdy

I wrote an extension you can use. I chose to return the values as CGFloats rather than Ints, because CGFloat is what the init method of UIColor takes.

extension UIColor {
    var colorComponents: (red: CGFloat, green: CGFloat, blue: CGFloat, alpha: CGFloat)? {
        guard let components = self.cgColor.components else { return nil }

        return (
            red: components[0],
            green: components[1],
            blue: components[2],
            alpha: components[3]
        )
    }
}

Note: Swift 3.1/iOS 10 compatible; may not work on iOS 9, as UIColor.cgColor.components may not be available.
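A usage sketch of my own, assuming an RGB-based color (as the comment below points out, two-component grayscale colors such as UIColor.white would make the subscripts above trap):

let swiftColor = UIColor(red: 1, green: 165/255, blue: 0, alpha: 1)
if let c = swiftColor.colorComponents {
    print(c.red, c.green, c.blue, c.alpha)   // approximately: 1.0 0.647 0.0 1.0
}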

Zack Shapiro
  • Crashes if the color is UIColor.white, which I assume is built using the two-parameter init UIColor(white: CGFloat, alpha: CGFloat). It shows up as two components instead of four, and an index-out-of-range exception occurs. – DatForis Apr 19 '17 at 11:16

In Swift you can use a Color Literal to get an RGB color, right in the source editor.

[The original answer included two screenshots showing a Color Literal being inserted in Xcode.]

I hope this will be useful to someone.
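A minimal sketch of my own showing the textual form behind the swatch (the fraction 165/255 is written out as a decimal, since I believe the literal's arguments must be plain numeric literals):

let orange = #colorLiteral(red: 1, green: 0.6470588235, blue: 0, alpha: 1)
// A color literal is just a UIColor at runtime, so the rgb()/hex helpers
// from the other answers work on it as well.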

Booharin

Looks like it's gotten simpler in recent versions; the values are 0-1, so they need to be multiplied by 255 for a standard RGBA result (except alpha). Note that lottieColorValue comes from the Lottie library, not UIKit:

let red = yourColor.lottieColorValue.r * 255
let blue = yourColor.lottieColorValue.b * 255
let green = yourColor.lottieColorValue.g * 255
let alpha = yourColor.lottieColorValue.a
edencorbin