How can I create a UIColor from a hexadecimal string format, such as #00FF00?

- Erica also has a [great color extension category for iOS and OSX](https://github.com/erica/uicolor-utilities). – Echilon Jun 09 '13 at 13:55
- Here is another library: https://github.com/burhanuddin353/TFTColor – Burhanuddin Sunelwala Dec 25 '16 at 20:20
50 Answers
I've found the simplest way to do this is with a macro. Just include it in your header and it's available throughout your project.
#define UIColorFromRGB(rgbValue) [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 green:((float)((rgbValue & 0xFF00) >> 8))/255.0 blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]
Also formatted version of this code:
#define UIColorFromRGB(rgbValue) \
[UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
green:((float)((rgbValue & 0x00FF00) >> 8))/255.0 \
blue:((float)((rgbValue & 0x0000FF) >> 0))/255.0 \
alpha:1.0]
Usage:
label.textColor = UIColorFromRGB(0xBC1128);
Swift:
static func UIColorFromRGB(_ rgbValue: Int) -> UIColor! {
return UIColor(
red: CGFloat((Float((rgbValue & 0xff0000) >> 16)) / 255.0),
green: CGFloat((Float((rgbValue & 0x00ff00) >> 8)) / 255.0),
blue: CGFloat((Float((rgbValue & 0x0000ff) >> 0)) / 255.0),
alpha: 1.0)
}

- This is great except it doesn't do what the questioner asks, which is to convert a hex STRING into a UIColor. This converts an integer to a UIColor. – darrinm Sep 12 '12 at 22:44
- @MohamedA.Karim That is an example of returning a UIColor from a hex format integer (0x...) not a hex format string ("#..."). Great if that's what you want, but not what the questioner asked for. – darrinm Nov 06 '12 at 02:28
- @darrinm It's really not hard to change a HEX format string into a HEX format integer (You can do it with one short line of code using stringByReplacingOccurrencesOfString:@"#" withString:@"0x"). – Scott Kohlert Feb 21 '13 at 22:00
- @ScottKohlert Your line of code converts one hex format string (prefixed with "#") into another hex format string (prefixed with "0x"). It does not produce an integer. – darrinm Feb 21 '13 at 22:34
- @darrinm the OP was looking for a string format starting with a #, I just added the line of code so it would convert the # into the format required by the #define statement (0x...). See the OP's question: How can I create a UIColor from a hexadecimal STRING format, such as #00FF00 – Scott Kohlert Feb 21 '13 at 23:26
- @ScottKohlert Perhaps some day the OP will return to bless one of these answers and we'll know for sure :). I assume Rupesh is dealing with the string at runtime, in which case what matters is converting the "#00FF00" string into parameters accepted by one of UIColor's constructors. Several answers here show ways to do this, but not the one we're commenting on. – darrinm Feb 22 '13 at 00:54
- To convert a hex format string to an integer for use with this macro, see http://stackoverflow.com/questions/3648411/objective-c-parse-hex-string-to-integer. – devios1 Aug 14 '14 at 21:18
- @ratsimihah There is an advantage to using as a macro but you won't find it in this particular solution, however had the macro used the `##` operator then your input could be just `FF0000` instead of `0xFF0000` which would be nice for lots of coders copy pasting hex colors from, say, photoshop. To accomplish this, just replace occurrences of `rgbValue` with `0x ## rgbValue` in the body of macro. – Albert Renshaw Nov 02 '18 at 00:16
- If you have the hex in `NSString` format as the OP explicitly requests and want to use this nice function posted here, you can convert it like this: `unsigned long int hexValue = strtoul(hexString.UTF8String, NULL, 16);`. Full details here: https://stackoverflow.com/a/75163698/427959 – DTs Jan 18 '23 at 18:28
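Pulling those comments together, here is a minimal Swift sketch (my own illustration, not part of the answer) of turning a "#RRGGBB" string into the integer that the macro and the Swift helper above expect; it assumes the helper is reachable as a plain function:
// Minimal sketch (assumption: the string may carry a leading '#')
let hexString = "#00FF00"
let cleaned = hexString.hasPrefix("#") ? String(hexString.dropFirst()) : hexString
let rgbValue = Int(cleaned, radix: 16) ?? 0   // 0x00FF00
label.textColor = UIColorFromRGB(rgbValue)    // same usage as above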
A concise solution:
// Assumes input like "#00FF00" (#RRGGBB).
+ (UIColor *)colorFromHexString:(NSString *)hexString {
unsigned rgbValue = 0;
NSScanner *scanner = [NSScanner scannerWithString:hexString];
[scanner setScanLocation:1]; // bypass '#' character
[scanner scanHexInt:&rgbValue];
return [UIColor colorWithRed:((rgbValue & 0xFF0000) >> 16)/255.0 green:((rgbValue & 0xFF00) >> 8)/255.0 blue:(rgbValue & 0xFF)/255.0 alpha:1.0];
}

- And a good method for doing the reverse conversion (like if you're storing colors in core data / a remote database) can be found here - http://stackoverflow.com/questions/11884227/how-do-i-convert-a-uicolor-to-a-hexadecimal-string – Eric G Mar 01 '14 at 01:04
- A perfect solution. If your hex string comes from a (very poorly documented) API, be sure to test against shorthand hex codes like #FFF or #FC0. You'll need to change them to #FFFFFF/#FFCC00. – Patrick Feb 03 '15 at 22:53
- You might also want to add `if ( [hexString rangeOfString:@"#"].location == 0 )` before the `setScanLocation` line to make the `#` optional (see the Swift sketch after these comments). – devios1 May 31 '15 at 22:09
- For the lazy: **SWIFT** version [here](https://gist.github.com/anonymous/fd07ecf47591c9f9ed1a). – fabian789 Sep 13 '15 at 14:05
- Heh `[scanner setScanLocation:[hexString rangeOfString:@"#"].location+1];` – k06a Sep 25 '15 at 10:37
- @Darrinm if i pass the value "364050" string to colorFromHexString its gives different color, do I need to include #364050 ? – kiran Apr 06 '17 at 05:39
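Following the comment about making the '#' optional, here is a small Swift sketch (my own, not from the answer) of the same scanner approach; it accepts both "364050" and "#364050", which also addresses the last comment above:
// Sketch only: scanner-based conversion with an optional leading '#'
func colorFromHexString(_ hexString: String) -> UIColor {
    let hex = hexString.hasPrefix("#") ? String(hexString.dropFirst()) : hexString
    var rgbValue: UInt64 = 0
    Scanner(string: hex).scanHexInt64(&rgbValue)
    return UIColor(red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
                   green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
                   blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
                   alpha: 1.0)
}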
I've got a solution that is 100% compatible with the hex format strings used by Android, which I found very helpful when doing cross-platform mobile development. It lets me use one color palate for both platforms. Feel free to reuse without attribution, or under the Apache license if you prefer.
#import "UIColor+HexString.h"
@interface UIColor(HexString)
+ (UIColor *) colorWithHexString: (NSString *) hexString;
+ (CGFloat) colorComponentFrom: (NSString *) string start: (NSUInteger) start length: (NSUInteger) length;
@end
@implementation UIColor(HexString)
+ (UIColor *) colorWithHexString: (NSString *) hexString {
NSString *colorString = [[hexString stringByReplacingOccurrencesOfString: @"#" withString: @""] uppercaseString];
CGFloat alpha, red, blue, green;
switch ([colorString length]) {
case 3: // #RGB
alpha = 1.0f;
red = [self colorComponentFrom: colorString start: 0 length: 1];
green = [self colorComponentFrom: colorString start: 1 length: 1];
blue = [self colorComponentFrom: colorString start: 2 length: 1];
break;
case 4: // #ARGB
alpha = [self colorComponentFrom: colorString start: 0 length: 1];
red = [self colorComponentFrom: colorString start: 1 length: 1];
green = [self colorComponentFrom: colorString start: 2 length: 1];
blue = [self colorComponentFrom: colorString start: 3 length: 1];
break;
case 6: // #RRGGBB
alpha = 1.0f;
red = [self colorComponentFrom: colorString start: 0 length: 2];
green = [self colorComponentFrom: colorString start: 2 length: 2];
blue = [self colorComponentFrom: colorString start: 4 length: 2];
break;
case 8: // #AARRGGBB
alpha = [self colorComponentFrom: colorString start: 0 length: 2];
red = [self colorComponentFrom: colorString start: 2 length: 2];
green = [self colorComponentFrom: colorString start: 4 length: 2];
blue = [self colorComponentFrom: colorString start: 6 length: 2];
break;
default:
[NSException raise:@"Invalid color value" format: @"Color value %@ is invalid. It should be a hex value of the form #RGB, #ARGB, #RRGGBB, or #AARRGGBB", hexString];
break;
}
return [UIColor colorWithRed: red green: green blue: blue alpha: alpha];
}
+ (CGFloat) colorComponentFrom: (NSString *) string start: (NSUInteger) start length: (NSUInteger) length {
NSString *substring = [string substringWithRange: NSMakeRange(start, length)];
NSString *fullHex = length == 2 ? substring : [NSString stringWithFormat: @"%@%@", substring, substring];
unsigned hexComponent;
[[NSScanner scannerWithString: fullHex] scanHexInt: &hexComponent];
return hexComponent / 255.0;
}
@end
Swift:
extension UIColor {
convenience init?(hexString: String?) {
let input: String! = (hexString ?? "")
.replacingOccurrences(of: "#", with: "")
.uppercased()
var alpha: CGFloat = 1.0
var red: CGFloat = 0
var blue: CGFloat = 0
var green: CGFloat = 0
switch (input.count) {
case 3 /* #RGB */:
red = Self.colorComponent(from: input, start: 0, length: 1)
green = Self.colorComponent(from: input, start: 1, length: 1)
blue = Self.colorComponent(from: input, start: 2, length: 1)
break
case 4 /* #ARGB */:
alpha = Self.colorComponent(from: input, start: 0, length: 1)
red = Self.colorComponent(from: input, start: 1, length: 1)
green = Self.colorComponent(from: input, start: 2, length: 1)
blue = Self.colorComponent(from: input, start: 3, length: 1)
break
case 6 /* #RRGGBB */:
red = Self.colorComponent(from: input, start: 0, length: 2)
green = Self.colorComponent(from: input, start: 2, length: 2)
blue = Self.colorComponent(from: input, start: 4, length: 2)
break
case 8 /* #AARRGGBB */:
alpha = Self.colorComponent(from: input, start: 0, length: 2)
red = Self.colorComponent(from: input, start: 2, length: 2)
green = Self.colorComponent(from: input, start: 4, length: 2)
blue = Self.colorComponent(from: input, start: 6, length: 2)
break
default:
NSException.raise(NSExceptionName("Invalid color value"), format: "Color value \"%@\" is invalid. It should be a hex value of the form #RGB, #ARGB, #RRGGBB, or #AARRGGBB", arguments:getVaList([hexString ?? ""]))
}
self.init(red: red, green: green, blue: blue, alpha: alpha)
}
static func colorComponent(from string: String!, start: Int, length: Int) -> CGFloat {
let substring = (string as NSString)
.substring(with: NSRange(location: start, length: length))
let fullHex = length == 2 ? substring : "\(substring)\(substring)"
var hexComponent: UInt64 = 0
Scanner(string: fullHex)
.scanHexInt64(&hexComponent)
return CGFloat(Double(hexComponent) / 255.0)
}
}

- in `colorComponentFrom:start:length:`, shouldn't you have `return hexComponent / 0xFF; // divide by 255, not 256` ? The largest hex value you should get back is 0xFF, thus that is what you should be dividing by 0xFF (255). – Sam Sep 13 '11 at 16:25
- This is great, cheers. Also, instead of a category on UIColor you could make one on NSString to be able to have syntax like `[@"#538aa4" toColor]` (see the sketch after these comments). – Dan2552 Oct 12 '12 at 11:54
- This solution is great, I would suggest to add "Private" for the name of the private interface to avoid a compiler warning. @interface UIColor(Private) – djleop Dec 14 '12 at 09:56
- Nice solution. However imo it should also be able to handle hexString=nil and give black – Roland Keesom Sep 10 '14 at 08:58
- Nice. You should put the *other* function in the interface, though. – Bjorn Roche Apr 22 '15 at 20:40
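A quick Swift sketch of the NSString-category idea from that comment (the toColor name is hypothetical; it simply forwards to the UIColor(hexString:) initializer from this answer):
extension String {
    // Hypothetical convenience so a color can be written as "#538aa4".toColor()
    func toColor() -> UIColor? {
        return UIColor(hexString: self)
    }
}
let tint = "#538aa4".toColor()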
There's a nice post on how to tackle the OP's question of extracting a UIColor from a hex string. The solution presented below is different from others because it supports string values that may include '0x' or '#' prefixed to the hex string representation (see usage).
Here's the main bit...
- (UIColor *)getUIColorObjectFromHexString:(NSString *)hexStr alpha:(CGFloat)alpha
{
// Convert hex string to an integer
unsigned int hexint = [self intFromHexString:hexStr];
// Create a color object, specifying alpha as well
UIColor *color =
[UIColor colorWithRed:((CGFloat) ((hexint & 0xFF0000) >> 16))/255
green:((CGFloat) ((hexint & 0xFF00) >> 8))/255
blue:((CGFloat) (hexint & 0xFF))/255
alpha:alpha];
return color;
}
Helper method...
- (unsigned int)intFromHexString:(NSString *)hexStr
{
unsigned int hexInt = 0;
// Create scanner
NSScanner *scanner = [NSScanner scannerWithString:hexStr];
// Tell scanner to skip the # character
[scanner setCharactersToBeSkipped:[NSCharacterSet characterSetWithCharactersInString:@"#"]];
// Scan hex value
[scanner scanHexInt:&hexInt];
return hexInt;
}
Usage:
NSString *hexStr1 = @"123ABC";
NSString *hexStr2 = @"#123ABC";
NSString *hexStr3 = @"0x123ABC";
UIColor *color1 = [self getUIColorObjectFromHexString:hexStr1 alpha:.9];
NSLog(@"UIColor: %@", color1);
UIColor *color2 = [self getUIColorObjectFromHexString:hexStr2 alpha:.9];
NSLog(@"UIColor: %@", color2);
UIColor *color3 = [self getUIColorObjectFromHexString:hexStr3 alpha:.9];
NSLog(@"UIColor: %@", color3);
Swift 2+
I've ported this solution to Swift 2.2. Note that I've changed the alpha parameter to use a default of 1.0. I've also updated the int type to UInt32 as required by the NSScanner class in Swift 2.2.
func colorWithHexString(hexString: String, alpha:CGFloat = 1.0) -> UIColor {
// Convert hex string to an integer
let hexint = Int(self.intFromHexString(hexString))
let red = CGFloat((hexint & 0xff0000) >> 16) / 255.0
let green = CGFloat((hexint & 0xff00) >> 8) / 255.0
let blue = CGFloat((hexint & 0xff) >> 0) / 255.0
// Create color object, specifying alpha as well
let color = UIColor(red: red, green: green, blue: blue, alpha: alpha)
return color
}
func intFromHexString(hexStr: String) -> UInt32 {
var hexInt: UInt32 = 0
// Create scanner
let scanner: NSScanner = NSScanner(string: hexStr)
// Tell scanner to skip the # character
scanner.charactersToBeSkipped = NSCharacterSet(charactersInString: "#")
// Scan hex value
scanner.scanHexInt(&hexInt)
return hexInt
}
Swift 4+
Using the same logic, with changes applied for Swift 4:
func colorWithHexString(hexString: String, alpha:CGFloat = 1.0) -> UIColor {
// Convert hex string to an integer
let hexint = Int(self.intFromHexString(hexStr: hexString))
let red = CGFloat((hexint & 0xff0000) >> 16) / 255.0
let green = CGFloat((hexint & 0xff00) >> 8) / 255.0
let blue = CGFloat((hexint & 0xff) >> 0) / 255.0
// Create color object, specifying alpha as well
let color = UIColor(red: red, green: green, blue: blue, alpha: alpha)
return color
}
func intFromHexString(hexStr: String) -> UInt32 {
var hexInt: UInt32 = 0
// Create scanner
let scanner: Scanner = Scanner(string: hexStr)
// Tell scanner to skip the # character
scanner.charactersToBeSkipped = CharacterSet(charactersIn: "#")
// Scan hex value
scanner.scanHexInt32(&hexInt)
return hexInt
}
Swift 5 (iOS 13)+
The following shows an update that works given the SDK deprecation of scanHexInt32. I've wrapped the code into a Swift playground file.
//: A UIKit based Playground for presenting user interface
import UIKit
import PlaygroundSupport
class MyViewController : UIViewController {
override func loadView() {
let view = UIView()
view.backgroundColor = .white
let label = UILabel()
label.frame = CGRect(x: 150, y: 200, width: 200, height: 20)
label.text = "Hello World!"
label.textColor = colorWithHexString(hexString: "22F728")
view.addSubview(label)
self.view = view
}
func colorWithHexString(hexString: String, alpha:CGFloat = 1.0) -> UIColor {
// Convert hex string to an integer
let hexint = Int(self.intFromHexString(hexStr: hexString))
let red = CGFloat((hexint & 0xff0000) >> 16) / 255.0
let green = CGFloat((hexint & 0xff00) >> 8) / 255.0
let blue = CGFloat((hexint & 0xff) >> 0) / 255.0
// Create color object, specifying alpha as well
let color = UIColor(red: red, green: green, blue: blue, alpha: alpha)
return color
}
func intFromHexString(hexStr: String) -> UInt32 {
var hexInt: UInt32 = 0
// Create scanner
let scanner: Scanner = Scanner(string: hexStr)
// Tell scanner to skip the # character
scanner.charactersToBeSkipped = CharacterSet(charactersIn: "#")
// Scan hex value
hexInt = UInt32(bitPattern: scanner.scanInt32(representation: .hexadecimal) ?? 0)
return hexInt
}
}
// Present the view controller in the Live View window
PlaygroundPage.current.liveView = MyViewController()
Color Hex References

- The Swift snippets posted here seem to misunderstand the purpose of optionals in Swift, which is to contain values which may never exist. The question to ask in deciding whether a parameter needs to be an optional is whether someone may need the ability to set it to nil. Does it possibly make sense for `alpha` to *ever* be set to nil? Because this method gives people that ability, and if someone should decide to set `alpha` to nil, the forced unwrapping of that optional will invariably lead to a crash. I haven't edited it out, though, in case there's some justification of which I'm not aware. – Jonathan Thornton Feb 10 '19 at 09:09
- Working on a legacy project and the Objective-C solution works very well... except... oddly... @"#ffc107" and @"#e040fb" refuse to cooperate! Thoughts? – Nathaniel Oct 29 '19 at 03:52
- @Nathaniel - Tried both of those color values with the latest code and they seem to work as planned. – Tommie C. Nov 25 '21 at 22:40
This is a function that takes a hex string and returns a UIColor. (You can enter hex strings with either format: #ffffff or ffffff.)
Usage:
var color1 = hexStringToUIColor("#d3d3d3")
Swift 4:
func hexStringToUIColor (hex:String) -> UIColor {
var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
if (cString.hasPrefix("#")) {
cString.remove(at: cString.startIndex)
}
if ((cString.count) != 6) {
return UIColor.gray
}
var rgbValue:UInt32 = 0
Scanner(string: cString).scanHexInt32(&rgbValue)
return UIColor(
red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
alpha: CGFloat(1.0)
)
}
Swift 3:
func hexStringToUIColor (hex:String) -> UIColor {
var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
if (cString.hasPrefix("#")) {
cString.remove(at: cString.startIndex)
}
if ((cString.characters.count) != 6) {
return UIColor.gray
}
var rgbValue:UInt32 = 0
Scanner(string: cString).scanHexInt32(&rgbValue)
return UIColor(
red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
alpha: CGFloat(1.0)
)
}
Swift 2:
func hexStringToUIColor (hex:String) -> UIColor {
var cString:String = hex.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet() as NSCharacterSet).uppercaseString
if (cString.hasPrefix("#")) {
cString = cString.substringFromIndex(cString.startIndex.advancedBy(1))
}
if ((cString.characters.count) != 6) {
return UIColor.grayColor()
}
var rgbValue:UInt32 = 0
NSScanner(string: cString).scanHexInt(&rgbValue)
return UIColor(
red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
alpha: CGFloat(1.0)
)
}
Source: arshad/gist:de147c42d7b3063ef7bc

- I've seen in SVG, where there is a small version of the hex string with 3 characters, like #F0F. – Glenn Howes May 18 '16 at 12:21
- That is shorthand notation, where '#F0F' is equivalent to '#FF00FF'. It would be simple to write a function that checked for shorthand and expanded it (see the sketch below). – Ethan Strider May 18 '16 at 13:09
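For reference, a minimal sketch of such an expansion function (my own addition; the name is made up): it doubles each digit of a 3-character shorthand so the result can be fed to hexStringToUIColor above.
func expandShorthandHex(_ hex: String) -> String {
    // "F0F" or "#F0F" becomes "FF00FF" / "#FF00FF"; anything else is returned untouched
    let cleaned = hex.hasPrefix("#") ? String(hex.dropFirst()) : hex
    guard cleaned.count == 3 else { return hex }
    let expanded = cleaned.map { "\($0)\($0)" }.joined()
    return hex.hasPrefix("#") ? "#" + expanded : expanded
}
let magenta = hexStringToUIColor(hex: expandShorthandHex("#F0F")) // same as "#FF00FF"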
Use this category:
in the file UIColor+Hexadecimal.h
#import <UIKit/UIKit.h>
@interface UIColor(Hexadecimal)
+ (UIColor *)colorWithHexString:(NSString *)hexString;
@end
in the file UIColor+Hexadecimal.m
#import "UIColor+Hexadecimal.h"
@implementation UIColor(Hexadecimal)
+ (UIColor *)colorWithHexString:(NSString *)hexString {
unsigned rgbValue = 0;
NSScanner *scanner = [NSScanner scannerWithString:hexString];
[scanner setScanLocation:1]; // bypass '#' character
[scanner scanHexInt:&rgbValue];
return [UIColor colorWithRed:((rgbValue & 0xFF0000) >> 16)/255.0 green:((rgbValue & 0xFF00) >> 8)/255.0 blue:(rgbValue & 0xFF)/255.0 alpha:1.0];
}
@end
In the class where you want to use it:
#import "UIColor+Hexadecimal.h"
and:
[UIColor colorWithHexString:@"#6e4b4b"];

You can make an extension like this:
extension UIColor {
convenience init(hex: UInt, alpha: CGFloat = 1) {
self.init(
red: CGFloat((hex & 0xFF0000) >> 16) / 255.0,
green: CGFloat((hex & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(hex & 0x0000FF) / 255.0,
alpha: alpha
)
}
}
And use it anywhere like this:
let color1 = UIColor(hex: 0xffffff)
let color2 = UIColor(hex: 0xffffff, alpha: 0.2)

- @mah This is exactly the question i think, how to create a UIColor from hexstring – Manu Gupta Feb 27 '19 at 10:19
- @ManuGupta you might notice a complete lack of any string handling in this answer. A hex number is not a hex string. The question explicitly states a string and though it doesn't have quotes in it, `#00FF00` is clearly intended to be a character string. As others have said it's simple, concise. But if it doesn't deal with strings it can't possibly answer a question asking how to deal with strings. – mah Feb 28 '19 at 15:42
A great Swift implementation (updated for Xcode 7) using extensions, pulled together from a variety of different answers and places. You will also need the string extensions at the end.
Use:
let hexColor = UIColor(hex: "#00FF00")
NOTE: I added an option for 2 additional digits to the end of the standard 6 digit hex value for an alpha channel (pass in value of 00-99). If this offends you, just remove it. You could implement it to pass in an optional alpha parameter.
Extension:
extension UIColor {
convenience init(var hex: String) {
var alpha: Float = 100
let hexLength = hex.characters.count
if !(hexLength == 7 || hexLength == 9) {
// A hex must be either 7 or 9 characters (#RRGGBBAA)
print("improper call to 'colorFromHex', hex length must be 7 or 9 chars (#RRGGBBAA)")
self.init(white: 0, alpha: 1)
return
}
if hexLength == 9 {
// Note: this uses String subscripts as given below
alpha = hex[7...8].floatValue
hex = hex[0...6]
}
// Establishing the rgb color
var rgb: UInt32 = 0
let s: NSScanner = NSScanner(string: hex)
// Setting the scan location to ignore the leading `#`
s.scanLocation = 1
// Scanning the int into the rgb colors
s.scanHexInt(&rgb)
// Creating the UIColor from hex int
self.init(
red: CGFloat((rgb & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgb & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgb & 0x0000FF) / 255.0,
alpha: CGFloat(alpha / 100)
)
}
}
String extensions:
Float source
Subscript source
extension String {
/**
Returns the float value of a string
*/
var floatValue: Float {
return (self as NSString).floatValue
}
/**
Subscript to allow for quick String substrings ["Hello"][0...1] = "He"
*/
subscript (r: Range<Int>) -> String {
get {
let start = self.startIndex.advancedBy(r.startIndex)
let end = self.startIndex.advancedBy(r.endIndex - 1)
return self.substringWithRange(start..<end)
}
}
}
- Are you also using a String extension to get its subscript? eg http://stackoverflow.com/a/24144365/3220708 – Craig Grummitt Mar 24 '15 at 20:29
- @CraigGrummitt oh my! Haha, yes. I have a decent compiled list of extensions and subscripts I sometimes (sadly) forget what is and is not included in the standard language feature set. I updated my answer including the source you gave. Not sure if I even got it from there but it sure looks close. – Firo Mar 24 '15 at 20:49
- You might want to mention that it's a String extension. Also you seem to have missed String floatValue extension: http://stackoverflow.com/a/24088249/3220708 Other than that, good work! – Craig Grummitt Mar 24 '15 at 20:57
- `countElements()` was replaced with `count()` in Swift 1.2, it is built into the language. I updated my answer to reflect that. – Firo Apr 25 '15 at 22:08
- You know what? There is a bug in your code! In the first commented line and in the next one, you should replace `#GGRRBBAA` by `#RRGGBBAA`, should not you? ;-) – duthen Mar 25 '16 at 23:17
- You are correct. I have fixed the comment. Thanks for pointing it out (feel free to edit things like that). – Firo Mar 26 '16 at 20:30
There is no built-in conversion from a hexadecimal string to a UIColor (or CGColor) that I'm aware of. However, you can easily write a couple of functions for this purpose - for example, see iphone development accessing uicolor components.

- +1 If you scroll way down, the method in question is +colorWithHexString:. – Rob Napier Oct 13 '09 at 13:29
- @RobNapier `+colorWithHexString:` doesn't work. At least in my case. :) – Pawan Sharma Dec 06 '12 at 09:53
extension UIColor {
convenience init(hexaString: String, alpha: CGFloat = 1) {
let chars = Array(hexaString.dropFirst())
self.init(red: .init(strtoul(String(chars[0...1]),nil,16))/255,
green: .init(strtoul(String(chars[2...3]),nil,16))/255,
blue: .init(strtoul(String(chars[4...5]),nil,16))/255,
alpha: alpha)}
}
Usage:
let redColor = UIColor(hexaString: "#FF0000") // r 1,0 g 0,0 b 0,0 a 1,0
let transparentRed = UIColor(hexaString: "#FF0000", alpha: 0.5) // r 1,0 g 0,0 b 0,0 a 0,5
Another option is to convert the hex value to an unsigned integer and extract the corresponding values from it:
extension UIColor {
convenience init(hexaString: String, alpha: CGFloat = 1) {
self.init(hexa: UInt(hexaString.dropFirst(), radix: 16) ?? 0, alpha: alpha)
}
convenience init(hexa: UInt, alpha: CGFloat = 1) {
self.init(red: .init((hexa & 0xff0000) >> 16) / 255,
green: .init((hexa & 0xff00 ) >> 8) / 255,
blue: .init( hexa & 0xff ) / 255,
alpha: alpha)
}
}
let purpleColor = UIColor(hexaString: "#FF00FF") // r 1,0 g 0,0 b 1,0 a 1,0
let transparentYellow = UIColor(hexaString: "#FFFF00", alpha: 0.5) // r 1,0 g 1,0 b 0,0 a 0,5

I found a good UIColor category for this, UIColor+PXExtensions.
Usage: UIColor *mycolor = [UIColor pxColorWithHexValue:@"#BADA55"];
And, just in case the link to my gist fails, here is the actual implementation code:
//
// UIColor+PXExtensions.m
//
#import "UIColor+UIColor_PXExtensions.h"
@implementation UIColor (UIColor_PXExtensions)
+ (UIColor*)pxColorWithHexValue:(NSString*)hexValue
{
//Default
UIColor *defaultResult = [UIColor blackColor];
//Strip prefixed # hash
if ([hexValue hasPrefix:@"#"] && [hexValue length] > 1) {
hexValue = [hexValue substringFromIndex:1];
}
//Determine if 3 or 6 digits
NSUInteger componentLength = 0;
if ([hexValue length] == 3)
{
componentLength = 1;
}
else if ([hexValue length] == 6)
{
componentLength = 2;
}
else
{
return defaultResult;
}
BOOL isValid = YES;
CGFloat components[3];
//Seperate the R,G,B values
for (NSUInteger i = 0; i < 3; i++) {
NSString *component = [hexValue substringWithRange:NSMakeRange(componentLength * i, componentLength)];
if (componentLength == 1) {
component = [component stringByAppendingString:component];
}
NSScanner *scanner = [NSScanner scannerWithString:component];
unsigned int value;
isValid &= [scanner scanHexInt:&value];
components[i] = (CGFloat)value / 255.0f; // 255, not 256, so that 0xFF maps to 1.0
}
if (!isValid) {
return defaultResult;
}
return [UIColor colorWithRed:components[0]
green:components[1]
blue:components[2]
alpha:1.0];
}
@end

Swift version. Use as a function or an extension.
Function:
func UIColorFromRGB(colorCode: String, alpha: Float = 1.0) -> UIColor{
var scanner = NSScanner(string:colorCode)
var color:UInt32 = 0;
scanner.scanHexInt(&color)
let mask = 0x000000FF
let r = CGFloat(Float(Int(color >> 16) & mask)/255.0)
let g = CGFloat(Float(Int(color >> 8) & mask)/255.0)
let b = CGFloat(Float(Int(color) & mask)/255.0)
return UIColor(red: r, green: g, blue: b, alpha: CGFloat(alpha))
}
Extension
extension UIColor {
convenience init(colorCode: String, alpha: Float = 1.0){
var scanner = NSScanner(string:colorCode)
var color:UInt32 = 0;
scanner.scanHexInt(&color)
let mask = 0x000000FF
let r = CGFloat(Float(Int(color >> 16) & mask)/255.0)
let g = CGFloat(Float(Int(color >> 8) & mask)/255.0)
let b = CGFloat(Float(Int(color) & mask)/255.0)
self.init(red: r, green: g, blue: b, alpha: CGFloat(alpha))
}
}
How to call
let hexColorFromFunction = UIColorFromRGB("F4C124", alpha: 1.0)
let hexColorFromExtension = UIColor(colorCode: "F4C124", alpha: 1.0)
You can also define your hex color from Interface Builder.

SWIFT 4
You can create a nice convenience constructor in the extension like this:
extension UIColor {
convenience init(hexString: String, alpha: CGFloat = 1.0) {
var hexInt: UInt32 = 0
let scanner = Scanner(string: hexString)
scanner.charactersToBeSkipped = CharacterSet(charactersIn: "#")
scanner.scanHexInt32(&hexInt)
let red = CGFloat((hexInt & 0xff0000) >> 16) / 255.0
let green = CGFloat((hexInt & 0xff00) >> 8) / 255.0
let blue = CGFloat((hexInt & 0xff) >> 0) / 255.0
let alpha = alpha
self.init(red: red, green: green, blue: blue, alpha: alpha)
}
}
And use it later like
let color = UIColor(hexString: "#AABBCCDD")

This is another alternative.
- (UIColor *)colorWithRGBHex:(UInt32)hex
{
int r = (hex >> 16) & 0xFF;
int g = (hex >> 8) & 0xFF;
int b = (hex) & 0xFF;
return [UIColor colorWithRed:r / 255.0f
green:g / 255.0f
blue:b / 255.0f
alpha:1.0f];
}
- alternative answer to an alternative question... not a reasonable answer to the question on this page though. – mah Nov 21 '18 at 18:47
You could use various online tools to convert a HEX string to an actual UIColor. Check out uicolor.org or UI Color Picker. The output would be converted into Objective-C code, like:
[UIColor colorWithRed:0.93 green:0.80 blue:0.80 alpha:1.0];
Which you could embed in your application. Hope this helps!

- Another online tool, same name in fact, [UI Color Picker](http://iachieved.it/uicolorpicker.html). – Joe Dec 18 '13 at 04:59
- Generally when people ask for help with code to solve a fairly simple problem like this, an answer that says "first go to some online site..." is really not even close to being the answer the asker wanted. – mah Nov 21 '18 at 18:49
This is nice, with CocoaPods support: https://github.com/mRs-/HexColors
// with hash
NSColor *colorWithHex = [NSColor colorWithHexString:@"#ff8942" alpha:1];
// without hash
NSColor *secondColorWithHex = [NSColor colorWithHexString:@"ff8942" alpha:1];
// short handling
NSColor *shortColorWithHex = [NSColor colorWithHexString:@"fff" alpha:1]

Another version with alpha
#define UIColorFromRGBA(rgbValue) [UIColor colorWithRed:((float)((rgbValue & 0xFF000000) >> 24))/255.0 green:((float)((rgbValue & 0xFF0000) >> 16))/255.0 blue:((float)((rgbValue & 0xFF00) >> 8 ))/255.0 alpha:((float)((rgbValue & 0xFF))/255.0)]

Swift equivalent of @Tom's answer, although receiving an RGBA Int value to support transparency:
func colorWithHex(aHex: UInt) -> UIColor
{
return UIColor(red: CGFloat((aHex & 0xFF000000) >> 24) / 255,
green: CGFloat((aHex & 0x00FF0000) >> 16) / 255,
blue: CGFloat((aHex & 0x0000FF00) >> 8) / 255,
alpha: CGFloat((aHex & 0x000000FF) >> 0) / 255)
}
//usage
var color = colorWithHex(0x7F00FFFF)
And if you want to be able to use it from string you could use strtoul:
var hexString = "0x7F00FFFF"
let num = strtoul(hexString, nil, 16)
var colorFromString = colorWithHex(num)

Here's a Swift 1.2 version written as an extension to UIColor. This allows you to do
let redColor = UIColor(hex: "#FF0000")
which I feel is the most natural way of doing it.
extension UIColor {
// Initialiser for strings of format '#_RED_GREEN_BLUE_'
convenience init(hex: String) {
let redRange = Range<String.Index>(start: hex.startIndex.advancedBy(1), end: hex.startIndex.advancedBy(3))
let greenRange = Range<String.Index>(start: hex.startIndex.advancedBy(3), end: hex.startIndex.advancedBy(5))
let blueRange = Range<String.Index>(start: hex.startIndex.advancedBy(5), end: hex.startIndex.advancedBy(7))
var red : UInt32 = 0
var green : UInt32 = 0
var blue : UInt32 = 0
NSScanner(string: hex.substringWithRange(redRange)).scanHexInt(&red)
NSScanner(string: hex.substringWithRange(greenRange)).scanHexInt(&green)
NSScanner(string: hex.substringWithRange(blueRange)).scanHexInt(&blue)
self.init(
red: CGFloat(red) / 255,
green: CGFloat(green) / 255,
blue: CGFloat(blue) / 255,
alpha: 1
)
}
}

- In Xcode 6.3.2 on the line that starts with `let greenRange = ...` I get an exception: `fatal error: can not increment endIndex` – Clifton Labrum Jun 23 '15 at 21:22
- @CliftonLabrum I've tested this on Xcode 7 beta 3, and it works the same. Are you still having this issue? – Morgan Wilde Jul 20 '15 at 20:03
Swift 5, iOS 14
convenience init(hex: String, alpha: CGFloat = 1.0) {
var hexFormatted: String = hex.trimmingCharacters(in: CharacterSet.whitespacesAndNewlines).uppercased()
if hexFormatted.hasPrefix("#") {
hexFormatted = String(hexFormatted.dropFirst())
}
assert(hexFormatted.count == 6, "Invalid hex code used.")
var rgbValue: UInt64 = 0
Scanner(string: hexFormatted).scanHexInt64(&rgbValue)
self.init(red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
alpha: alpha)
}

Another implementation allowing strings like "FFF" or "FFFFFF" and using alpha:
+ (UIColor *) colorFromHexString:(NSString *)hexString alpha: (CGFloat)alpha{
NSString *cleanString = [hexString stringByReplacingOccurrencesOfString:@"#" withString:@""];
if([cleanString length] == 3) {
cleanString = [NSString stringWithFormat:@"%@%@%@%@%@%@",
[cleanString substringWithRange:NSMakeRange(0, 1)],[cleanString substringWithRange:NSMakeRange(0, 1)],
[cleanString substringWithRange:NSMakeRange(1, 1)],[cleanString substringWithRange:NSMakeRange(1, 1)],
[cleanString substringWithRange:NSMakeRange(2, 1)],[cleanString substringWithRange:NSMakeRange(2, 1)]];
}
if([cleanString length] == 6) {
cleanString = [cleanString stringByAppendingString:@"ff"];
}
unsigned int baseValue;
[[NSScanner scannerWithString:cleanString] scanHexInt:&baseValue];
float red = ((baseValue >> 24) & 0xFF)/255.0f;
float green = ((baseValue >> 16) & 0xFF)/255.0f;
float blue = ((baseValue >> 8) & 0xFF)/255.0f;
return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
}

I ended up creating a category for UIColor that I can just reuse in my other projects, and added this function:
+ (UIColor *)colorFromHex:(unsigned long)hex
{
return [UIColor colorWithRed:((float)((hex & 0xFF0000) >> 16))/255.0
green:((float)((hex & 0xFF00) >> 8))/255.0
blue:((float)(hex & 0xFF))/255.0
alpha:1.0];
}
The usage goes like:
UIColor *customRedColor = [UIColor colorFromHex:0x990000];
This is far faster than passing in a string, converting it to a number, and then shifting the bits.
You can also import the category from inside your .pch file so you can easily use colorFromHex everywhere in your app, like it's built into UIColor:
#ifdef __OBJC__
#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
// Your other stuff here...
#import "UIColor+HexColor.h"
#endif

- At first I liked and tried the #define approach above. Yet like most defines with many () it was hard to extend and debug. I then fell back to the "Utilities" class method approach. This works but it introduces a new class name into the namespace. Then, I saw your posting and I like it a lot because it understands how to use the Objective-C language. Good show. I plan on making a similar solution that takes RGB decimal values (eg. red: 24 green: 104 blue: 255) – Bryan Feb 11 '14 at 06:22
Updated for Swift 1.2:
class func colorWithHexString (hex:String) -> UIColor {
var cString: NSString = hex.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).uppercaseString
if (cString.hasPrefix("#")) {
cString = cString.substringFromIndex(1)
}
if (count(cString as String) != 6) {
return UIColor.grayColor()
}
var rString: String = cString.substringToIndex(2)
var gString: String = (cString.substringFromIndex(2) as NSString).substringToIndex(2)
var bString: String = (cString.substringFromIndex(4) as NSString).substringToIndex(2)
var r:CUnsignedInt = 0, g:CUnsignedInt = 0, b:CUnsignedInt = 0;
NSScanner(string: rString).scanHexInt(&r)
NSScanner(string: gString).scanHexInt(&g)
NSScanner(string: bString).scanHexInt(&b)
return UIColor(red: CGFloat(Float(r) / 255.0), green: CGFloat(Float(g) / 255.0), blue: CGFloat(Float(b) / 255.0), alpha: CGFloat(1))
}

Create an elegant extension for UIColor:
extension UIColor {
convenience init(string: String) {
var uppercasedString = string.uppercased()
uppercasedString.remove(at: string.startIndex)
var rgbValue: UInt32 = 0
Scanner(string: uppercasedString).scanHexInt32(&rgbValue)
let red = CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0
let green = CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0
let blue = CGFloat(rgbValue & 0x0000FF) / 255.0
self.init(red: red, green: green, blue: blue, alpha: 1)
}
}
Create red color:
let red = UIColor(string: "#ff0000")

extension UIColor
{
class func fromHexaString(hex:String) -> UIColor
{
let scanner = Scanner(string: hex)
scanner.scanLocation = 0
var rgbValue: UInt64 = 0
scanner.scanHexInt64(&rgbValue)
return UIColor(
red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
alpha: CGFloat(1.0)
)
}
}
//you can call like this:
UIColor.fromHexaString(hex: "3276b1")

self.view.backgroundColor = colorWithHex(hex: yourColorCode)
Code for creating a color from a hexadecimal code:
func colorWithHex (hex:String) -> UIColor {
var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
if (cString.hasPrefix("#")) {
cString.remove(at: cString.startIndex)
}
if ((cString.count) != 6) {
return UIColor.gray
}
var rgbValue:UInt32 = 0
Scanner(string: cString).scanHexInt32(&rgbValue)
return UIColor(
red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
alpha: CGFloat(1.0)
)
}

You can get a UIColor from a hex string like this:
circularSpinner.fillColor = [self getUIColorObjectFromHexString:@"27b8c8" alpha:9];
//Function For Hex Color Use
- (unsigned int)intFromHexString:(NSString *)hexStr
{
unsigned int hexInt = 0;
// Create scanner
NSScanner *scanner = [NSScanner scannerWithString:hexStr];
// Tell scanner to skip the # character
[scanner setCharactersToBeSkipped:[NSCharacterSet characterSetWithCharactersInString:@"#"]];
// Scan hex value
[scanner scanHexInt:&hexInt];
return hexInt;
}
- (UIColor *)getUIColorObjectFromHexString:(NSString *)hexStr alpha:(CGFloat)alpha
{
// Convert hex string to an integer
unsigned int hexint = [self intFromHexString:hexStr];
// Create color object, specifying alpha as well
UIColor *color =
[UIColor colorWithRed:((CGFloat) ((hexint & 0xFF0000) >> 16))/255
green:((CGFloat) ((hexint & 0xFF00) >> 8))/255
blue:((CGFloat) (hexint & 0xFF))/255
alpha:alpha];
return color;
}

- If you want to use this code, then just call [self getUIColorObjectFromHexString:@"27b8c8" alpha:9]. – Manish Saini Apr 29 '14 at 11:17
I like to specify the alpha as well as the color, so I wrote my own category:
+ (UIColor *) colorWithHex:(int)color {
float red = (color & 0xff000000) >> 24;
float green = (color & 0x00ff0000) >> 16;
float blue = (color & 0x0000ff00) >> 8;
float alpha = (color & 0x000000ff);
return [UIColor colorWithRed:red/255.0 green:green/255.0 blue:blue/255.0 alpha:alpha/255.0];
}
Easy to use like this:
[UIColor colorWithHex:0xFF0000FF]; //Red
[UIColor colorWithHex:0x00FF00FF]; //Green
[UIColor colorWithHex:0x0000FFFF]; //Blue
[UIColor colorWithHex:0x0000007F]; //transparent black

I created a convenience init for that:
extension UIColor {
convenience init(hex: String, alpha: CGFloat)
{
let redH = CGFloat(strtoul(hex.substringToIndex(advance(hex.startIndex,2)), nil, 16))
let greenH = CGFloat(strtoul(hex.substringWithRange(Range<String.Index>(start: advance(hex.startIndex, 2), end: advance(hex.startIndex, 4))), nil, 16))
let blueH = CGFloat(strtoul(hex.substringFromIndex(advance(hex.startIndex,4)), nil, 16))
self.init(red: redH/255, green: greenH/255, blue: blueH/255, alpha: alpha)
}
}
Then you can create a UIColor anywhere in your project just like this:
UIColor(hex: "ffe3c8", alpha: 1)
Hope this helps...

You can create an extension of UIColor like this:
extension UIColor {
// MARK: - getColorFromHex
/** This function will convert the color hex code to RGB.
- parameter color hex string.
- returns: RGB color code.
*/
class func getColorFromHex(hexString:String)->UIColor{
var rgbValue : UInt32 = 0
let scanner:NSScanner = NSScanner(string: hexString)
scanner.scanLocation = 1
scanner.scanHexInt(&rgbValue)
return UIColor(red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0, green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0, blue: CGFloat(rgbValue & 0x0000FF) / 255.0, alpha: CGFloat(1.0))
}
}

For Swift 2.0+. This code works fine for me.
extension UIColor {
/// UIColor(hexString: "#cc0000")
internal convenience init?(hexString:String) {
guard hexString.characters[hexString.startIndex] == Character("#") else {
return nil
}
guard hexString.characters.count == "#000000".characters.count else {
return nil
}
let digits = hexString.substringFromIndex(hexString.startIndex.advancedBy(1))
guard Int(digits,radix:16) != nil else{
return nil
}
let red = digits.substringToIndex(digits.startIndex.advancedBy(2))
let green = digits.substringWithRange(Range<String.Index>(start: digits.startIndex.advancedBy(2),
end: digits.startIndex.advancedBy(4)))
let blue = digits.substringWithRange(Range<String.Index>(start:digits.startIndex.advancedBy(4),
end:digits.startIndex.advancedBy(6)))
let redf = CGFloat(Double(Int(red, radix:16)!) / 255.0)
let greenf = CGFloat(Double(Int(green, radix:16)!) / 255.0)
let bluef = CGFloat(Double(Int(blue, radix:16)!) / 255.0)
self.init(red: redf, green: greenf, blue: bluef, alpha: CGFloat(1.0))
}
}
This code includes string format checking. e.g.
let aColor = UIColor(hexString: "#dadada")!
let failed = UIColor(hexString: "123zzzz")
And as far as I know, this approach has no downside: it keeps the semantics of a failable initializer and returns an optional value, which is why I think this should be the best answer.

Swift 2.0 - Xcode 7.2
Adding an extension to UIColor.
File > New > Swift File, name it, then add the following.
extension UIColor {
convenience init(hexString:String) {
let hexString:NSString = hexString.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet())
let scanner = NSScanner(string: hexString as String)
if (hexString.hasPrefix("#")) {
scanner.scanLocation = 1
}
var color:UInt32 = 0
scanner.scanHexInt(&color)
let mask = 0x000000FF
let r = Int(color >> 16) & mask
let g = Int(color >> 8) & mask
let b = Int(color) & mask
let red = CGFloat(r) / 255.0
let green = CGFloat(g) / 255.0
let blue = CGFloat(b) / 255.0
self.init(red:red, green:green, blue:blue, alpha:1)
}
func toHexString() -> String {
var r:CGFloat = 0
var g:CGFloat = 0
var b:CGFloat = 0
var a:CGFloat = 0
getRed(&r, green: &g, blue: &b, alpha: &a)
let rgb:Int = (Int)(r*255)<<16 | (Int)(g*255)<<8 | (Int)(b*255)<<0
return NSString(format:"#%06x", rgb) as String
}
}
Usage:
Ex. Setting Button's color from hexCode.
override func viewWillAppear(animated: Bool) {
loginButton.tintColor = UIColor(hexString: " hex code here ")
}
Ex. Converting Button's current color to hex Code.
override func viewWillAppear(animated: Bool) {
let hexString = loginButton.tintColor.toHexString()
print("HEX STRING: \(hexString)")
}

Here is a Swift 2.0 version of the solution, which handles the color's alpha value and has proper error handling:
func RGBColor(hexColorStr : String) -> UIColor?{
var red:CGFloat = 0.0
var green:CGFloat = 0.0
var blue:CGFloat = 0.0
var alpha:CGFloat = 1.0
if hexColorStr.hasPrefix("#"){
let index = hexColorStr.startIndex.advancedBy(1)
let hex = hexColorStr.substringFromIndex(index)
let scanner = NSScanner(string: hex)
var hexValue: CUnsignedLongLong = 0
if scanner.scanHexLongLong(&hexValue)
{
if hex.characters.count == 6
{
red = CGFloat((hexValue & 0xFF0000) >> 16) / 255.0
green = CGFloat((hexValue & 0x00FF00) >> 8) / 255.0
blue = CGFloat(hexValue & 0x0000FF) / 255.0
}
else if hex.characters.count == 8
{
red = CGFloat((hexValue & 0xFF000000) >> 24) / 255.0
green = CGFloat((hexValue & 0x00FF0000) >> 16) / 255.0
blue = CGFloat((hexValue & 0x0000FF00) >> 8) / 255.0
alpha = CGFloat(hexValue & 0x000000FF) / 255.0
}
else
{
print("invalid hex code string, length should be 7 or 9", terminator: "")
return nil
}
}
else
{
print("scan hex error")
return nil
}
}
let color: UIColor = UIColor(red:CGFloat(red), green: CGFloat(green), blue:CGFloat(blue), alpha: alpha)
return color
}

//UIColorWithHexString
static UIColor * UIColorWithHexString(NSString *hex) {
unsigned int rgb = 0;
[[NSScanner scannerWithString:
[[hex uppercaseString] stringByTrimmingCharactersInSet:
[[NSCharacterSet characterSetWithCharactersInString:@"0123456789ABCDEF"] invertedSet]]]
scanHexInt:&rgb];
return [UIColor colorWithRed:((CGFloat)((rgb & 0xFF0000) >> 16)) / 255.0
green:((CGFloat)((rgb & 0xFF00) >> 8)) / 255.0
blue:((CGFloat)(rgb & 0xFF)) / 255.0
alpha:1.0];
}
Usage
self.view.backgroundColor = UIColorWithHexString(@"#0F35C0");

UIColor Hex initialization
extension UIColor{
public convenience init(hex : String) {
var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()
if (cString.hasPrefix("#")) {
cString.remove(at: cString.startIndex)
}
if ((cString.count) != 6) {
self.init(red: 1, green: 1, blue: 1, alpha: 1)
return
}
var rgbValue:UInt32 = 0
Scanner(string: cString).scanHexInt32(&rgbValue)
self.init(
red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
alpha: CGFloat(1.0)
)
}
}
//Initialization
let myColor = UIColor(hex: "#452b4e")
Happy coding! Enjoy!

Most of the posted solutions use Scanner, but you don't really need it, at least in modern Swift. Instead you can simply use the UInt initializer with radix 16 and then use basic binary operations to get the UIColor components:
func stringToColor(color: String) -> UIColor {
guard let i = UInt(color, radix: 16) else {
return UIColor.white
}
return UIColor(
red: CGFloat((i & 0xFF0000) >> 16) / 255.0,
green: CGFloat((i & 0xFF00) >> 8) / 255.0,
blue: CGFloat(i & 0xFF) / 255.0,
alpha: 1.0
)
}
This solution expects input like "FF00FF"; you may need to remove the leading hash symbol (#) if you have one in your string.
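For example, a short sketch of that stripping step (my own addition) before calling the function above:
let raw = "#FF00FF"
let cleaned = raw.hasPrefix("#") ? String(raw.dropFirst()) : raw
let magenta = stringToColor(color: cleaned)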

- What should `color` be in `UInt(color, radix: 16)`? It cannot be found in scope. – hotdogsoup.nl May 07 '22 at 10:48
- I'm sorry, the `color` is actually a function parameter name; I fixed the code above. Thanks. – Mike Keskinov May 13 '22 at 19:15
Swift version:
extension UIColor {
convenience init?(var hex: String) {
hex = hex.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).uppercaseString
hex = (hex.hasPrefix("#")) ? hex.substringFromIndex(advance(hex.startIndex, 1)) : hex
var value: UInt32 = 0
if NSScanner(string: hex).scanHexInt(&value) {
if count(hex) == 8 {
self.init(red: CGFloat((value & 0xFF000000) >> 24) / 255.0,
green: CGFloat((value & 0x00FF0000) >> 16) / 255.0,
blue: CGFloat((value & 0x0000FF00) >> 8) / 255.0,
alpha: CGFloat((value & 0x000000FF)) / 255.0)
return
} else if count(hex) == 6 {
self.init(red: CGFloat((value & 0xFF0000) >> 16) / 255.0,
green: CGFloat((value & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(value & 0x0000FF) / 255.0,
alpha: 1.0)
return
}
}
self.init()
return nil
}
}

Try this: this code will return a UIColor from your hex color string.
- (UIColor*)colorWithHexString:(NSString*)hex
{
NSString *cString = [[hex stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]] uppercaseString];
// String should be 6 or 8 characters
if ([cString length] < 6) return [UIColor grayColor];
// strip 0X if it appears
if ([cString hasPrefix:@"0X"]) cString = [cString substringFromIndex:2];
if ([cString length] != 6) return [UIColor grayColor];
// Separate into r, g, b substrings
NSRange range;
range.location = 0;
range.length = 2;
NSString *rString = [cString substringWithRange:range];
range.location = 2;
NSString *gString = [cString substringWithRange:range];
range.location = 4;
NSString *bString = [cString substringWithRange:range];
// Scan values
unsigned int r, g, b;
[[NSScanner scannerWithString:rString] scanHexInt:&r];
[[NSScanner scannerWithString:gString] scanHexInt:&g];
[[NSScanner scannerWithString:bString] scanHexInt:&b];
return [UIColor colorWithRed:((float) r / 255.0f)
green:((float) g / 255.0f)
blue:((float) b / 255.0f)
alpha:1.0f];
}

Convert hex color to RGB value using any converter website (if you google "hex to rgb", you'll see a ton). For example, this one: http://www.rgbtohex.net/hextorgb/
Then change the color property to UIColor. Example:
self.profilePicture.layer.borderColor = [UIColor colorWithRed:0/255.0 green:167/255.0 blue:142/255.0 alpha:1.0].CGColor;
Hex color value was: 00a78e converted to RGB: R: 0 G: 167 B: 142
If the RGB values you are giving are not between 0 and 1.0, you'll have to divide them by 255. Example:
self.profilePicture.layer.borderColor = [UIColor colorWithRed:83.00/255.0 green:123.00/255.0 blue:53.00/255.0 alpha:1.0].CGColor;

Polished extension based on the original answer by @Tom; feel free to update the code here.
extension UIColor{
convenience init (hexString:String) {
var cleanString:String = hexString.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).uppercaseString
if (cleanString.hasPrefix("#")) {
cleanString = cleanString.substringFromIndex(cleanString.startIndex.advancedBy(1))
}
if (cleanString.characters.count != 6) {
self.init()
}
else{
var rgbValue = UInt32()
let scanner = NSScanner(string: cleanString)
scanner.scanHexInt(&rgbValue)
self.init(
red: CGFloat((rgbValue & 0xFF0000) >> 16)/255.0,
green: CGFloat((rgbValue & 0xFF00) >> 8)/255.0,
blue: CGFloat(rgbValue & 0xFF)/255.0,
alpha: 1.0)
}
}
}

- This is pretty good, but I think it really needs to return an optional UIColor, and check to see if scanHexInt fails (returns boolean). If I pass in a HexString I'm not sure I want to get back some oddly defined UIColor that may have only had .init() called with no real values! – Kendall Helmstetter Gelner Nov 30 '15 at 03:50
Swift 2.0:
Add this method to VC or to Extension of UIColor.
func colorWithHexString (hex:String) -> UIColor {
var cString:String = hex.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).uppercaseString
if (cString.hasPrefix("#")) {
cString = (cString as NSString).substringFromIndex(1)
}
if (cString.characters.count != 6) {
return UIColor.grayColor()
}
let rString = (cString as NSString).substringToIndex(2)
let gString = ((cString as NSString).substringFromIndex(2) as NSString).substringToIndex(2)
let bString = ((cString as NSString).substringFromIndex(4) as NSString).substringToIndex(2)
var r:CUnsignedInt = 0, g:CUnsignedInt = 0, b:CUnsignedInt = 0;
NSScanner(string: rString).scanHexInt(&r)
NSScanner(string: gString).scanHexInt(&g)
NSScanner(string: bString).scanHexInt(&b)
return UIColor(red: CGFloat(r) / 255.0, green: CGFloat(g) / 255.0, blue: CGFloat(b) / 255.0, alpha: CGFloat(1))
}
Usage :
loginButton.tintColor = self.colorWithHexString("#be1337")
OR
let hexColor = self.colorWithHexString("#be1337")

You can use this library: https://github.com/burhanuddin353/TFTColor
Swift
UIColor.colorWithRGB(hexString: "FF34AE", alpha: 1.0)
Objective-C
[UIColor colorWithRGBHexString:@"FF34AE" alpha:1.0f]

Swift 3 example of Ethan Strider's answer. A function that takes a hex string and returns a UIColor. (You can enter hex strings with either format: #ffffff or ffffff.)
Example:
func hexStringToUIColor (hex:String) -> UIColor {
var cString: String = hex.trimmingCharacters(in: CharacterSet.whitespacesAndNewlines).uppercased()
if (cString.hasPrefix("#")) {
if let range = cString.range(of: cString) {
cString = cString.substring(from: cString.index(range.lowerBound, offsetBy: 1))
}
}
if ((cString.characters.count) != 6) {
return UIColor.gray
}
var rgbValue: UInt32 = 0
Scanner(string: cString).scanHexInt32(&rgbValue)
return UIColor(
red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
alpha: CGFloat(1.0)
)
}
Usage:
var color1 = hexStringToUIColor("#d3d3d3")

In Xamarin.iOS you can use the following instead of the macro:
public UIColor UIColorFromHexValue(int value, float alpha = 1f) =>
UIColor.FromRGBA(
((value & 0xFF0000) >> 16) / 255.0f,
((value & 0x00FF00) >> 8) / 255.0f,
((value & 0x0000FF) >> 0) / 255.0f,
alpha);
The usage is analogous:
label.TextColor = UIColorFromHexValue(0xBC1128);

Posting for reference a site I just found. It does all the dirty work and, starting from HEX or RGB, prints out the code in ObjC, Swift and Xamarin.

There is a nice UIColor category with many features in it.
Usage:
textView.textColor = [UIColor colorWithHexString:textColorHex];
NSLog(@"Text Color Hex: %@", textColorHex);
Where textColorHex has the form @"FFFFFF", without the # symbol.

Several of the above solutions involve somewhat unnecessary use of NSStrings. This UIColor class extension is a bit simpler and faster:
+ (UIColor *)colorWithHex:(UInt32)hex alpha:(CGFloat)alpha
{
return [UIColor colorWithRed:((hex & 0xFF0000) >> 16)/255.0
green:((hex & 0x00FF00) >> 8)/255.0
blue:( hex & 0x0000FF)/255.0
alpha:alpha];
}
and to use it simply:
return [UIColor colorWithHex:0x006400 alpha:1.0]; // HTML darkgreen

- Downvote with no comment helps nobody. So, if you take issue with my specifically *not* using NSStrings, please note that by far-and-above the top-rated answer to this question, courtesy of Tom, does likewise. And IMHO a UIColor class extension is just a cleaner solution than a #define. The only possible reason I could think of needing to convert hex color codes *in an NSString* would be if you are trying to parse an HTML stream, at which point you should probably look into init'ing an NSAttributedString with a NSHTMLTextDocumentType, and let it handle *all* the markups. – tiritea Mar 13 '17 at 03:56
In Swift I created a class extension with the following methods to convert a hex code to a UIColor.
extension UIColor {
convenience init(R: CGFloat, G: CGFloat, B: CGFloat, alpha: CGFloat) {
self.init(red: R/255.0, green: G/255.0, blue: B/255.0, alpha: alpha)
}
class func colorWithHex(hex: UInt, alpha: CGFloat) -> UIColor {
return UIColor(R: CGFloat((hex & 0xFF0000) >> 16), G: CGFloat((hex & 0x00FF00) >> 8), B: CGFloat(hex & 0x0000FF), alpha: alpha)
}
}

Use Xcode's native Color Literals feature to add hex colors easily and natively.
Type Color Literal into your code and let Xcode autocomplete do the rest.
The color picker UI will allow you to paste in a Hex Color: #FF9300
The git diff of the macro will show RGB values rather than hex:
let orange = #colorLiteral(red: 1, green: 0.5763723254, blue: 0, alpha: 1)
But it's still an easy way to paste in hex without any 3rd party tools or extensions.

Most answers use Scanner, bitmasks, or substring operations. It's also possible to extract the color components using chunks(ofCount:) and UInt8.init(_:radix:):
import Foundation
import XCTest
import Algorithms
class UIColorFromStringTests: XCTestCase {
func testUIColorFromString() throws {
let color = "00FF00"
let components = color
.chunks(ofCount: 2)
.compactMap { UInt8($0, radix: 16) }
XCTAssertEqual(components, [0x00, 0xFF, 0x00])
let red = CGFloat(components[0]) / 255 // 0
let green = CGFloat(components[1]) / 255 // 1
let blue = CGFloat(components[2]) / 255 // 0
_ = UIColor(red: red, green: green, blue: blue, alpha: 1)
}
}
