
I have an unsigned int and I want to create UIColor object with that value. How can I do that ?

I am trying this code:

unsigned int color = -65536; // wraps to 0xFFFF0000
UIColor *linearColor = [UIColor colorWithRed:(float)(color / 255.0) green:(float)(color / 255.0) blue:(float)(color / 255.0) alpha:1.0];
Ali
iOS_Learner
  • possible duplicate of [how to convert rgb color to int in java](http://stackoverflow.com/questions/18022364/how-to-convert-rgb-color-to-int-in-java) – Ali Jan 23 '15 at 06:10
  • @Ali Thanks ... found solution here ... https://github.com/vilanovi/UIColor-Additions – iOS_Learner Jan 23 '15 at 06:35
  • you are declaring `unsigned int` but assigning a negative (`-` signed) value to it; a negative literal wraps around when stored in an unsigned type. (For a 16-bit `int` the range is -32768 to 32767, and for a 16-bit `unsigned int` it is 0 to 65535.) Use a signed or wider datatype if you need negative values @Farrukh – sreekanthk Jan 23 '15 at 07:13

1 Answer


Here is an example. I have not tested it; if anybody finds issues, please leave me a message, thanks.

/*
 red:   0~255
 green: 0~255
 blue:  0~255
 alpha: 0~255

 Note: this writes the channels into the individual bytes of an int,
 so the byte layout depends on CPU endianness. On the little-endian
 processors used by iOS devices, red ends up in the lowest byte.
 */
- (int)createIntColorWithRed:(unsigned char)red
                       green:(unsigned char)green
                        blue:(unsigned char)blue
                       alpha:(unsigned char)alpha
{
    int color = 0;
    unsigned char *p = (unsigned char *)&color;
    *p = red;
    *(p + 1) = green;
    *(p + 2) = blue;
    *(p + 3) = alpha;
    return color;
}

- (UIColor *)colorWithInt:(int)iColor
{
    // Read the channels back from the same byte positions used above.
    unsigned char *p = (unsigned char *)&iColor;
    unsigned char red = *p;
    unsigned char green = *(p + 1);
    unsigned char blue = *(p + 2);
    unsigned char alpha = *(p + 3);
    return [UIColor colorWithRed:red / 255.0
                           green:green / 255.0
                            blue:blue / 255.0
                           alpha:alpha / 255.0];
}
KudoCC