I have a Core Data entity called "Color" that stores R, G, and B values, and an NSManagedObject subclass, Color, that returns derived values like saturation and a UIColor as needed. I recently wrote some code where I needed the Color entity to be an instance of the Color class so I could call a method on it.
But it seems that broke another part.
Before I store the colors, I use the Color class to figure out which colors to store, based on the methods in the Color class. This is where I'm running into problems:
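For context, the subclass is declared roughly like this (simplified; the initializer and getter names match the code below, the rest is paraphrased):

```objc
// Simplified sketch of my subclass -- implementation omitted.
@interface Color : NSManagedObject

@property (nonatomic) CGFloat red;    // stored attributes
@property (nonatomic) CGFloat green;
@property (nonatomic) CGFloat blue;

// Convenience initializer used in the snippet below.
- (id)initWithColor:(UIColor *)color;

// Derived values computed from red/green/blue.
- (CGFloat)saturation;
- (UIColor *)UIColor;

@end
```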
Color *color = [[Color alloc] initWithColor:[UIColor whiteColor]];
if (color.saturation > 0.2) {
    [self addOrIncrementColor:color];
}
At the if statement, the debugger shows:
color          Color *              0x1f532740
    NSManagedObject                 NSManagedObject
    red        CGFloat              0.392157
    green      CGFloat              0.443137
    blue       CGFloat              0.203922
    count      __NSCFNumber *       0x1f559d00
    color      UIDeviceRGBColor *   0x200e8f10
    saturation CGFloat              0.539823
However, if I print the description of the Color object, I get:
Color: 0x1f532740 (entity: (null); id: (null) ; data: {})
This, of course, passes the color object to the addOrIncrementColor: method with null info, even though it is set locally.
Any idea how I can get this to work?