NSUInteger index = [self.objects indexOfObject:obj];
if (index == NSNotFound) {
// Success! Note: NSNotFound internally uses NSIntegerMax
}
if (index == NSUIntegerMax) {
// Fails!
}
Why? Since indexOfObject: returns an unsigned value (NSUInteger), I naturally assumed that when the object is not found it would return NSUIntegerMax rather than NSIntegerMax. Is this a bug, or is there a logical explanation for this behavior?
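To illustrate what I mean, here is a minimal sketch (assuming a 64-bit build, where NSUInteger is 64 bits wide) that just prints the three constants side by side:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // NSNotFound is defined as NSIntegerMax in NSObjCRuntime.h,
        // so the first two lines print the same value (0x7fffffffffffffff
        // on 64-bit), while NSUIntegerMax prints 0xffffffffffffffff.
        NSLog(@"NSNotFound    = %lx", (unsigned long)NSNotFound);
        NSLog(@"NSIntegerMax  = %lx", (unsigned long)NSIntegerMax);
        NSLog(@"NSUIntegerMax = %lx", (unsigned long)NSUIntegerMax);
    }
    return 0;
}

Since NSNotFound and NSUIntegerMax are different values, the second if above never matches, which is the behavior I'm asking about.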