
Does anyone know why NeXTSTEP/Apple decided to take the "convenient method" of doing nothing when you send a message to nil, but the "Java method" of raising an exception when you send an instantiated object an invalid selector?

For example,

// This does "nothing"
NSObject *object = nil;
[object thisDoesNothing];

object = [[NSObject alloc] init];
// This causes an NSInvalidArgumentException to be raised
[object thisThrowsAnException];

So on one hand, we have the convenience of not having to check for nil (assuming we don't care too much about the result of the method call), but on the other hand we have to check for an exception if our object doesn't respond to a method?

If I'm not sure whether the object will respond, I either have to write:

@try {
    [object thisThrowsAnException];
} @catch (NSException *e){
    // do something different with object, since we can't call thisThrowsAnException
}

Or,

if([object respondsToSelector:@selector(thisThrowsAnException)]) {
    [object thisThrowsAnException];
}
else {
    // do something different with object, since we can't call thisThrowsAnException
}

(The latter is probably the better way to do it, since if object is nil, sending it the selector would NOT raise an exception, so the @catch branch would never run and your code might not behave the way you want it to.)

My question is: WHY did Apple decide to implement it this way?
Why not have an unrecognized selector sent to an instantiated object simply do nothing?
Alternatively, why not have a message to nil raise an exception?

Jano
Mike Sprague
  • closely related: [Sending a message to nil](http://stackoverflow.com/questions/156395/sending-a-message-to-nil/) – jscs Jul 17 '12 at 20:19
  • While it is quite conceivable to turn an ability to send messages to `nil` to your advantage, sending a wrong message is nearly always an indication of a coding error. – Sergey Kalinichenko Jul 17 '12 at 20:53

3 Answers


I can't fully answer your question, but I can answer part of it. Objective-C allows you to send a message to nil because it makes code more elegant. You can read about this design decision here, and I will steal its example:

Let's say you want to get the last phone number that some person dialed on her office phone. If you can't send messages to nil, you have to write it like this:

Office *office = [somePerson office];
// Person might not have an office, so check it...
if (office) {
    Telephone *phone = [office telephone];
    // The office might not have a telephone, so check it...
    if (phone) {
        NSString *lastNumberDialed = [phone lastNumberDialed];
        // The phone might be brand new, so there might be no last-dialed-number...
        if (lastNumberDialed) {
            // Use the number, for example...
            [myTextField setText:lastNumberDialed];
        }
    }
}

Now suppose you can send messages to nil (and always get nil back):

NSString *lastNumberDialed = [[[somePerson office] telephone] lastNumberDialed];
if (lastNumberDialed) {
    [myTextField setText:lastNumberDialed];
}
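The nil-eating behavior goes beyond object returns: on the modern runtime, a message to nil yields nil for object returns and 0 for integral returns. A minimal sketch of this, using a hypothetical `Phone` class (not from the original example):

```objc
#import <Foundation/Foundation.h>

// Hypothetical class, just to have something to (not) message.
@interface Phone : NSObject
- (NSString *)lastNumberDialed;
- (NSUInteger)callCount;
@end

@implementation Phone
- (NSString *)lastNumberDialed { return @"555-0100"; }
- (NSUInteger)callCount { return 3; }
@end

int main(void) {
    @autoreleasepool {
        Phone *phone = nil;                          // no phone at all
        NSString *number = [phone lastNumberDialed]; // nil, no exception raised
        NSUInteger calls = [phone callCount];        // 0 for integral return types
        NSLog(@"number=%@ calls=%lu", number, (unsigned long)calls);
    }
    return 0;
}
```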

As for why sending an unrecognized selector to an object raises an exception: I don't know for sure. I suspect that it's far more common for this to be a bug than to be harmless. In my code, I only want an unrecognized selector to be silently ignored when I need to send an optional protocol message (e.g. sending an optional message to a delegate). So I want the system to treat it as an error, and let me be explicit in the relatively rare case when I don't want it to be an error.

Note that you can tinker (to some extent) with the handling of unrecognized selectors in your own classes, in a few different ways. Take a look at the forwardingTargetForSelector:, forwardInvocation:, doesNotRecognizeSelector:, and resolveInstanceMethod: methods of NSObject.
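As a sketch of one of those hooks: forwardingTargetForSelector: lets an object redirect an unrecognized selector to another object before doesNotRecognizeSelector: is ever reached. The class names here are hypothetical, not part of any Apple API:

```objc
#import <Foundation/Foundation.h>

// Hypothetical helper that actually implements the method.
@interface Backup : NSObject
- (NSString *)greeting;
@end

@implementation Backup
- (NSString *)greeting { return @"hello from backup"; }
@end

// Front forwards anything it doesn't recognize to its backup object.
@interface Front : NSObject
@property (nonatomic, strong) Backup *backup;
@end

@implementation Front
- (id)forwardingTargetForSelector:(SEL)aSelector {
    // Redirect unrecognized selectors to the backup if it can handle them.
    if ([self.backup respondsToSelector:aSelector]) {
        return self.backup;
    }
    // Otherwise fall through to the normal unrecognized-selector path.
    return [super forwardingTargetForSelector:aSelector];
}
@end

int main(void) {
    @autoreleasepool {
        Front *f = [Front new];
        f.backup = [Backup new];
        // Front has no -greeting, but the message is forwarded instead of raising.
        NSLog(@"%@", [f performSelector:@selector(greeting)]);
    }
    return 0;
}
```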

rob mayoff
  • The only good example I can think of where you would want to suppress the exception would be the case where you have some sort of polymorphism and you're not sure whether the object has the method and if it doesn't, you wouldn't do anything differently. i.e. this would usually be a case where you would cast the object then call the selector on it. This is pretty rare though... I can't think of a good practical example, but something like: if B inherits from A: `for(ClassA obj : listOfAs) { [obj specialmethodCallOnlyBHas]; // do stuff with obj: }` – Mike Sprague Jul 17 '12 at 22:07
  • Right. That's a case where you want to check with `respondsToSelector:` first, as it's much faster than catching an exception. Apple's classes cache the results of `respondsToSelector:` when you first set the delegate (or data source or whatever). – rob mayoff Jul 17 '12 at 22:17
  • an elegance we lost with Swift :-( – malhal Aug 07 '17 at 13:11
  • @malhal Don't we use `Optional Chaining` for this? – atulkhatri Jan 18 '18 at 04:53

From the good ol' documentation:

In Objective-C, it is valid to send a message to nil—it simply has no effect at runtime.

As for the other half of the question, the unrecognized-selector behavior: an old implementation file of NSObject (from the MySTEP library) shows that the culprit is the NSObject method -doesNotRecognizeSelector:, which looks something like this:

- (void) doesNotRecognizeSelector:(SEL)aSelector
{
    [NSException raise:NSInvalidArgumentException
                format:@"NSObject %@[%@ %@]: selector not recognized", 
                        object_is_instance(self)?@"-":@"+",
                        NSStringFromClass([self class]), 
                        NSStringFromSelector(aSelector)];
}

This means that Objective-C methods could feasibly be tinkered with so that they do not, in fact, have to raise an error, which means the decision was entirely arbitrary, just like the decision to switch to "method-eating" messages to nil. The same feat can be accomplished by method swizzling NSObject (wholly dangerous, as the program will still crash with an EXC_BAD_ACCESS, or EXC_I386_BPT on Mac, but at least it doesn't raise an exception):

void Swizzle(Class c, SEL orig, SEL new)
{
    Method origMethod = class_getInstanceMethod(c, orig);
    Method newMethod = class_getInstanceMethod(c, new);
    if(class_addMethod(c, orig, method_getImplementation(newMethod), method_getTypeEncoding(newMethod)))
        class_replaceMethod(c, new, method_getImplementation(origMethod), method_getTypeEncoding(origMethod));
    else
        method_exchangeImplementations(origMethod, newMethod);
}

-(void)example:(id)sender {
    Swizzle([NSObject class], @selector(doesNotRecognizeSelector:), @selector(description));
    [self performSelector:@selector(unrecog)];
}

The category:

@implementation NSObject (NoExceptionMessaging)

-(void)doesNotRecognizeSelector:(SEL)aSelector {
    NSLog(@"I've got them good ol' no exception blues.");
}
@end
CodaFi
  • Note, I'm reworking this with the modern runtime to get rid of that exception. I can simply replace the IMP, instead of swizzling. – CodaFi Jul 17 '12 at 21:29
  • No need to swizzle; if you're willing to tread down this dark path, you can do it at compile-time and just create a category on `NSObject` that overrides `doesNotRecognizeSelector:`. – jscs Jul 17 '12 at 21:33
  • @JoshCaswell Hiss! Categories should not be used to override methods!!!! I'll runtime this first, and use that as a last resort. – CodaFi Jul 17 '12 at 21:34
  • Eh, neither should methods be swizzled. Do you need to send a message to `super` (there isn't one) here? Are you doing something with the original implementation? Those are the reasons you avoid category overrides. – jscs Jul 17 '12 at 21:36
  • Work in progress, my friend. In fact, the category still throws an `EXC_I386_BPT` on OSX, so the implementations are just as dangerous. – CodaFi Jul 17 '12 at 21:38
  • The runtime is doing that, not the object -- `doesNotRecognizeSelector:` throws an exception because the object can't produce a valid return value. If you override it and _still_ don't give it anything, it dies. – jscs Jul 17 '12 at 21:47
  • Give it anything? Like what? The method returns void. – CodaFi Jul 17 '12 at 21:55
  • Never mind, I was making the assumption that it was still possible to recover from `doesNotRecognizeSelector:`, which [is not the case](http://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSObject_Class/Reference/Reference.html#//apple_ref/doc/uid/20000050-doesNotRecognizeSelector_): «this method must not return normally; it must always result in an exception being thrown.» In order to eat messages, you'll need to move further up the message resolution chain. – jscs Jul 17 '12 at 22:10
  • You're only swizzling doesNotRecognizeSelector--performSelector will still throw an exception, correct? http://pastebin.com/p3gF6fQf – Mike Sprague Jul 17 '12 at 22:12

For everyone's amusement, due to the discussion CodaFi and I were having, here's a quickly-hacked-together way to eat normally unresponded-to messages and have them return nil:

@interface EaterOfBadMessages : NSObject 
@end

@implementation EaterOfBadMessages

- (NSMethodSignature *)methodSignatureForSelector:(SEL)aSelector 
{
    NSMethodSignature * sig = [super methodSignatureForSelector:aSelector];
    if( !sig ){
        sig = [NSMethodSignature signatureWithObjCTypes:"@@:"];
    }
    return sig;
}

- (void)forwardInvocation:(NSInvocation *)anInvocation 
{
    id nilPtr = nil;
    [anInvocation setReturnValue:&nilPtr];
}

@end

int main(int argc, const char * argv[])
{

    @autoreleasepool {

        EaterOfBadMessages * e = [[EaterOfBadMessages alloc] init];
        // Of course, pre-ARC you could write [e chewOnThis]
        NSLog(@"-[EaterOfBadMessages chewOnThis]: %@", [e performSelector:@selector(chewOnThis)]);

    }
    return 0;
}

Please don't use this in real life.

jscs