In the codebase I'm currently working on, I see this pattern used quite extensively: the authors seem to avoid return statements as much as possible, and in some places this is taken almost to the extreme.
I vaguely remember hearing, a long time ago, that return statements should be minimized or avoided, but I can't recall (or find) the exact reason or origin of this line of thought. I believe it had something to do with performance implications in some programming language.
I don't personally follow this practice, since I find the resulting code less readable, but, giving it the benefit of the doubt, I'm curious whether the pattern has any merit in JavaScript in terms of performance. Here's a representative snippet:
if (err) {
    if (err.message) {
        message = err.message;
    } else {
        if (err.errorCode) {
            message = err.errorCode;
        } else {
            message = "Unknown error";
        }
    }
} else {
    message = "Unknown error.";
}
deferred.reject(message);
If it were up to me, I'd simply use return statements to terminate the sequence early, like this:
if (!err || (!err.message && !err.errorCode)) {
    deferred.reject("Unknown error.");
    return;
}
if (err.message) {
    deferred.reject(err.message);
    return;
}
if (err.errorCode) {
    deferred.reject(err.errorCode);
}
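For reference, here's a minimal, self-contained sketch of that early-return version, assuming deferred is any object exposing a reject method; the rejectWithError name and the E_TIMEOUT value are just for illustration:

function rejectWithError(deferred, err) {
    // Guard clause: no error object at all, or one without usable fields.
    if (!err || (!err.message && !err.errorCode)) {
        deferred.reject("Unknown error.");
        return;
    }
    // Prefer the human-readable message when it exists.
    if (err.message) {
        deferred.reject(err.message);
        return;
    }
    // Otherwise fall back to the error code.
    deferred.reject(err.errorCode);
}

// Usage with a minimal deferred-like stub:
var deferred = { reject: function (msg) { console.log("rejected:", msg); } };
rejectWithError(deferred, { errorCode: "E_TIMEOUT" }); // logs: rejected: E_TIMEOUT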
Are there any advantages to the first pattern over the second?