I understand that, given an array, you can sort it with a custom compare function, something like the following in JavaScript:
var arr = [5, 4, 3, 6, 7, 2];
arr.sort(function (a, b) {
  if (a < b)
    return -1;
  else if (a > b)
    return 1;
  else
    return 0;
});
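(For what it's worth, I know that for numbers the same three-way comparator is commonly written as a subtraction, since `a - b` is negative, zero, or positive exactly when `a < b`, `a === b`, or `a > b`:)

```javascript
// Common shorthand for numeric sorts: the sign of a - b encodes the
// same three-way result as the if/else comparator above.
var nums = [5, 4, 3, 6, 7, 2];
nums.sort(function (a, b) {
  return a - b;
});
console.log(nums); // [2, 3, 4, 5, 6, 7]
```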
My friend says that I don't need to return 0 to sort the list in this scenario, and furthermore that the comparator can return values from [true, false] instead of [-1, 0, 1]. Is that true?
I tried to find a counterexample to his claim but could not; I can't come up with a case where the array isn't sorted properly by his code.
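Roughly how I went looking for one (a quick brute-force sketch I wrote for this question, not anything definitive; the function names are my own): sort random arrays with both comparators and report the first input where the results differ.

```javascript
// Sort a copy using the boolean comparator from my friend's example.
function sortWithBoolean(arr) {
  return arr.slice().sort(function (a, b) {
    return a > b;
  });
}

// Sort a copy using a standard numeric three-way comparator.
function sortWithNumbers(arr) {
  return arr.slice().sort(function (a, b) {
    return a - b;
  });
}

// Try `trials` random integer arrays of length `size`; return the first
// input where the two comparators disagree, or null if none was found.
function findCounterexample(trials, size) {
  for (var i = 0; i < trials; i++) {
    var arr = [];
    for (var j = 0; j < size; j++) {
      arr.push(Math.floor(Math.random() * 100));
    }
    if (sortWithBoolean(arr).join() !== sortWithNumbers(arr).join()) {
      return arr; // mismatch: the boolean comparator mis-sorted this input
    }
  }
  return null; // no counterexample found in these trials
}

console.log(findCounterexample(1000, 12));
```

In my runs this kept printing null, which is exactly why I'm asking whether the boolean version is actually guaranteed to work or just happens to in my engine.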
Here's the example that my friend gave:
var arr = [5, 4, 3, 6, 7, 2];
arr.sort(function (a, b) {
  return a > b;
});
Is it good practice to return values from [-1, 0, 1]? Why is that necessary when the integers are directly comparable? I've noticed this convention across many programming languages, not just JavaScript, like this example in C.