
So, I've read the MDN disclaimers and warnings, I've read a great answer on the subject, but there's still something I want to know. This question actually came from an answer I gave to another question, here.

Let's say I decide to do the dirty deed. Something that I will regret for the rest of my life. Something that will stain me with shame forever and dishonor my family name. A purposeful, deliberate ending of --

Alright, enough of that. Anyway, here it is:

let proto = Object.getPrototypeOf(Function.prototype);

Object.setPrototypeOf(Function.prototype, {
  iBetterHaveAGoodReasonForDoingThis : "Bacon!"
});

//just to prove it actually worked
let f = (function(){});
console.log(f.iBetterHaveAGoodReasonForDoingThis);

// Quick, hide the evidence!!
Object.setPrototypeOf(Function.prototype, proto);

Basically, what I did there was change the prototype of Function.prototype, an object that impacts pretty much every piece of JavaScript code you could write. Then I changed it back.
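To make concrete what gets displaced: Function.prototype normally inherits directly from Object.prototype, so while the replacement object is installed, every function also loses the Object.prototype methods it would otherwise inherit. A quick check (using `Object.create(null)` as the stand-in object):

```javascript
// Function.prototype normally sits directly below Object.prototype.
console.log(Object.getPrototypeOf(Function.prototype) === Object.prototype); // true

// Splice in an object with no prototype at all:
const plain = Object.getPrototypeOf(Function.prototype);
Object.setPrototypeOf(Function.prototype, Object.create(null));

// hasOwnProperty lives on Object.prototype, not Function.prototype,
// so every function has now lost it.
console.log(typeof (function () {}).hasOwnProperty); // "undefined"

// Restore the chain and the method comes back.
Object.setPrototypeOf(Function.prototype, plain);
console.log(typeof (function () {}).hasOwnProperty); // "function"
```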

I wanted to illustrate a big change in the prototype chain that would impact a lot of code and cause a lot of optimizations to go down the drain. I don't expect changing it back would fix anything (if anything, I expect it would make things worse performance-wise). I'd love to know if it would or wouldn't, but if it does, that wasn't my intention.

I just want to know: after a change like this, will the JavaScript environment begin to recover and start optimizing things again? Or will it just give up forever and run everything in deoptimized mode? Are there optimizations that will never be achieved because of this? Can I trust that, eventually, after a period of recovery, it will return to its regular state?

For context, I'm talking about modern engines like the most recent version of V8, not the primitive stuff used by old versions of Internet Explorer. I understand the answer could differ between engines, but I hope there is some commonality among them.

GregRos
  • This is actually one of the questions that has been on my mind for a long time... I think the V8 team definitely needs to share more of their optimization effort; it's somewhat of a black box (an awesomely fast black box)... – Jonas Wilms Feb 14 '18 at 20:30
  • Hey, bacon *is* always a good reason! – Bergi Feb 14 '18 at 20:43
  • I'm hoping for an expert answer by [jmrk](https://stackoverflow.com/users/6036428/jmrk), but I'm pretty sure that the optimiser never gives up and eventually everything will recover. – Bergi Feb 14 '18 at 20:47

1 Answer


V8 developer here. This question does not have a simple answer.

Most optimizations will "come back" (at the cost of spending additional CPU time, of course). For example, optimized code that had to be thrown away will eventually get recompiled.

Some optimizations will remain disabled forever. For example, V8 skips certain checks when (and as long as) it knows that prototype chains have not been mucked with. If it sees an app modify prototype chains, it plays it safe from then on.

To make things even more complicated, the details can and will change over time. (Which is why there's not much point in listing more specific circumstances here, sorry.)

Background:

There are many places in JavaScript where code might do a certain thing, which the JavaScript engine must check for, but most code doesn't do it. (Take, for example, inheriting missing elements from an array's prototype: ['a', ,'c'][1] almost always returns undefined, except if someone did Array.prototype[1] = 'b' or Object.prototype[1] = 'b'.) So when generating optimized code for a function, the engine has to decide between two options:

(A) Always check for the thing in question (in the example: walk the array's prototype chain and check every prototype to see if it has an element at that index). Let's say executing this code will take 2 time units.

(B) Optimistically assume that array prototypes have no elements, and skip the check (in the example: don't even look at prototypes, just return undefined). Let's say this brings execution time down to 1 time unit (twice as fast, yay!). However, in order to be correct, the engine must now keep a close eye on the prototype chains of all arrays, and if any elements show up anywhere, all code based on this assumption must be found and thrown away, at a cost of 1000 time units.
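The holey-array example above can be checked directly: the hole at index 1 is not an own property of the array, so the lookup falls through to the prototype chain, and mutating `Array.prototype` changes the answer.

```javascript
const arr = ['a', , 'c']; // a "holey" array: index 1 is a hole, not an element

console.log(1 in arr); // false — no own property at index 1
console.log(arr[1]);   // undefined — nothing on the prototype chain either

// After mucking with the prototype chain, the same lookup gives a new answer:
Array.prototype[1] = 'b';
console.log(arr[1]);   // 'b' — inherited through the chain
delete Array.prototype[1]; // clean up
```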

Given this tradeoff, it makes sense that the engine at first follows the fast-but-risky strategy (B), but when that fails even just once, it switches to the safer strategy (A), in order to avoid the risk of having to pay the 1000-time-unit penalty again.

You can argue whether "even just once" is the best threshold, or whether a site should get 2, 3, or even more free passes before giving up on (B), but that doesn't change the fundamental tradeoff.
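Using the illustrative unit costs from above (2 for the checked path, 1 for the optimistic path, 1000 for a deopt), a back-of-the-envelope sketch of the tradeoff:

```javascript
// Total cost of n executions under each strategy, using the toy numbers above.
const costChecked = (n) => 2 * n;                                    // strategy (A)
const costOptimistic = (n, deopted) => 1 * n + (deopted ? 1000 : 0); // strategy (B)

console.log(costChecked(500));           // 1000
console.log(costOptimistic(500, false)); // 500  — (B) is twice as fast
console.log(costOptimistic(500, true));  // 1500 — one deopt erases the win

// After a deopt, (B) only breaks even with (A) once the code runs
// ~1000 more times, which is why paying the penalty even once is
// enough to make the engine prefer the safer strategy.
```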

jmrk
  • Great answer! Thank you. Is there any written material that lists those optimizations, which you don't mention? – GregRos Feb 14 '18 at 23:59
  • I'm only aware of the source code; and as I said, it keeps changing as the team works on it. – jmrk Feb 15 '18 at 00:21
  • I guess array indices are a special case, but doesn't every optimised (named) property access do a hidden class check? (I'll assume that these checks are what needs to be invalidated when the prototype object of a hidden class is mutated) – Bergi Feb 15 '18 at 00:41
  • Also I'm curious about `Object.setPrototypeOf` specifically. What assumptions does it actually break that lead to code being thrown away? Sure, when I call it on a prototype object then that counts as a mutation, but when I call it upon a fresh instance shouldn't it just change the hidden class of that particular object? – Bergi Feb 15 '18 at 00:47
  • @Bergi I think it does no harm to change the prototype of a newly created object, since it hasn't been used in any code so no code has been optimized for it. I guess at some point the object is marked "optimized", and when it is, then changing the prototype would do nasty stuff. – GregRos Feb 15 '18 at 14:14
  • Yes, all property accesses do hidden class checks, and since changing prototypes changes hidden classes, those checks will fail afterwards. There are additional mechanisms, the "has anyone ever modified the `Array` prototype chain" bit I mentioned in my answer is one of them. -- Yes, changing the prototype of a recently created object has less impact than doing it later. Objects don't get optimized, code does; optimized code can have baked-in assumptions/dependencies about various objects' shape. – jmrk Feb 15 '18 at 17:49