Why Is Mutating the [[Prototype]] of an Object Bad for Performance

Why is mutating the [[prototype]] of an object bad for performance?

// This is bad:
// foo.__proto__.bar = bar;

// But this is okay:
Foo.prototype.bar = bar;

No. Both are doing the same thing (as foo.__proto__ === Foo.prototype), and both are fine. They're just creating a bar property on the Object.getPrototypeOf(foo) object.
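Both lines add bar to the very same object, which is easy to verify:

function Foo() {}
var foo = new Foo();

// Both expressions name the same object:
console.log(foo.__proto__ === Foo.prototype);              // true
console.log(Object.getPrototypeOf(foo) === Foo.prototype); // true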

What the statement refers to is assigning to the __proto__ property itself:

function Employee() {}
var fred = new Employee();

// Assign a new object to __proto__
fred.__proto__ = Object.prototype;
// Or equally:
Object.setPrototypeOf(fred, Object.prototype);

The warning at the Object.prototype page goes into more detail:

Mutating the [[Prototype]] of an object is, by the nature of how modern JavaScript engines optimize property accesses, a very slow operation

They simply state that changing the prototype chain of an already existing object kills optimisations. Instead, you're supposed to create a new object with a different prototype chain via Object.create().
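For example (a minimal sketch, with made-up proto and greet names):

var proto = { greet: function() { return 'hi'; } };

// Slow path: create the object first, then mutate its chain
var a = {};
Object.setPrototypeOf(a, proto);

// Preferred: create the object with the desired chain up front
var b = Object.create(proto);
console.log(b.greet()); // 'hi'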

I couldn't find an explicit reference, but if we consider how V8's hidden classes are implemented (and the more recent write-up), we can see what might be going on here. When the prototype chain of an object is changed, its internal type changes: it does not simply become a subclass (as it would when adding a property) but is completely swapped out. That means all property-lookup optimisations are flushed and precompiled code has to be discarded, or the engine simply falls back to non-optimised code.

Some notable quotes:

  • Brendan Eich (you know him) said:

    Writable __proto__ is a giant pain to implement (must serialize to cycle-check) and it creates all sorts of type-confusion hazards.

  • Brian Hackett (Mozilla) said:

    Allowing scripts to mutate the prototype of pretty much any object makes it harder to reason about the behavior of a script and makes VM, JIT, and analysis implementation more complex and buggier. Type inference has had several bugs due to mutable __proto__ and cannot maintain several desirable invariants because of this feature (i.e. 'type sets contain all the possible type objects which can be realized for a var/property' and 'JSFunctions have types which are also functions').

  • Jeff Walden said:

    Prototype mutation after creation, with its erratic performance destabilization, and the impact upon proxies and [[SetInheritance]]

  • Erik Corry (Google) said:

    I don't expect big performance gains from making __proto__ non-overwritable. In non-optimized code you have to check the prototype chain in case the prototype objects (not their identity) have been changed. In the case of optimized code you can fall back to non-optimized code if someone writes to __proto__. So it wouldn't make all that much difference, at least in V8-Crankshaft.

  • Eric Faust (Mozilla) said:

    When you set __proto__, not only are you ruining any chances you may have had for future optimizations from Ion on that object, but you also force the engine to go crawling around to all the other pieces of type inference (information about function return values, or property values, perhaps) which think they know about this object and tell them not to make many assumptions either, which involves further deoptimization and perhaps invalidation of existing jitcode.

    Changing the prototype of an object in the middle of execution is really a nasty sledgehammer, and the only way we have to keep from being wrong is to play it safe, but safe is slow.

Why is manipulating __proto__ slow in JavaScript?

I was wondering why mutating an instance's __proto__ is such a slow operation.

The people who implemented the JavaScript language in your browser made a trade-off: they wanted to support this "esoteric" feature, but made the rest of the language faster by making this particular manipulation slower.

You should only worry about the speed of __proto__ after you've written your program. For many use cases, the extra "slowness" will only amount to a few milliseconds over the whole program, and nobody will care.

Warning about mutating the [[Prototype]] of an object in d3.js?

Does anyone have any idea why this is happening?

Looks like it's d3's fault. They seem to use it to subclass arrays here:

// Until ECMAScript supports array subclassing, prototype injection works well.
var d3_subclass = function(object, prototype) {
  object.__proto__ = prototype;
};

How can I solve this?

Ignore the warning, or have a look at this issue of d3.js.
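For what it's worth, ECMAScript now does support array subclassing directly via class ... extends Array, which makes prototype injection like d3's unnecessary; a minimal sketch (not d3's actual code):

class Selection extends Array {}

var s = new Selection();
s.push('node');
console.log(s instanceof Selection); // true
console.log(s instanceof Array);     // true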

What's the performance impact of setPrototypeOf on a new object?

If you fear (as apparently you should) the performance impact of using Object.setPrototypeOf(), but want to keep your object-creation syntax similar to how your code is structured, try this:

var MyPrototype = {
  method1: function() { /* ... */ },
  method2: function() { /* ... */ },
  // ...
};

var newObject = Object.assign(Object.create(MyPrototype), {
  property: 1,
  property2: 'text'
});
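This way the object gets its prototype at creation time, so nothing has to be mutated afterwards. A quick check, using the names above:

console.log(Object.getPrototypeOf(newObject) === MyPrototype); // true
console.log(newObject.property); // 1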

Will a JavaScript environment eventually recover after changing the [[Prototype]] of an object?

V8 developer here. This question does not have a simple answer.

Most optimizations will "come back" (at the cost of spending additional CPU time, of course). For example, optimized code that had to be thrown away will eventually get recompiled.

Some optimizations will remain disabled forever. For example, V8 skips certain checks when (and as long as) it knows that prototype chains have not been mucked with. If it sees an app modify prototype chains, it plays it safe from then on.

To make things even more complicated, the details can and will change over time. (Which is why there's not much point in listing more specific circumstances here, sorry.)

Background:

There are many places in JavaScript where code might do a certain thing, which the JavaScript engine must check for, but most code doesn't do it. (Take, for example, inheriting missing elements from an array's prototype: ['a', ,'c'][1] almost always returns undefined, except if someone did Array.prototype[1] = 'b' or Object.prototype[1] = 'b'; the snippet after this list makes it concrete.) So when generating optimized code for a function, the engine has to decide between two options:

(A) Always check for the thing in question (in the example: walk the array's prototype chain and check every prototype to see if it has an element at that index). Let's say executing this code will take 2 time units.

(B) Optimistically assume that array prototypes have no elements, and skip the check (in the example: don't even look at prototypes, just return undefined). Let's say this brings execution time down to 1 time unit (twice as fast, yay!). However, in order to be correct, the engine must now keep a close eye on the prototype chains of all arrays, and if any elements show up anywhere, all code based on this assumption must be found and thrown away, at a cost of 1000 time units.
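To make the example concrete:

console.log(['a', , 'c'][1]); // undefined: the hole is looked up on the prototypes

Array.prototype[1] = 'b';     // someone does "the thing" the engine bet against
console.log(['a', , 'c'][1]); // 'b': the hole now inherits from Array.prototype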

Given this tradeoff, it makes sense that the engine at first follows the fast-but-risky strategy (B), but when that fails even just once, it switches to the safer strategy (A), in order to avoid the risk of having to pay the 1000-time-unit penalty again.

You can argue whether "even just once" is the best threshold, or whether a site should get 2, 3, or even more free passes before giving up on (B), but that doesn't change the fundamental tradeoff.

What's the point of creating a prototype chain when we already have Object.prototype.use(object) in JS?

This is because your constructors are adding all the properties to this; you're not using the prototypes.

Normally, methods are added to the prototype, not each instance, e.g.

function Person(firstName, lastName, age) {
  this.firstName = firstName;
  this.lastName = lastName;
  this.age = age;
}

Person.prototype.getFullName = function() {
  return this.firstName + ' ' + this.lastName;
};

If you don't create the prototype chain, Student won't inherit a method defined this way.
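A minimal sketch of that chain, with a hypothetical Student constructor in the spirit of the question:

function Student(firstName, lastName, age, school) {
  Person.call(this, firstName, lastName, age); // set up the own properties
  this.school = school;
}

// Link the prototypes so Student instances find Person.prototype methods
Student.prototype = Object.create(Person.prototype);
Student.prototype.constructor = Student;

var s = new Student('Ada', 'Lovelace', 36, 'Some School');
console.log(s.getFullName()); // 'Ada Lovelace'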

Why is Object.setPrototypeOf discouraged/inefficient when constructing Javascript class hierarchies?

You're creating two instances, and then setting one to be the prototype of the other:

var myAnimal = makeAnimal(), myDog = makeDog();
Object.setPrototypeOf(myDog, myAnimal);

So, instead of the easy inheritance we basically want:

myDog -> Dog.prototype -> Animal.prototype
myDog2 ->
myDog3 ->

Animal  -> Animal.prototype
Animal2 ->

You're doing this:

myDog -> myAnimal
myDog1 -> myAnimal1
myDog2 -> myAnimal2
Animal
Animal2

So instead of two prototypes holding all the functions and lightweight instances just holding the data, you have 2n instances (one animal for each dog), each holding bound function references as well as the data.
That's really not efficient when constructing many elements, and assigning functions in a factory isn't either, so you may want to stick with class inheritance, as it solves both problems. Or, if you want to use setPrototypeOf, use it once (then its slowness has no big effect):

var Animalproto = {
  birthday() { /* ... */ }
};

var Dogproto = {
  bark() { /* ... */ }
};

// One prototype mutation at setup time, not one per instance
Object.setPrototypeOf(Dogproto, Animalproto);

function makeAnimal() {
  return Object.create(Animalproto);
}

function makeDog() {
  return Object.create(Dogproto);
}
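Usage, for illustration: the instances stay lightweight, while the two prototype objects hold all the functions.

var d = makeDog();

console.log(Object.getPrototypeOf(d) === Dogproto);           // true
console.log(Object.getPrototypeOf(Dogproto) === Animalproto); // true
console.log(typeof d.bark, typeof d.birthday);                // 'function' 'function'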

