Is Reading the 'Length' Property of an Array Really That Expensive an Operation in JavaScript

Is reading the `length` property of an array really that expensive an operation in JavaScript?

Well, I would have said it was expensive, but then I wrote a little test at jsperf.com and, to my surprise, using `i < array.length` was actually faster in Chrome, while in Firefox 4 it made no difference.

My suspicion is that the length is stored as an unsigned 32-bit integer (Uint32). From the ECMA-262 specification (5th edition, page 121):

Every Array object has a length property whose value is always a nonnegative integer less than 2^32. The value of the length property is numerically greater than the name of every property whose name is an array index; whenever a property of an Array object is created or changed, other properties are adjusted as necessary to maintain this invariant. Specifically, whenever a property is added whose name is an array index, the length property is changed, if necessary, to be one more than the numeric value of that array index; and whenever the length property is changed, every property whose name is an array index whose value is not smaller than the new length is automatically deleted. This constraint applies only to own properties of an Array object and is unaffected by length or array index properties that may be inherited from its prototypes.

Phew! I don't know if I'll ever get used to such language...
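In plainer terms, the invariant the spec describes can be seen directly in a console; a small illustration (mine, not part of the original answer):

var a = ['a', 'b', 'c'];
console.log(a.length);   // 3

// Adding a property whose name is an array index bumps length to index + 1
a[10] = 'k';
console.log(a.length);   // 11

// Shrinking length deletes every array index >= the new length
a.length = 2;
console.log(a);          // ['a', 'b']
console.log(10 in a);    // false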

Finally, there is always our good old lagging-behind browser: in IE (9, 8, 7), caching the length really is faster. One of many reasons not to use IE, I say.

Does caching array length for a loop condition affect performance?

In the first snippet, the length of the array (or array-like collection) is computed only once and cached, so it is not recalculated on each iteration.

In the second snippet, the length is recalculated on every iteration.

Caching the length will be slightly faster than recalculating it. The difference is small enough to neglect for small arrays, but for a huge array (or an expensive array-like collection) it can be significant.

Which form to use depends entirely on the use case.
If the array length is updated inside the loop, you must use the second form:

for (var i = 0; i < data.length; i++) {
  // Useful when the length of data is altered inside the loop
}
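For completeness, the cached-length form ("the first code" referred to above) would look something like this; a small sketch, assuming the length of data does not change inside the loop:

var len = data.length;
for (var i = 0; i < len; i++) {
  // Safe only when the length of data is not altered inside the loop
}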

Does the .length property get evaluated on every iteration of a for loop?

It does! I like this solution

for (var i = 0, len = myArray.length; i < len; i++) {
  console.log(myArray[i]);
}

It's just a little cleaner than the one you pasted.

Javascript: Is the length method efficient?

All major interpreters provide efficient accessors for the lengths of native arrays, but not for array-like objects like NodeLists.

"Efficient looping in Javascript"

Test / Browser              Firefox 2.0   Opera 9.1   Internet Explorer 6
Native For-Loop             155 ms        121 ms      160 ms
...
Improved Native While-Loop  120 ms        100 ms      110 ms

"Efficient JavaScript code" suggests

for (var i = 0; i < document.getElementsByTagName('tr').length; i++) {
  document.getElementsByTagName('tr')[i].className = 'newclass';
  document.getElementsByTagName('tr')[i].style.color = 'red';
  ...
}

and:

var rows = document.getElementsByTagName('tr');
for (var i = 0; i < rows.length; i++) {
  rows[i].className = 'newclass';
  rows[i].style.color = 'red';
  ...
}

Neither of these is efficient. getElementsByTagName returns a dynamic (live) collection, not a static array. Every time the loop condition is checked, Opera has to reassess the collection and work out how many elements it references in order to compute the length property. This takes a little more time than checking against a static number.
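One way to avoid both problems (my sketch, not code from the article) is to cache the collection and its length before the loop, which is safe here because the number of rows does not change while the loop runs:

var rows = document.getElementsByTagName('tr');
var len = rows.length;   // read the live collection's length only once
for (var i = 0; i < len; i++) {
  rows[i].className = 'newclass';
  rows[i].style.color = 'red';
}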

In JavaScript, does the string/array .length property do any processing?

There are some situations where putting the length into a local variable is faster than accessing the .length property, and it varies by browser. There have been performance discussions about this here on SO and numerous jsperf tests. In a modern browser the differences are not as large as I expected, but they do exist in some cases (I can't seem to find those previous threads).

There are also different types of objects that may have different performance characteristics. For example, a JavaScript array may have different performance characteristics than the array-like object returned by some DOM functions such as getElementsByClassName().

And there are situations where you are appending items to the end of the array and don't want to iterate over the items you just added, so you capture the length before you start.
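For example, something along these lines (an illustrative sketch, not from the original answer):

var queue = ['a', 'b', 'c'];
var len = queue.length;            // snapshot the length before appending
for (var i = 0; i < len; i++) {
  queue.push(queue[i] + '!');      // items pushed here are not visited by this loop
}
console.log(queue);                // ['a', 'b', 'c', 'a!', 'b!', 'c!']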

Is the length of an array cached?

The array length is cached. It is updated each time the array is manipulated.


When you invoke the .push() method on the array, the array length is updated in step 6 of the algorithm:

Call the [[Put]] internal method of O with arguments "length", n, and true.

Source: http://es5.github.com/x15.4.html#x15.4.4.7


When you invoke the .pop() method of an array, the array length is updated in step 5.d of the algorithm:

Call the [[Put]] internal method of O with arguments "length", indx, and true.

Source: http://es5.github.com/x15.4.html#x15.4.4.6


When you assign a value to the array at a given index, the [[DefineOwnProperty]] internal method is invoked. The array length is updated in step 4.e.ii of the algorithm:

Call the default [[DefineOwnProperty]] internal method (8.12.9) on A passing "length", oldLenDesc, and false as arguments. This call will always return true.

Source: http://es5.github.com/x15.4.html#x15.4.5.1
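In other words, length is maintained eagerly as a side effect of these operations, so reading it does not require any counting. A small illustration (mine, not part of the original answer):

var a = [1, 2, 3];
a.push(4);               // [[Put]] sets length to 4
console.log(a.length);   // 4
a.pop();                 // [[Put]] sets length back to 3
console.log(a.length);   // 3
a[9] = 10;               // [[DefineOwnProperty]] raises length to 10
console.log(a.length);   // 10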

Array vs. Object efficiency in JavaScript

The short version: Arrays are mostly faster than objects. But there is no 100% correct solution.

Update 2017 - Test and Results

var a1 = [{id: 29938, name: 'name1'}, {id: 32994, name: 'name1'}];

var a2 = [];
a2[29938] = {id: 29938, name: 'name1'};
a2[32994] = {id: 32994, name: 'name1'};

var o = {};
o['29938'] = {id: 29938, name: 'name1'};
o['32994'] = {id: 32994, name: 'name1'};

for (var f = 0; f < 2000; f++) {
  var newNo = Math.floor(Math.random() * 60000 + 10000);
  if (!o[newNo.toString()]) o[newNo.toString()] = {id: newNo, name: 'test'};
  if (!a2[newNo]) a2[newNo] = {id: newNo, name: 'test'};
  a1.push({id: newNo, name: 'test'});
}

test setup
test results

Original Post - Explanation

There are some misconceptions in your question.

There are no associative arrays in Javascript. Only Arrays and Objects.

These are arrays:

var a1 = [1, 2, 3];
var a2 = ["a", "b", "c"];
var a3 = [];
a3[0] = "a";
a3[1] = "b";
a3[2] = "c";

This is an array, too:

var a3 = [];
a3[29938] = "a";
a3[32994] = "b";

It's basically an array with holes in it, because array indexing is contiguous: the length runs to one past the highest index, and every index below it that was never assigned is a hole. It's slower than an array without holes, and iterating through it manually is even slower (mostly).
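To see why iterating manually is slow, note that an index loop has to walk over every hole; an illustrative sketch (mine, not from the original answer):

var holey = [];
holey[29938] = "a";
holey[32994] = "b";

console.log(holey.length);               // 32995
var count = 0;
for (var i = 0; i < holey.length; i++) {
  if (holey[i] !== undefined) count++;   // almost every iteration hits a hole
}
console.log(count);                      // 2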

This is an object:

var a3 = {};
a3[29938] = "a";
a3[32994] = "b";

Here is a performance test of three possibilities:

Lookup Array vs Holey Array vs Object Performance Test

An excellent read on these topics at Smashing Magazine: Writing Fast, Memory-Efficient JavaScript.

Do loops check the array.length every time when comparing i against array.length?

A loop consisting of three parts is executed as follows:

for (A; B; C)

A - executed once before the enumeration begins
B - the condition tested before each iteration
C - expression evaluated after each iteration (so it is not evaluated once B is false)

So, yes: The .length property of an array is checked at each enumeration if it's constructed as for(var i=0; i<array.length; i++). For micro-optimisation, it's efficient to store the length of an array in a temporary variable (see also: What's the fastest way to loop through an array in JavaScript?).

Equivalent to for (var i=0; i<array.length; i++) { ... }:

var i = 0;
while (i < array.length) {
  ...
  i++;
}
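The cached-length micro-optimisation mentioned above would then be equivalent to something like this (a sketch, assuming the array's length does not change during the loop):

var i = 0;
var len = array.length;   // .length is read only once
while (i < len) {
  ...
  i++;
}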

Javascript Set vs. Array performance

Ok, I have tested adding, iterating and removing elements from both an array and a set. I ran a "small" test, using 10 000 elements and a "big" test, using 100 000 elements. Here are the results.

Adding elements to a collection

It would seem that the .push array method is about 4 times faster than the .add set method, no matter the number of elements being added.

Iterating over and modifying elements in a collection

For this part of the test I used a for loop to iterate over the array and a for...of loop to iterate over the set. Again, iterating over the array was faster, and the gap grows with size: the set took about twice as long during the "small" tests and almost four times as long during the "big" tests.

Removing elements from a collection

Now this is where it gets interesting. I used a combination of a for loop and .splice to remove some elements from the array and I used for of and .delete to remove some elements from the set. For the "small" tests, it was about three times faster to remove items from the set (2.6 ms vs 7.1 ms) but things changed drastically for the "big" test where it took 1955.1 ms to remove items from the array while it only took 83.6 ms to remove them from the set, 23 times faster.

Conclusions

At 10k elements both tests ran in comparable times (array: 16.6 ms, set: 20.7 ms), but when dealing with 100k elements the set was the clear winner (array: 1974.8 ms, set: 83.6 ms), and only because of the removal operation; otherwise the array was faster. I couldn't say exactly why that is.

I played around with some hybrid scenarios where an array was created and populated, then converted into a set so that some elements could be removed, and the set was then converted back into an array. Although doing this gives much better performance than removing elements from the array directly, the extra time needed to transfer to and from a set outweighs the gains of populating an array instead of a set; in the end it is faster to deal only with a set. Still, it is an interesting idea: if one chooses an array as the data collection for some big data that has no duplicates, and there is ever a need to remove many elements in one operation, it can be advantageous performance-wise to convert the array to a set, perform the removal, and convert the set back to an array.
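A sketch of that hybrid trick, using a hypothetical removeMany helper of my own (it assumes the values are unique, since the Set conversion would silently drop duplicates):

function removeMany(arr, shouldRemove) {
  var set = new Set(arr);                    // O(1) deletes from here on
  for (var i = 0; i < arr.length; i++) {
    if (shouldRemove(arr[i])) set.delete(arr[i]);
  }
  return Array.from(set);                    // convert back to an array
}

// Example: strip even numbers from an array of unique numbers
console.log(removeMany([1, 2, 3, 4, 5], function(n) { return n % 2 === 0; }));  // [1, 3, 5]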

Array code:

var timer = function(name) {
  var start = new Date();
  return {
    stop: function() {
      var end = new Date();
      var time = end.getTime() - start.getTime();
      console.log('Timer:', name, 'finished in', time, 'ms');
    }
  };
};
var getRandom = function(min, max) { return Math.random() * (max - min) + min; };
var lastNames = ['SMITH', 'JOHNSON', 'WILLIAMS', 'JONES', 'BROWN', 'DAVIS', 'MILLER', 'WILSON', 'MOORE', 'TAYLOR', 'ANDERSON', 'THOMAS'];
var genLastName = function() { var index = Math.round(getRandom(0, lastNames.length - 1)); return lastNames[index]; };
var sex = ["Male", "Female"];
var genSex = function() { var index = Math.round(getRandom(0, sex.length - 1)); return sex[index]; };
var Person = function() { this.name = genLastName(); this.age = Math.round(getRandom(0, 100)); this.sex = "Male"; };
var genPersons = function() { for (var i = 0; i < 100000; i++) personArray.push(new Person()); };
var changeSex = function() { for (var i = 0; i < personArray.length; i++) { personArray[i].sex = genSex(); } };
var deleteMale = function() {
  for (var i = 0; i < personArray.length; i++) {
    if (personArray[i].sex === "Male") {
      personArray.splice(i, 1);
      i--; // step back so the element shifted into slot i is not skipped
    }
  }
};
var t = timer("Array");
var personArray = [];
genPersons();
changeSex();
deleteMale();
t.stop();
console.log("Done! There are " + personArray.length + " persons.");
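The Set counterpart is not included in this excerpt; reusing the helpers above, it might look roughly like this (my sketch of what the Set test plausibly did, not the original code):

var t2 = timer("Set");
var personSet = new Set();
for (var i = 0; i < 100000; i++) personSet.add(new Person());
personSet.forEach(function(person) { person.sex = genSex(); });
personSet.forEach(function(person) {
  if (person.sex === "Male") personSet.delete(person);   // O(1) removal, safe during forEach
});
t2.stop();
console.log("Done! There are " + personSet.size + " persons.");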

