Howdy,
I currently develop the JavaScript toolkit qooxdoo
(http://qooxdoo.sourceforge.net); some of you may have heard of it already.
We have discovered a slowdown in Internet Explorer's performance when
creating objects with some data and storing them in a global object
registry. It took some time to extract this example from our codebase.
The attached file (please take a look at it) shows the exact problem.
The time for each object increases per loop. In my tests I get the
following values from the alert at the end:
Results: 30, 41, 60, 80, 100, 120, 150, 160, 190, 211
As you can see, the time for the last run is seven times higher than the
time the first loop takes. In other browsers like Firefox there are some
outliers, but there is no noticeable increase in the measured values.
This does not seem to be anything like the often-discussed memory leaks.
It happens while loading a single document and executing the scripts
inside it, and it is the same after each reload.
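Since attachments may not come through for everyone, here is a rough, simplified sketch of the kind of test we mean (hypothetical reconstruction; the constructor and registry names are made up, and the real attached file from our codebase differs in its details):
<script type="text/javascript">
// Hypothetical reconstruction, not the actual attached file:
// a simple "class" whose instances carry some data ...
function Widget(id) {
  this.id = id;
  this.label = "widget-" + id;
  this.data = { a: 1, b: 2, c: 3 };
}

// ... and a global registry into which every new instance is pushed.
var ObjectRegistry = [];

var measuredTimes = [];
for (var i = 0; i < 10; i++) {
  var startTime = (new Date).valueOf();
  for (var j = 0; j < 2500; j++) {
    ObjectRegistry.push(new Widget(i * 2500 + j));
  }
  measuredTimes.push((new Date).valueOf() - startTime);
}
alert("Results: " + measuredTimes);
</script>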
Has anyone out there run into the same problem, and possibly found a
solution for this misbehaviour of Internet Explorer? Thanks in advance.
Regards,
Sebastian Werner
> We have discovered a slowdown in Internet Explorer's performance when creating objects with some data and storing them in a global object registry.
You were brought here by destiny: I have been looking for a reliable
hash mechanics experiment for a long time! :-)
You have positioned your problem wrongly, though. It has nothing to do
with storing objects in objects. It is connected with two issues:
1) are you assigning a new value, or overwriting an existing value;
2) are you using array mechanics or hash mechanics.
---------------------
Test 1-A
Random value assignment/overwriting.
Hash mechanics is used.
<html>
<head><title>Test 1-A</title></head>
<body>
<script type="text/javascript">
var objectInstance = {};
var startTime = null;
var measuredTimes = [];
var tmp = "";
for (var i = 0; i < 10; i++) {
  startTime = (new Date).valueOf();
  for (var j = 0; j < 2500; j++) {
    // random string keys, so most assignments create a new property
    tmp = "h" + Math.floor(100000 * Math.random());
    objectInstance[tmp] = "";
  }
  measuredTimes.push((new Date).valueOf() - startTime);
}
alert("Results: " + measuredTimes);
</script>
</body>
</html>
Results (average of 3 runs):
Firefox: 31, 78, 16, 62, 47, 31, 47, 31, 47, 63
Explorer: 16, 31, 47, 78, 94, 125, 140, 157, 171
Opera(!): 16, 47, 94, 125, 422, 640, 1203, 1328, 1657, 1390
---------------------
Test 1-B
Random value assignment/overwriting.
Array mechanics is used.
<html>
<head><title>Test 1-B</title></head>
<body>
<script type="text/javascript">
var arrayInstance = [];
var startTime = null;
var measuredTimes = [];
var tmp = "";
for (var i = 0; i < 10; i++) {
  startTime = (new Date).valueOf();
  for (var j = 0; j < 2500; j++) {
    // random numeric indices; the argument to Math.random() is ignored
    tmp = Math.floor(100000 * Math.random(j));
    arrayInstance[tmp] = "";
  }
  measuredTimes.push((new Date).valueOf() - startTime);
}
alert("Results: " + measuredTimes);
</script>
</body>
</html>
Results (average of 3 runs):
Firefox: 15, 16, 0, 16, 93, 16, 16, 31, 31, 16
Explorer (stabilized!): 16, 16, 15, 32, 31, 31, 31, 47, 47, 47, 47
Opera(!): 47, 141, 312, 484, 813, 828, 906, 985, 1031, 1078
---------------------
Test 2-A
Consecutive assignment
Hash mechanics is used.
<html>
<head><title>Test 2-A</title></head>
<body>
<script type="text/javascript">
var objectInstance = new Object();
var startTime = null;
var measuredTimes = [];
var tmp = "";
for (var i = 0; i < 10; i++) {
  startTime = (new Date).valueOf();
  for (var j = 0; j < 2500; j++) {
    // "h" + j + 100000 concatenates, producing keys like "h0100000", "h1100000", ...
    objectInstance["h" + j + 100000] = "";
  }
  measuredTimes.push((new Date).valueOf() - startTime);
}
alert("Results: " + measuredTimes);
</script>
</body>
</html>
Results (average of 3 runs):
Firefox: 15, 32, 31, 31, 31, 16, 47, 15, 47, 63
Explorer: 31, 15, 16, 16, 31, 16, 15, 16, 15, 32
Opera (finally!): 47, 15, 32, 15, 31, 16, 16, 31, 16, 15
---------------------
Test 2-B
Consecutive assignment
Array mechanics is used.
<html>
<head><title>Test 2-B</title></head>
<body>
<script type="text/javascript">
var arrayInstance = new Array();
var startTime = null;
var measuredTimes = [];
var tmp = "";
for (var i = 0; i < 10; i++) {
  startTime = (new Date).valueOf();
  for (var j = 0; j < 2500; j++) {
    // consecutive numeric indices starting at 100000
    arrayInstance[j + 100000] = "";
  }
  measuredTimes.push((new Date).valueOf() - startTime);
}
alert("Results: " + measuredTimes);
</script>
</body>
</html>
Results (average of 3 runs):
Firefox: 15, 0, 0, 16, 0, 0, 0, 15, 0, 0
Explorer: 0, 0, 15, 0, 0, 16, 0, 0, 0, 15
Opera: 16, 0, 0, 0, 15, 0, 0, 0, 0, 16
So:
1) Firefox is cool.
2) Array is the tool of choice for data manipulation
(relocation/reassignment/deletion etc.). This is what it is made for, at
the price of sacrificing lookup operations a bit (reading from an array
is normally a bit slower than reading a key's value from a hash).
3) Hash is great for lookup. All its mechanics are built to get the
key's value as quickly as possible. Update operations are the victims
of that.
>> We have discovered a slowdown in Internet Explorer's performance when creating objects with some data and storing them in a global object registry.
> You were brought here by destiny: I have been looking for a reliable hash mechanics experiment for a long time! :-)
> You have positioned your problem wrongly, though. It has nothing to do with storing objects in objects. It is connected with two issues: 1) are you assigning a new value, or overwriting an existing value; 2) are you using array mechanics or hash mechanics.
Hi VK,
Thanks a lot for your quick answer. Unfortunately, you did not solve our
problem, as it *really* seems to be a problem with storing objects inside
arrays (or other objects). You seem to have overlooked our global registry
for objects (see the push into an array in our code). This is the
problem: the combination of objects carrying a lot of data and storing
them afterwards in a global array.
We do not understand why storing simple object references in another
(complex) object (array, hash) is a problem for Internet Explorer.
Maybe our example oversimplified the problem. Our objects are meant to
be instances of JavaScript "classes" (that is, constructors). Therefore,
they have to be created with the "new" keyword.
Any ideas?
Sebastian
> Thanks a lot for your quick answer. Unfortunately, you did not solve our problem, as it *really* seems to be a problem with storing objects inside arrays (or other objects). You seem to have overlooked our global registry for objects (see the push into an array in our code). This is the problem: the combination of objects carrying a lot of data and storing them afterwards in a global array.
I did not overlook your problem; I just reduced it to the real "minimum
code" level. As you can see, I removed your ObjectRegistry, and it did
not affect the pattern or the proportions in any way. So the ObjectRegistry
(and putting something into it) is not the problem.
The problem is that a hash table (Dictionary, Map, Enumerator,
associative array) is a bad structure to fill with data sets in a stream,
or for frequent data updates/deletions. As Object is based on hash
mechanics, it is a bad structure for the same set of operations too. But
nobody can beat a hash for lookup (retrieving a key's value, i.e. a
property value or method code). This is why it is the structure of choice
for objects.
And my test cases (if you look at them) show how you may work around your
problem: use an Array to keep your object references.
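A minimal sketch of that suggestion (illustrative only; the variable names here are made up, not taken from your code):
// Illustrative only: two ways to keep a reference to the same object.
var someObject = { name: "example" };

// Hash-style registry: string keys on a plain Object.
var hashRegistry = {};
hashRegistry["obj42"] = someObject;

// Array-style registry: consecutive numeric indices, as suggested above.
var arrayRegistry = [];
var handle = arrayRegistry.length;   // remember the slot so you can find it again
arrayRegistry[handle] = someObject;  // equivalent to arrayRegistry.push(someObject)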
And nothing will help for Opera, which simply sucks at this.
And Firefox is speedy at filling both a hash table and an Array (though
still always a bit slower on the hash table, as you can see). That could
be explained by a good code structure, a small program footprint and,
sorry to say, the semi-primitivity of Firefox (who knows how many
additional checks and updates Explorer is forced to do to pay its price
for being an "OS browser").
> We do not understand why storing simple object references in another (complex) object (array, hash) is a problem for Internet Explorer.
Not *storing* - *putting into* (assigning), that's the key.
> Maybe our example oversimplified the problem. Our objects are meant to be instances of JavaScript "classes" (that is, constructors). Therefore, they have to be created with the "new" keyword.
It doesn't matter what you are putting in: primitives, objects or flower
boxes. The only important issues are:
1) are you putting it consecutively into empty places, or randomly with
previous values being overwritten;
2) are you using an Array or a Dictionary (hash).
I have tried out VK's examples whilst retaining the object registry.
I am getting similar results to VK.
The problem appears to lie, as VK suggests, not with the ObjectRegistry
but with how efficiently IE handles the creation of "alphanumeric
keys".
If you drop the ObjectRegistry and just create a single instance
with 25,000 keys instead of 2,500 (i.e. x10), you get similarly slow
times (i.e. 797, which equates to adding up all of your values).
var objectInstance, startTime, measuredTimes = [];
startTime = (new Date).valueOf();
objectInstance = new Object;
for (var j = 0; j < 25000; j++) {
  objectInstance["h" + Math.random()] = "";
}
measuredTimes.push((new Date).valueOf() - startTime);
alert("Results: " + measuredTimes);
VK wrote:
<snip>
> You were brought here by destiny: I have been looking for a reliable hash mechanics experiment for a long time! :-)
> You have positioned your problem wrongly, though. It has nothing to do with storing objects in objects. It is connected with two issues: 1) are you assigning a new value, or overwriting an existing value; 2) are you using array mechanics or hash mechanics.
> --------------------- Test 1-A Random value assignment/overwriting. Hash mechanics is used.
> <html> <head><title>Test 1-A</title></head> <body> <script type="text/javascript"> var objectInstance = {}; var startTime = null; var measuredTimes=[]; var tmp = "";
> for (var i=0; i<10; i++) { startTime = (new Date).valueOf(); for (var j=0; j<2500; j++) { tmp = "h" + Math.floor(100000*Math.random()); objectInstance[tmp] = ""; }; measuredTimes.push((new Date).valueOf()-startTime);
Included in the operations being timed here are:-
1. Assignment to a property of the Variable/Activation
object of a string value. (* 2500)
2. String concatenation. (* 2500)
3. A call to the Math.floor method (* 2500)
4. A call to the Math.random method (* 2500)
5. Multiplication (* 2500)
6. Number to string type-conversion. (* 2500)
7. The initiation speed of a Date object (* 1)
8. A call to the valueOf method of that date object (* 1)
9. Subtraction. (* 1)
- along with the assignment of a string value to a named property of an
object.
If a comparison is to be made between the performance of one action in
various environments it becomes important to take into account (or
compensate for) any differences in performance that may be exhibited by
all of the operations that are not intended to be the subject of the
experiment.
For example, string concatenation is known to be an area where IE
performs particularly badly, and Mozilla/Gecko particularly well. Opera
is known to take an age to instantiate Date objects, and so on.
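One way to compensate, as a rough sketch (illustrative code only, not from this thread): precompute the keys outside the timed region, then time a baseline loop that contains everything except the assignment under test and subtract it:
<script type="text/javascript">
// Precompute the keys so that string concatenation, Math.random and
// Math.floor are kept out of the timed region.
var keys = [];
for (var j = 0; j < 2500; j++) {
  keys[j] = "h" + Math.floor(100000 * Math.random());
}

var objectInstance = {};
var k;

// Baseline: loop overhead and reading keys[j], without the assignment under test.
var t0 = (new Date).valueOf();
for (var j = 0; j < 2500; j++) {
  k = keys[j];
}
var baseline = (new Date).valueOf() - t0;

// Measured: the same work plus the property assignment under test.
var t1 = (new Date).valueOf();
for (var j = 0; j < 2500; j++) {
  objectInstance[keys[j]] = "";
}
var measured = (new Date).valueOf() - t1;

alert("Approximate assignment cost: " + (measured - baseline) + " ms");
</script>
(With only 2500 iterations the difference may still drown in the clock resolution discussed below, so in practice the loop counts would need to be much larger.)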
> } alert("Results: " + measuredTimes); </script> </body> </html>
> Results (average of 3 runs): Firefox: 31, 78, 16, 62, 47, 31, 47, 31, 47, 63 Explorer: 16, 31, 47, 78, 94, 125, 140, 157, 171 Opera(!): 16, 47, 94, 125, 422, 640, 1203, 1328, 1657, 1390
The Date object is known to have a limited precision, suggested to be
never better than 10 milliseconds, and the OS will only 'tick' time at a
particular rate, suggested to commonly be in the range of 56 to 10
milliseconds. The practical upshot of this is that when results of
timings are deduced from total periods of less than a couple of hundred
milliseconds the resulting 'data' may be entirely attributed to the
limitations of the measuring system.
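A sketch of how the effective tick of that clock might be estimated in a given browser before trusting small numbers (illustrative only):
<script type="text/javascript">
// Busy-wait until the value returned by the Date clock changes, several
// times over, and report the smallest step seen: a rough estimate of the
// resolution available to the timings quoted in this thread.
function estimateTick(samples) {
  var smallest = Infinity;
  for (var i = 0; i < samples; i++) {
    var start = (new Date).valueOf();
    var now = start;
    while (now == start) {
      now = (new Date).valueOf();
    }
    if (now - start < smallest) {
      smallest = now - start;
    }
  }
  return smallest;
}

alert("Estimated clock tick: " + estimateTick(10) + " ms");
</script>
Any individual measurement that is not several times larger than that tick tells you very little.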
<snip>
> for (var i=0; i<10; i++) { startTime = (new Date).valueOf(); for (var j=0; j<2500; j++) { tmp = Math.floor(100000*Math.random(j)); arrayInstance[tmp] = ""; }; measuredTimes.push((new Date).valueOf()-startTime); }
Included in the operations being timed here are:-
1. Assignment to a property of the Variable/Activation
object of a numeric value. (* 2500)
2. A call to the Math.floor method (* 2500)
3. A call to the Math.random method (* 2500)
4. Multiplication (* 2500)
5. Number to string type-conversion. (* 2500, but not implicit in the
property accessor)
6. The initiation speed of a Date object (* 1)
7. A call to the valueOf method of that date object (* 1)
8. Subtraction. (* 1)
- fewer operations and some slightly different operations. So deductions
about the proposed subject of the 'test' derived from results that
differ from the previous tests do not necessarily apply to that subject
at all.
<snip>
> for (var i=0; i<10; i++) { startTime = (new Date).valueOf(); for (var j=0; j<2500; j++) { objectInstance["h"+j+100000] = ""; }; measuredTimes.push((new Date).valueOf()-startTime);
Additional operations being timed here are:-
1. String concatenation (* 5000)
2. The initiation speed of a Date object (* 1)
3. A call to the valueOf method of that date object (* 1)
4. Subtraction. (* 1)
- fewer again.
<snip>
> for (var i=0; i<10; i++) { startTime = (new Date).valueOf(); for (var j=0; j<2500; j++) { arrayInstance[j+100000] = ""; }; measuredTimes.push((new Date).valueOf()-startTime); }
And here:-
1. String concatenation (* 2500)
2. The initiation speed of a Date object (* 1)
3. A call to the valueOf method of that date object (* 1)
4. Subtraction. (* 1)
- with no less than half the string concatenation of the previous
'test'.
alert("Results: " + measuredTimes); </script> </body> </html>
Results (average of 3 runs):
It seems unlikely that an average of three runs would produce uniform
integer results from three integer values, except as a result of
imprecision in the measurement mechanism that exceeded the values being
measured.
> Firefox: 15, 0, 0, 16, 0, 0, 0, 15, 0, 0 Explorer: 0, 0, 15, 0, 0, 16, 0, 0, 0, 15 Opera: 16, 0, 0, 0, 15, 0, 0, 0, 0, 16
It should also be obvious that when 50% of results assert that 2500
loops of operations apparently take zero time (which is impossible) then
the precision of the measurements must be less than the values being
measured.
But longer loops would not help much when comparisons are to be made
between sets of operations that differ by more than just the supposed
subject of the test.
> So: 1) Firefox is cool.
But why? Is it cool because it does string concatenation considerably
faster than its competitors, runs Math.random faster, or because it
implements its objects as some sort of HashMap instead of some sort of
List (or maybe it has a lot of implementation optimisations that the
test code can take advantage of)?
> 2) Array is the tool of choice for data manipulation (relocation/reassignment/deletion etc.). This is what it is made for, at the price of sacrificing lookup operations a bit (reading from an array is normally a bit slower than reading a key's value from a hash).
Even with a 'testing' strategy that tells you nothing about its subject,
you cannot make deductions from the above about the performance of
look-ups on Objects/Arrays, as those operations do not explicitly appear
in the 'tests' at all.
Deductions about the relative behaviour of Arrays versus Objects should
not be based on tests that are not equivalent in what is being measured.
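For comparison, a look-up test would have to time the reads themselves, something like this sketch (illustrative only; the loop counts would need to be large enough to exceed the clock tick discussed above, and the extra read of keys[j] in the first loop is itself an Array look-up, so even this comparison is only approximate):
<script type="text/javascript">
// Populate an Object and an Array with the same entries, with the
// string keys built outside the timed region.
var objectInstance = {};
var arrayInstance = [];
var keys = [];
for (var j = 0; j < 2500; j++) {
  keys[j] = "h" + j;
  objectInstance[keys[j]] = j;
  arrayInstance[j] = j;
}

var sum = 0;

// Time repeated reads from the Object ...
var t0 = (new Date).valueOf();
for (var i = 0; i < 100; i++) {
  for (var j = 0; j < 2500; j++) {
    sum += objectInstance[keys[j]];
  }
}
var objectReads = (new Date).valueOf() - t0;

// ... and from the Array.
var t1 = (new Date).valueOf();
for (var i = 0; i < 100; i++) {
  for (var j = 0; j < 2500; j++) {
    sum += arrayInstance[j];
  }
}
var arrayReads = (new Date).valueOf() - t1;

alert("Object reads: " + objectReads + " ms; Array reads: " + arrayReads + " ms");
</script>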
> 3) Hash is great for lookup.
As per previous comments on lookups.
> All its mechanics are built to get the key's value as quickly as possible. Update operations are the victims of that.
Lies, damned lies and statistics. You have started with flawed
preconceptions and created 'tests' that will tell you nothing useful.
Consider recent discussions of the implementation of native ECMAScript
objects. The implementation details are left to the implementers and
they may make any decisions they see fit so long as the resulting
behaviour corresponds with the language specification. The language
specification does not talk of hashing at all, so any object
implementation may or may not be built upon a HashMap style of object.
If it is, read/write performance in ECMAScript objects will have
characteristics that resemble HashMap characteristics (being to some
extent uniform regardless of the number of properties defined), and if an
object is implemented as a list the read/write characteristics of that
object's properties will depend on the number of properties and the
location of the property in question in that list.
Well devised tests might say something about the underlying
implementation, and even compare it between environments and with those
objects that have been augmented into being Arrays (and so expose
possible internal optimisations related to 'array index' property
names), but instead you have expended time learning nothing.
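As a rough sketch of what such a probe might look like (illustrative code only): time a fixed number of reads of one property against objects holding increasingly many properties, and see whether the cost grows with the property count; hash-like behaviour would stay roughly flat, list-like behaviour would not:
<script type="text/javascript">
// Probe: does reading one property get slower as the object grows?
function timeReads(propertyCount, reads) {
  var obj = {};
  for (var j = 0; j < propertyCount; j++) {
    obj["p" + j] = j;
  }
  var key = "p" + (propertyCount - 1);   // a property defined last
  var sum = 0;
  var start = (new Date).valueOf();
  for (var i = 0; i < reads; i++) {
    sum += obj[key];
  }
  return (new Date).valueOf() - start;
}

var sizes = [100, 1000, 10000, 50000];
var results = [];
for (var s = 0; s < sizes.length; s++) {
  results.push(sizes[s] + " properties: " + timeReads(sizes[s], 200000) + " ms");
}
alert(results.join("\n"));
</script>
The same probe could be run against an Array with 'array index' property names to look for the internal optimisations mentioned above.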
Richard.
Sebastian Werner wrote:
<snip> .... The attached file ...
<snip> As you can see ...
<snip>
comp.lang.javascript is a plain text only newsgroup. Many news servers
appreciate that and do not accept, store or propagate attachments. Thus,
whatever point this attachment was intended to make probably remains
unknown to a significant proportion of the readers of this group, so
informed responses are less likely. See the group's FAQ:-
<URL: http://www.jibbering.com/faq >
Richard.
Richard Cornford wrote:
<snip>
I am well aware of the "testing tool problem": in order to conduct the
test you need a testing tool, but the testing tool itself affects the
test results. I'm thinking more and more that reliable JavaScript
performance tests cannot be done from within JavaScript itself. One
needs some OS helper/wrapper - some C++ monitor and
dispatchEvent/fireEvent notifications from the browser (?). I need to
think about it.
VK wrote:
<snip> I am well aware of the "testing tool problem": in order to conduct the test you need a testing tool, but the testing tool itself affects the test results. I'm thinking more and more that reliable JavaScript performance tests cannot be done from within JavaScript itself. One needs some OS helper/wrapper - some C++ monitor and dispatchEvent/fireEvent notifications from the browser (?). I need to think about it.
You really couldn't reason your way out of a damp paper bag, could you?
Richard.