I have a stumper; I'm not sure whether it belongs here or in
mozilla.dev.tech.js-engine:
<script type="text/javascript">
var x = '123' + '\0' + '456';  // embedded NUL in the middle of the string
var y = '123' + '\0' + '789';  // same prefix, different suffix after the NUL
var a = {};
a[x] = 1;
a[y] = 2;
var s = '(x==y): ' + (x == y) +
        '\na[x]: ' + a[x] +
        '\na[y]: ' + a[y] +
        '\na["123"]: ' + a["123"];
alert(s);
</script>
On Mozilla (and in JSDB, which also uses SpiderMonkey) this prints:
(x==y): false
a[x]: 2
a[y]: 2
a["123"]: 2
It appears that object property keys use a different equality metric
than JavaScript's string equality operator "==": JavaScript strings
carry an explicit length and don't just stop at an embedded null
character, whereas property lookup seems to stop at the first embedded
null.
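If that's right, a quick way to probe it is to check the strings'
lengths and then enumerate the object's keys. This is just a sketch of
the same test; it assumes for-in reports the key names the engine
actually stored:

<script type="text/javascript">
var x = '123' + '\0' + '456';
var y = '123' + '\0' + '789';
alert(x.length);   // 7: the embedded NUL counts toward the length
alert(x == y);     // false: == compares past the NUL
var a = {};
a[x] = 1;
a[y] = 2;
// If keys were truncated at the NUL, both assignments would collide
// on the same property and only one key would be enumerated.
var keys = '';
for (var k in a) { keys += '[length ' + k.length + '] '; }
alert(keys);       // expect '[length 7] [length 7] ', or '[length 3] ' if truncated
</script>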
Is this behavior documented somewhere (and is it intentional or a bug)?