A statement like "X is most efficient when Y" generally means that one
should strive to fulfill Y in order to improve efficiency. That is
definitely not true here: adding more elements to a hash table without
rehashing does *not* improve lookup or insert efficiency.
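To make that concrete, here is a hypothetical sketch (in Python, with a toy linear-probing table and made-up key names) showing that a *fuller* table needs more probes per successful lookup, not fewer:

```python
def probes_to_find(table, key):
    """Count slots inspected before `key` is found (linear probing)."""
    n = len(table)
    i = hash(key) % n
    probes = 1
    while table[i] != key:
        i = (i + 1) % n
        probes += 1
    return probes

def average_probes(num_keys, capacity):
    """Fill a linear-probing table, then return mean probes per lookup."""
    table = [None] * capacity
    keys = ["key-%d" % k for k in range(num_keys)]
    for key in keys:                      # insert with linear probing
        i = hash(key) % capacity
        while table[i] is not None:
            i = (i + 1) % capacity
        table[i] = key
    return sum(probes_to_find(table, k) for k in keys) / num_keys

# A table at 90% load needs noticeably more probes than one at 50% load.
low = average_probes(512, 1024)    # load factor 0.5
high = average_probes(922, 1024)   # load factor 0.9
assert high > low
```

The exact numbers vary with the hash seed, but the direction never does: higher load means longer probe chains.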
There is also a lot of freedom in how the *statement* itself is
defined. When is a hash table "most efficient"? Are we talking about
worst-case or average lookup time? Insertions? Memory consumption?
Latency on rehashes? ...
A real hash table's lookup and insert performance doesn't depend on how
"full" it is: it rehashes when "too many" key collisions occur, and
(given that the hash function is what's called "universal" -- which is
a specific way of saying it's good) it can be proven to provide
amortized O(1) insert, lookup and removal operations using O(N)
memory.
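The amortized-O(1) claim can be illustrated with a toy sketch (not the System.Collections.Hashtable implementation; the class name, the 0.75 load threshold, and the key names are all assumptions for illustration). The table doubles and rehashes when load gets high; counting every slot write shows total work stays linear in N, i.e. constant per insert on average:

```python
class ToyHashSet:
    """Linear-probing set that doubles capacity when load exceeds 0.75."""

    def __init__(self):
        self.capacity = 8
        self.slots = [None] * self.capacity
        self.count = 0
        self.writes = 0                 # total slot writes, incl. rehashes

    def _insert(self, slots, key):
        i = hash(key) % len(slots)
        while slots[i] is not None and slots[i] != key:
            i = (i + 1) % len(slots)
        if slots[i] is None:
            slots[i] = key
            self.writes += 1
            return True
        return False                    # key was already present

    def add(self, key):
        if self.count / self.capacity > 0.75:   # rehash on high load
            self.capacity *= 2
            old, self.slots = self.slots, [None] * self.capacity
            for k in old:
                if k is not None:
                    self._insert(self.slots, k)
        if self._insert(self.slots, key):
            self.count += 1

    def __contains__(self, key):
        i = hash(key) % self.capacity
        while self.slots[i] is not None:
            if self.slots[i] == key:
                return True
            i = (i + 1) % self.capacity
        return False

s = ToyHashSet()
n = 10_000
for k in range(n):
    s.add("key-%d" % k)
# Despite all the rehashes, total slot writes grow linearly with n.
assert s.writes < 3 * n
assert "key-42" in s and "key-x" not in s
```

Each doubling rehash copies at most as many keys as have ever been inserted, and the doublings form a geometric series, which is where the amortized constant comes from.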
System.Collections.Hashtable is a pretty good implementation, with an
acceptable tradeoff between space and performance; you should only
think about "tuning" anything about it if you have run a profiler and
shown that the performance problem actually lies there.
--
Helge Jensen
mailto:he**********@slog.dk
sip:he**********@slog.dk
-=> Sebastian cover-music:
http://ungdomshus.nu <=-