Have you considered other data structures? For example, you could store
your objects in an array (either a strongly-typed array or an ArrayList),
sort the array, and then do a binary search to look up the right items. If
you want to use the BinarySearch method of the array or ArrayList, you could
implement IComparable using the object's key value (which wouldn't be
hard). Otherwise, you could implement your own searching algorithm.
However, this solution wouldn't be optimal if the list changes frequently,
since you'd have to re-sort the array after each change.
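To illustrate, here's a minimal sketch of that approach. The DataPoint class
and FindIndex helper are hypothetical names invented for this example; the
key point is that implementing IComparable to compare by key lets
Array.Sort and Array.BinarySearch do the rest.

```csharp
using System;

// Hypothetical record type with a long key. Implementing IComparable
// (comparing by key only) lets Array.Sort order the array and lets
// Array.BinarySearch probe it with a key-only instance.
public class DataPoint : IComparable
{
    public long Key;
    public long Value;

    public DataPoint(long key, long value) { Key = key; Value = value; }

    public int CompareTo(object obj)
    {
        return Key.CompareTo(((DataPoint)obj).Key);
    }
}

public class SortedLookup
{
    // Assumes the array has already been sorted; callers must re-sort
    // after any insertion or deletion, which is the weakness noted above.
    public static int FindIndex(DataPoint[] items, long key)
    {
        return Array.BinarySearch(items, new DataPoint(key, 0));
    }

    public static void Main()
    {
        DataPoint[] items = {
            new DataPoint(42, 100),
            new DataPoint(7, 200),
            new DataPoint(99, 300)
        };

        Array.Sort(items);                  // O(n log n), once up front
        int i = FindIndex(items, 42);       // O(log n) per lookup
        Console.WriteLine(items[i].Value);  // prints 100
    }
}
```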
You could also try implementing a sorted binary tree, which would enable
both fast key lookups (using a binary search) and fast insertions and
deletions of objects. There isn't one built into the .NET Framework,
though, so you'd have to build your own.
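A bare-bones sketch of such a tree might look like the following. This is an
unbalanced tree for clarity only; a production version would need balancing
(e.g. a red-black tree) to guarantee O(log n) behavior with millions of
entries, and a Remove method, both of which are omitted here. The class and
method names are my own invention.

```csharp
using System;

// Minimal unbalanced binary search tree keyed on long.
// For brevity this sketch omits balancing and deletion.
public class BinaryTree
{
    private class Node
    {
        public long Key;
        public object Value;
        public Node Left, Right;
        public Node(long key, object value) { Key = key; Value = value; }
    }

    private Node root;

    public void Add(long key, object value)
    {
        Node parent = null, current = root;
        while (current != null)
        {
            parent = current;
            if (key < current.Key) current = current.Left;
            else if (key > current.Key) current = current.Right;
            else { current.Value = value; return; }  // replace on duplicate key
        }
        Node node = new Node(key, value);
        if (parent == null) root = node;
        else if (key < parent.Key) parent.Left = node;
        else parent.Right = node;
    }

    public object Find(long key)
    {
        Node current = root;
        while (current != null)
        {
            if (key < current.Key) current = current.Left;
            else if (key > current.Key) current = current.Right;
            else return current.Value;
        }
        return null;  // key not present
    }
}
```

There is a System.Collections.SortedList in the framework, but it is backed
by an array internally, so insertions still cost O(n); for a frequently
changing list a real tree is the better fit.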
--Robert Jacobson
"Cengiz Dincoglu" <ce*************@hotmail.com> wrote in message
news:li********************@twister.nyc.rr.com...
When I insert a long value as the key and the object into a hashtable, it
fails at about 600,000 data points.
ht.Add(long, long)
We are developing a real-time application that is required to keep about
2.5 million objects in a hashtable, and the objects should be readily
available to the app very fast at any time.
Is it better to come up with an algorithm that uses a hashtable of
hashtables (holding 200,000 objects each)?
If the hashtable has a limit, are there any other suggestions?
Thanks