Is there a capacity limit in hashtables?

When I insert long values as keys and objects as values into a Hashtable,
it fails at about 600,000 entries.

ht.Add(long, long)

We are developing a real-time application that needs to keep about 2.5
million objects in a hashtable, and each object should be readily
available to the app very quickly at any time.
Is it better to come up with an algorithm that uses a hashtable of
hashtables (holding 200,000 objects each)?

If there is a limit, are there any other suggestions?

Thanks
Jul 19 '05 #1
Cengiz Dincoglu <ce*************@hotmail.com> wrote:
> When I insert long values as keys and objects as values into a Hashtable,
> it fails at about 600,000 entries.
How much memory do you have? Bear in mind that each long you add (both
the key and the value) takes up 16 bytes when boxed, and there'll also be
a reference to each of them (4 bytes each). The hashtable also remembers
the hashcode, which is another 4 bytes. The internal table is always
somewhat larger than the number of entries, too. At roughly 44 bytes per
entry, 2.5 million entries comes to something like 110MB before any other
overhead. Even so, I can't see why it would be failing after 600,000
iterations; indeed, it works fine on my box:

using System;
using System.Collections;

public class Test
{
    static void Main()
    {
        Hashtable t = new Hashtable();
        for (long i = 0; i < 3000000000; i++)
        {
            // Print the iteration count and managed heap size
            // every 100,000 insertions.
            if (i % 100000 == 0)
            {
                Console.WriteLine
                    ("i={0}, memory={1}", i, GC.GetTotalMemory(true));
            }
            t[i] = i;   // boxes both the key and the value
        }
    }
}

Which version of .NET are you using? In an old version of 1.0, there
was a problem with the large object heap not getting garbage collected
- might that be what's wrong?
> ht.Add(long, long)
>
> We are developing a real-time application that needs to keep about 2.5
> million objects in a hashtable, and each object should be readily
> available to the app very quickly at any time.
> Is it better to come up with an algorithm that uses a hashtable of
> hashtables (holding 200,000 objects each)?
>
> If there is a limit, are there any other suggestions?


Well, you'd save an awful lot of memory by writing your own custom
collection which *only* stored longs as the keys and values, if that's
really what you'll be doing.
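For illustration, here's a minimal sketch of what such a collection might
look like (the names and layout are invented for the example; resizing,
deletion and error handling are mostly omitted). It stores keys and
values directly in long arrays with open addressing, so nothing is boxed:

using System;

public class LongLongMap
{
    private long[] keys;
    private long[] values;
    private bool[] occupied;
    private int count;

    public LongLongMap(int capacity)
    {
        keys = new long[capacity];
        values = new long[capacity];
        occupied = new bool[capacity];
    }

    // Modular hash with linear probing; stops at the key's slot
    // or at the first empty slot.
    private int IndexFor(long key)
    {
        int index = (int) ((ulong) key % (ulong) keys.Length);
        while (occupied[index] && keys[index] != key)
        {
            index = (index + 1) % keys.Length;
        }
        return index;
    }

    public void Add(long key, long value)
    {
        // Keep at least one slot empty so probing always terminates.
        if (count >= keys.Length - 1)
            throw new InvalidOperationException("Map is full");
        int index = IndexFor(key);
        if (!occupied[index])
        {
            occupied[index] = true;
            keys[index] = key;
            count++;
        }
        values[index] = value;
    }

    public bool TryGetValue(long key, out long value)
    {
        int index = IndexFor(key);
        if (occupied[index])
        {
            value = values[index];
            return true;
        }
        value = 0;
        return false;
    }
}

Each slot costs 17 bytes (8+8+1), so even a table with plenty of spare
slots stays well under the footprint of boxing two longs per entry.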

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet/
If replying to the group, please do not mail me too
Jul 19 '05 #2
Have you considered other data structures? For example, you could store
your objects in an array (either a strongly-typed array or an ArrayList),
sort the array, and then do a binary search to look up the right items.
If you want to use the BinarySearch method of the array or ArrayList, you
could implement IComparable using the object's key value (which wouldn't
be hard). Otherwise, you could implement your own searching algorithm.
However, this solution wouldn't be optimal if the list changes
frequently, since you'd have to re-sort the array each time it does.
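For illustration, a minimal sketch of that approach (the Record type and
its fields are made up for the example):

using System;

public class Record : IComparable
{
    public long Key;
    public string Payload;

    public Record(long key, string payload)
    {
        Key = key;
        Payload = payload;
    }

    // Order records by key so Array.Sort and Array.BinarySearch work.
    public int CompareTo(object other)
    {
        return Key.CompareTo(((Record) other).Key);
    }
}

public class SortedLookupDemo
{
    public static void Main()
    {
        Record[] records = {
            new Record(42, "answer"),
            new Record(7, "seven"),
            new Record(1000, "grand")
        };

        Array.Sort(records);                          // O(n log n), once
        int pos = Array.BinarySearch(records, new Record(42, null));
        if (pos >= 0)
            Console.WriteLine(records[pos].Payload);  // prints "answer"
    }
}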

You could also try implementing a sorted binary tree, which would enable
both fast key lookups (using a binary search) and fast insertions and
deletions. There isn't one built into the .NET Framework, though, so
you'd have to build your own.
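For illustration, a bare-bones version of such a tree (all names invented
for the sketch; it does no balancing, so a production version would want
something like a red-black tree to keep lookups O(log n)):

public class TreeNode
{
    public long Key;
    public object Value;
    public TreeNode Left, Right;

    public TreeNode(long key, object value)
    {
        Key = key;
        Value = value;
    }
}

public class BinarySearchTree
{
    private TreeNode root;

    public void Insert(long key, object value)
    {
        TreeNode parent = null;
        TreeNode node = root;
        while (node != null)            // walk down to the insertion point
        {
            if (key == node.Key) { node.Value = value; return; }
            parent = node;
            node = (key < node.Key) ? node.Left : node.Right;
        }
        TreeNode fresh = new TreeNode(key, value);
        if (parent == null)
            root = fresh;
        else if (key < parent.Key)
            parent.Left = fresh;
        else
            parent.Right = fresh;
    }

    public object Find(long key)
    {
        TreeNode node = root;
        while (node != null)
        {
            if (key == node.Key)
                return node.Value;
            node = (key < node.Key) ? node.Left : node.Right;
        }
        return null;                    // not found
    }
}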

--Robert Jacobson
"Cengiz Dincoglu" <ce*************@hotmail.com> wrote in message
news:li********************@twister.nyc.rr.com...
When I insert a long value as key and the object to a hashtable. it fails at about 600,000 data points.

ht.add(long, long)

We are developing a real time application that is required to keep about 2.5 million objects in a hashtable and the object should be readly available to the app very fast at any time.
Is it better to come up with an algorithm to have a hashtable of
hashtables(holding 200,000 objects each) ?

if it has limit, any other suggestions ?

Thanks

Jul 19 '05 #3
Jon Skeet wrote:
|| Willy Denoyette [MVP] <wi*************@skynet.be> wrote:
||| Jon Skeet wrote:
||| .....
||||| Hashtable t = new Hashtable();
||||| for (long i=0; i < 3000000000; i++)
||||| {
||| .....
|||
||| Jon,
|||
||| Are you sure this runs on your system?
||
|| Oops - not to completion, no :)
||
|| However, it *does* get past the 2.5 million stage fairly easily.
||
|| --
|| Jon Skeet - <sk***@pobox.com>
|| http://www.pobox.com/~skeet/
|| If replying to the group, please do not mail me too

Otherwise you would have to tell us what hardware and OS you ran this on ;-)

Willy.
Jul 19 '05 #4
