
Navigation Form with Large Number of Records - Help!

Hi

I have an application for my company's HR department where we store
resumes for the candidates we receive. It uses VB.Net and ADO.Net, with
data bindings (set up in code) to controls on a Windows form. My concern
is that once the database grows and contains a large number of records,
the dataset is going to create too much overhead when the user clicks
"show all records" (the full resume, which contains a lot of text, is
retrieved for each record, so the dataset will hold lots of text for lots
of records). On the other hand, if I connect to the database every time
the user clicks Next / Previous / First / Last, I am worried the
application will run too slowly. Does anyone have any recommendations as
to which technique (or any other) will work better when the database
grows to tens of thousands of records? Sorry if this is a dumb question,
but I can't seem to find any articles or books that address this topic.
Thanks in advance for your help!
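
(A minimal sketch of that per-click approach, assuming SQL Server and a
hypothetical Candidates table with Id, CandidateName and ResumeText columns;
the names here are illustrative, not from the actual database. Each click
fills only one page of the lightweight columns, and the resume text is
fetched separately for the single record being shown.)

Imports System.Data
Imports System.Data.SqlClient

Public Class CandidatePager
    ' Illustrative connection string; replace with the real one.
    Private Const ConnectionString As String = _
        "Server=(local);Database=HR;Integrated Security=SSPI;"

    ' Returns one page of key + name only; the large ResumeText column is not selected.
    Public Function GetPage(ByVal pageIndex As Integer, ByVal pageSize As Integer) As DataTable
        Dim ds As New DataSet
        Using cn As New SqlConnection(ConnectionString)
            Dim da As New SqlDataAdapter( _
                "SELECT Id, CandidateName FROM Candidates ORDER BY CandidateName", cn)
            ' Fill(startRecord, maxRecords) discards the skipped rows on the client,
            ' which is acceptable for tens of thousands of rows.
            da.Fill(ds, pageIndex * pageSize, pageSize, "Candidates")
        End Using
        Return ds.Tables("Candidates")
    End Function

    ' Fetches the full resume for a single record, only when it is displayed.
    Public Function GetResume(ByVal id As Integer) As String
        Using cn As New SqlConnection(ConnectionString)
            Dim cmd As New SqlCommand("SELECT ResumeText FROM Candidates WHERE Id = @Id", cn)
            cmd.Parameters.Add("@Id", SqlDbType.Int).Value = id
            cn.Open()
            Dim result As Object = cmd.ExecuteScalar()
            If result Is Nothing Then Return String.Empty
            Return CStr(result)
        End Using
    End Function
End Class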

Nov 21 '05 #1
3 Replies


VanVee,

This is a problem as old as databases themselves. The questions are: what
do you show, how do you show it, and how do you let the user search?

If you start by showing thousands of names that the user has to step
through, he will not be happy.

Comboboxes are made for this (the "autocombobox" approach is often used;
there are samples of it in this newsgroup). You can also classify the
records and use more than one combobox. You then create an extra dataset
with, for instance, two items: a name and a value (the primary key).

This approach means you keep small datasets that have to be loaded, and
you avoid loading a huge amount of data.

Another approach (which I don't like when it is about huge amounts of
data) is to load everything and set the relations (this of course also
means an extra dataset, for instance built using DISTINCT). The bad thing
about it is that you have all the data in your dataset, which can mean
long loading times, and when you have to update it there are even more
side effects once it is multi-user (and then, in my opinion, it is
impossible to use).

However, it depends very much on your kind of data (otherwise there would
surely be a standard control that did this).

I hope this gives you some ideas?
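
(A rough sketch of that combobox idea, using a plain ComboBox rather than
the autocombobox and the same made-up Candidates schema as in the earlier
sketch: the lookup table holds only the primary key and the display name,
and the resume text is fetched one record at a time when a candidate is
selected.)

Imports System.Data
Imports System.Data.SqlClient
Imports System.Windows.Forms

Public Class CandidateForm
    Inherits Form

    ' Same illustrative connection string and schema as the sketch above.
    Private Const ConnectionString As String = _
        "Server=(local);Database=HR;Integrated Security=SSPI;"

    Private cboCandidates As New ComboBox
    Private txtResume As New TextBox

    Public Sub New()
        txtResume.Multiline = True
        txtResume.Dock = DockStyle.Fill
        cboCandidates.Dock = DockStyle.Top
        Controls.Add(txtResume)
        Controls.Add(cboCandidates)
        AddHandler cboCandidates.SelectedIndexChanged, AddressOf OnCandidateSelected
        LoadLookup()
    End Sub

    ' The lookup table holds only the primary key and the display name,
    ' so it stays small even with tens of thousands of candidates.
    Private Sub LoadLookup()
        Dim lookup As New DataTable("Lookup")
        Using cn As New SqlConnection(ConnectionString)
            Dim da As New SqlDataAdapter( _
                "SELECT Id, CandidateName FROM Candidates ORDER BY CandidateName", cn)
            da.Fill(lookup)
        End Using
        cboCandidates.DisplayMember = "CandidateName"
        cboCandidates.ValueMember = "Id"
        cboCandidates.DataSource = lookup
    End Sub

    ' The heavy resume text is fetched one record at a time, only on selection.
    Private Sub OnCandidateSelected(ByVal sender As Object, ByVal e As EventArgs)
        If Not (TypeOf cboCandidates.SelectedValue Is Integer) Then Return
        Using cn As New SqlConnection(ConnectionString)
            Dim cmd As New SqlCommand("SELECT ResumeText FROM Candidates WHERE Id = @Id", cn)
            cmd.Parameters.Add("@Id", SqlDbType.Int).Value = CInt(cboCandidates.SelectedValue)
            cn.Open()
            Dim result As Object = cmd.ExecuteScalar()
            If result Is Nothing Then
                txtResume.Text = String.Empty
            Else
                txtResume.Text = CStr(result)
            End If
        End Using
    End Sub
End Class

With just an integer key and a short name per row, even tens of thousands
of candidates keep the lookup table down to a few megabytes, while the
resume text never comes over the wire until a single record is shown.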

Cor

Nov 21 '05 #2

Hi Cor

Thanks for the message. Yes, I agree with your post. It is tough! I have
reviewed samples of auto-comboboxes too. Do you think, though, that
loading a huge dataset into memory (aside from a potentially long load
time) could cause a memory error simply because so much memory is in use?
(Again, I realize it all depends, but for rather large records in large
numbers, is there a point where memory can't handle a dataset, even on
modern machines?)

Also, yes, concurrency becomes a major headache with all of the data
detached. Oh well, any other thoughts would be great. I guess I'll try a
couple of techniques and see what the users think!

Thanks
Nov 21 '05 #3

VanVee,

In my opinion you have, with your questions, largely answered them
yourself.

Except for the memory part, of course; but how should we calculate that?
Too much is too much, but when you reach that point depends on what
hardware we are talking about, and so on.

When the dataset is so large that it is pushing the limits of the machine,
you are, in my opinion, really on the wrong track.

:-)

Just my thought,

Cor


Nov 21 '05 #4
