Bytes IT Community

100K item data binding: Is asynchronous data binding possible?

Hi,

Problem: How can I databind (or otherwise load) a SQL Server query's 115,000-row
result into a ComboBox quickly? In not much more than a matter of seconds, that
is...

Scenario: I am rebuilding my company's Access 97 VBA database app. It pulls
115,000 items (account names) from SQL Server, and the data is bound to a
single Access 97 ComboBox control. My C# version needs to work exactly like
this one. Our executive employees want to be able to select from the entire
list of 115,000 items in the Combo, as they can in the Access 97 app. They
prefer not to use filters to populate the ComboBox with fewer items, as having
to take extra steps "slows down" their production.

My attempt using C#.NET: Currently, I can databind the 115,000 objects to
the ComboBox and, as you can already guess, it takes at least a minute or two
to finish the process.

What to do now?: Is "asynchronous databinding" possible? I see tons of
articles on asynchronous programming, including progress-bar reporting, but
none of them explains how databinding fits into the picture to speed up load
times. Or is that not even how it works...

What are my alternatives for getting these 115,000 items to populate the
ComboBox in a matter of seconds, as it does in the Access 97 program? Does
anyone even know why it's so quick to databind in Access 97 but so slow in
.NET?

Any help is appreciated. Thanks in advance.

RR

Jan 31 '08 #1
10 Replies


Not sure it applies to ComboBox, but for similar controls (such as
grids), VirtualMode is an option.

For ComboBox, perhaps an AutoComplete source might be more
user-friendly than a mammoth list?
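A minimal sketch of the AutoComplete idea, assuming the account names have already been pulled into a string array (the variable and control names here are placeholders, not from the actual app):

```csharp
// Sketch only: "accountNames" is an assumed string[] loaded from the database.
AutoCompleteStringCollection source = new AutoCompleteStringCollection();
source.AddRange(accountNames);

comboBox1.AutoCompleteMode = AutoCompleteMode.SuggestAppend;    // drop-down suggestions plus inline completion
comboBox1.AutoCompleteSource = AutoCompleteSource.CustomSource;
comboBox1.AutoCompleteCustomSource = source;
```

Building the AutoCompleteStringCollection for 115,000 entries still costs time up front, so it may be worth constructing it off the UI thread and only assigning it to the control once it is ready.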

Marc
Feb 1 '08 #2

Have you tried using BeginUpdate() and EndUpdate()? I'm not sure these work
with data-bound combo boxes.

Have you tried adding the items to the combo box manually? The following
code executed in about 5 seconds on my laptop:

comboBox1.BeginUpdate();   // suspend repainting while items are added
try
{
    comboBox1.Items.Clear();
    for (int i = 1; i <= 115000; i++)   // "<= 115000", so exactly 115,000 items are added
        comboBox1.Items.Add(i.ToString());
}
finally
{
    comboBox1.EndUpdate();  // resume repainting
}


"R Reyes" <RR****@discussions.microsoft.comwrote in message
news:2D**********************************@microsof t.com...
Hi,

Problem: How can I databind (or put) a SqlServer query's row return of
115,000 items into a ComboBox quickly? Not much longer than a matter of
seconds, that is...

Scenario: I am rebuilding my company's Access 97 VBA database app. It
pulls
115,000 items (of account names) from SqlServer and the data is bound to a
single Access 97 ComboBox control. My C# version needs to work exactly
like
this one. Our executive employees want to be able to select from the
entire
list of 115,000 items in the Combo as it is in Access 97 app. They prefer
not to use filters to populate the ComboBox with less items as it "slows"
down their production by having to take extra steps.

My attempt using C#.NET: Currently, I can databind the 115,000 objects to
the ComboBox and as you can already guess it takes at least a minute or
two
to finish the process.

What to do now?: Is "asynchronous databinding" possible? I see tons of
articles on asynchronous programming including progressbar reporting
however
none of these articles explain how to use databinding into the picture to
speed up load times. Or is that not even how it works...

What are my alternatives to getting these 115,000 items to populate the
ComboBox in a matter of seconds as it does in the Access 97 program? Does
anyone even know why it's so quick to databind in Access 97 but so slow in
.NET?

Any help is appreciated. Thanks in advance.

RR
Feb 1 '08 #3

P: n/a
Interesting - I wouldn't have thought a for loop would work so quickly. I
will give this a shot and let you know how it went.

Thank you, Scott.

"Scott Roberts" wrote:
Have you tried using BeginUpdate() and EndUpdate()? I'm not sure these work
with data-bound combo boxes.

Have you tried adding the items to the combo box manually? The following
code executed in about 5 seconds on my laptop:

comboBox1.BeginUpdate();
try
{
comboBox1.Items.Clear();
for (int i = 1; i < 115000; i++)
comboBox1.Items.Add(i.ToString());
}
finally
{
comboBox1.EndUpdate();
}


"R Reyes" <RR****@discussions.microsoft.comwrote in message
news:2D**********************************@microsof t.com...
Hi,

Problem: How can I databind (or put) a SqlServer query's row return of
115,000 items into a ComboBox quickly? Not much longer than a matter of
seconds, that is...

Scenario: I am rebuilding my company's Access 97 VBA database app. It
pulls
115,000 items (of account names) from SqlServer and the data is bound to a
single Access 97 ComboBox control. My C# version needs to work exactly
like
this one. Our executive employees want to be able to select from the
entire
list of 115,000 items in the Combo as it is in Access 97 app. They prefer
not to use filters to populate the ComboBox with less items as it "slows"
down their production by having to take extra steps.

My attempt using C#.NET: Currently, I can databind the 115,000 objects to
the ComboBox and as you can already guess it takes at least a minute or
two
to finish the process.

What to do now?: Is "asynchronous databinding" possible? I see tons of
articles on asynchronous programming including progressbar reporting
however
none of these articles explain how to use databinding into the picture to
speed up load times. Or is that not even how it works...

What are my alternatives to getting these 115,000 items to populate the
ComboBox in a matter of seconds as it does in the Access 97 program? Does
anyone even know why it's so quick to databind in Access 97 but so slow in
.NET?

Any help is appreciated. Thanks in advance.

RR

Feb 1 '08 #4

P: n/a
Not as slow, but not that fast either. I unbound the ComboBox and added the
items manually using the .Add() function. My test ran for about 25-30 seconds,
down from the 50-55 seconds of my asynchronous attempt.

As soon as I commented out the line
"cbxAccountID.Items.Add(strAccountData);" (the manual add), it ran in a few
seconds. However, when I re-enable that line of code, the time goes back up to
25-30 seconds, as expected.

What I discovered:
I'm not looping through 115,000 plain items but through an actual
DataTable/DataSet filled with 115,000 rows. Maybe that is why it takes longer?
To investigate further, I did what you did and used a for loop with a max count
of 115,000. That actually improved the time to around 20-25 seconds instead
of 25-30.

What I am thinking of doing now is finding a way to convert the
DataTable/DataSet into an array or list (I don't know which is faster? Or
maybe there is something even faster than these?) and then loading THAT type
of data into the ComboBox.
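A sketch of that idea, assuming a DataTable named dtAccounts with an "AccountName" column (both names are placeholders): copy the column into an object[] and pass the whole array to Items.AddRange(), which replaces 115,000 individual Add() calls with a single bulk insert.

```csharp
// Sketch only: dtAccounts (DataTable) and the "AccountName" column are assumed names.
object[] items = new object[dtAccounts.Rows.Count];
for (int i = 0; i < items.Length; i++)
    items[i] = dtAccounts.Rows[i]["AccountName"];

cbxAccountID.BeginUpdate();
try
{
    cbxAccountID.Items.Clear();
    cbxAccountID.Items.AddRange(items);  // one bulk call instead of one Add() per row
}
finally
{
    cbxAccountID.EndUpdate();
}
```

AddRange() tends to be substantially faster than per-item Add() on a large list, since the control does its bookkeeping once for the batch rather than once per item.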

Other than that, I'm not sure how you got 115,000 items to load in a couple of
seconds. My computer is not slow either: 3MB RAM and 3GHz...very curious.
Maybe I will also try removing all code related to that ComboBox; it's
possible it is linked to some other processes. I'll do that now.

Thanks

"R Reyes" wrote:
Interesting - I wouldn't have thought a for loop would work so quickly. I
will give this a shot and let you know how it went.

Thank you, Scott.

"Scott Roberts" wrote:
Have you tried using BeginUpdate() and EndUpdate()? I'm not sure these work
with data-bound combo boxes.

Have you tried adding the items to the combo box manually? The following
code executed in about 5 seconds on my laptop:

comboBox1.BeginUpdate();
try
{
comboBox1.Items.Clear();
for (int i = 1; i < 115000; i++)
comboBox1.Items.Add(i.ToString());
}
finally
{
comboBox1.EndUpdate();
}


"R Reyes" <RR****@discussions.microsoft.comwrote in message
news:2D**********************************@microsof t.com...
Hi,
>
Problem: How can I databind (or put) a SqlServer query's row return of
115,000 items into a ComboBox quickly? Not much longer than a matter of
seconds, that is...
>
Scenario: I am rebuilding my company's Access 97 VBA database app. It
pulls
115,000 items (of account names) from SqlServer and the data is bound to a
single Access 97 ComboBox control. My C# version needs to work exactly
like
this one. Our executive employees want to be able to select from the
entire
list of 115,000 items in the Combo as it is in Access 97 app. They prefer
not to use filters to populate the ComboBox with less items as it "slows"
down their production by having to take extra steps.
>
My attempt using C#.NET: Currently, I can databind the 115,000 objects to
the ComboBox and as you can already guess it takes at least a minute or
two
to finish the process.
>
What to do now?: Is "asynchronous databinding" possible? I see tons of
articles on asynchronous programming including progressbar reporting
however
none of these articles explain how to use databinding into the picture to
speed up load times. Or is that not even how it works...
>
What are my alternatives to getting these 115,000 items to populate the
ComboBox in a matter of seconds as it does in the Access 97 program? Does
anyone even know why it's so quick to databind in Access 97 but so slow in
.NET?
>
Any help is appreciated. Thanks in advance.
>
RR
>
Feb 1 '08 #5

"R Reyes" wrote:
> What I am thinking of doing now is maybe finding a way to convert the
> DataTable/DataSet into an array or list (I don't know which is faster? Or
> maybe there is something even faster than these?) and then loading THAT
> type of data into the ComboBox.
Well, you can start by not putting the data into a DataTable/DataSet in the
first place. Use a DataReader instead. Although I think you've already
established that data access isn't the problem.
"R Reyes" wrote:
> Other than that, I'm not sure how you got 115,000 to load in a couple of
> seconds. My computer is not slow either 3MB RAM and 3GHz...very curious.
> Maybe I will also try removing all related code with that ComboBox, it's
> possible it could be linked to some other processes. I'll do that now.
It wasn't a couple of seconds. I said 5 but I was guess-timating. I put a
timer on it and it's really 8 or 9 (my guess-timating is evidently not so
good). It was, however, significantly faster than 50-55 and that was my
point. I have a table with just over 285K rows and I can get it to load in
about 24 seconds. Unless I throw more hardware at it, I think that's about
as good as I'm going to get.

FWIW, here's my code:

DateTime startTime = DateTime.Now;

using (SqlConnection cn = new
    SqlConnection(Sbs.Common.Framework.Data.ConnectionManager.DefaultAdaptor.ConnectionString))
{
    SqlCommand cmd = new SqlCommand("select * from vh", cn);
    cn.Open();
    SqlDataReader reader = cmd.ExecuteReader();
    Month.BeginUpdate();
    try
    {
        Month.Items.Clear();
        while (reader.Read())
            Month.Items.Add(reader.GetString(1));
    }
    finally
    {
        reader.Close();
        Month.EndUpdate();
    }
}

DateTime endTime = DateTime.Now;
TimeSpan ts = endTime - startTime;
// TimeSpan.Seconds wraps at 60; TotalSeconds gives the full elapsed time
MessageBox.Show("Elapsed Time: " + ts.TotalSeconds.ToString("0") + " seconds");

Feb 1 '08 #6

"R Reyes" wrote:
> Any help is appreciated. Thanks in advance.

3MB RAM is a tad low... I got 1G on my laptop
//CY
Feb 1 '08 #7

Oops, sorry - I meant 3GB of RAM, but even then it still runs at about the speed
I said earlier. Also, the normal user of this application in the company will
only have about 128/256...maybe 512 at most on some computers.

I've tried another solution which works well, but it's probably not the best
way to do things. What I'm developing now is a function that lets the user
type two to three (or more) letters in the ComboBox; a new query is executed on
every letter, including those after the first 2-3, pulling up anywhere from
100-5k records as opposed to the whole 112,000. Queries will end up being run
every few seconds by 50+ users, so this is probably not a good idea,
though it looks like it will work once I'm done coding...
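A rough sketch of that approach, with a short debounce so the query only fires once the user pauses typing rather than on every keystroke (all control, field, table, and column names here are assumptions, not the actual app's):

```csharp
// Sketch only. Requires System.Windows.Forms, System.Data.SqlClient,
// and System.Collections.Generic. "cbxAccountID", "connectionString",
// "Accounts", and "AccountName" are placeholder names.
private readonly Timer searchTimer = new Timer { Interval = 300 }; // fire 300 ms after the last keystroke

private void Form1_Load(object sender, EventArgs e)
{
    searchTimer.Tick += delegate
    {
        searchTimer.Stop();
        if (cbxAccountID.Text.Length >= 3)   // only search once 3+ letters are typed
            LoadMatches(cbxAccountID.Text);
    };
    cbxAccountID.TextChanged += delegate
    {
        searchTimer.Stop();                  // restart the countdown on every keystroke
        searchTimer.Start();
    };
}

private void LoadMatches(string prefix)
{
    using (SqlConnection cn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "SELECT AccountName FROM Accounts WHERE AccountName LIKE @p + '%'", cn))
    {
        cmd.Parameters.AddWithValue("@p", prefix);   // parameterized, so no injection risk
        cn.Open();
        List<object> matches = new List<object>();
        using (SqlDataReader reader = cmd.ExecuteReader())
            while (reader.Read())
                matches.Add(reader.GetString(0));

        cbxAccountID.BeginUpdate();
        try
        {
            cbxAccountID.Items.Clear();
            cbxAccountID.Items.AddRange(matches.ToArray());
        }
        finally
        {
            cbxAccountID.EndUpdate();
        }
    }
}
```

With an index on the name column, a leading-prefix LIKE ('abc%') can seek the index rather than scan the table, which keeps the per-keystroke cost on the server small.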

The question is, should I scrap this inefficient but "working" idea? "Make
the client happy" or "make them unhappy and force them to do it with
filters"? Keep in mind they WILL DEFINITELY think less of me if I don't do
it their way...they are not computer people and wouldn't understand (or
probably care) about efficiency on the backend, no matter what I say. They
are incredibly stubborn and do not want change. I'm sure some of you
programmers know how that can be.

So, would doing it my "working" way be bad for the server/data/etc that it's
just not worth it in the long run?

"ch*******@gmail.com" wrote:
Any help is appreciated. Thanks in advance.

3MB RAM is a tad low... I got 1G on my laptop
//CY
Feb 5 '08 #8

Is the table static or dynamic? What I mean is, are records routinely
added/edited/removed from this table containing 112,000 records that they
want to "search"? If the table is static, you could just cache the data in
the client. If the data is dynamic, you'll probably be stuck with executing
SQL, but even then, if your server is fairly "beefy" and your table is
indexed properly I don't think you'll see too much of a problem. Most major
RDBMSs will cache frequently accessed data automatically, and you can
usually tell the DB to cache entire tables manually.

Feb 5 '08 #9

Yes, records will be added/removed/edited ALL the time.

The list will actually just get bigger and bigger (think of it as a mailing
list), which is why I figure searching by the first three characters works
great - returning only 500-2k rows.

"Scott Roberts" wrote:
>
"R Reyes" <RR****@discussions.microsoft.comwrote in message
news:EB**********************************@microsof t.com...
Oops sorry I meant 3GB of RAM, but even then it still runs about the speed
I
said earlier. Also, the normal user for this application in the company
will
only have about 128/256...maybe 512 at the most on some computers.

I've tried another solution which works well but it's probably not the
best
way to do things. What I'm developing now is a function that lets the
user
type in 2-3 plus letters in the ComboBox and a new query is executed on
every
letter including the ones after the first 2-3, pulling up anywhere from
100-5k records as opposed to a huge 112,000. Queries will end up being
run
every other few seconds by 50+ users so this is probably not a good idea,
though it looks like it will work once I'm done coding...

The question is, should I scrap this non efficent but "working" idea?
"Make
the client happy" or "make them unhappy and force them to do it with
filters"? Keep in mind they WILL DEFINITELY think less of me if I don't
do
it their way...they are not computer people and wouldn't understand (or
probably care) about efficiency on the backend, no matter what I say.
They
are incredibly stubborn and do not want change. I'm sure some of you
programmers know how that can be.

So, would doing it my "working" way be bad for the server/data/etc that
it's
just not worth it in the long run?

Is the table static or dynamic? What I mean is, are records routinely
added/edited/removed from this table containing 112,000 records that they
want to "search"? If the table is static, you could just cache the data in
the client. If the data is dynamic, you'll probably be stuck with executing
SQL, but even then, if your server is fairly "beefy" and your table is
indexed properly I don't think you'll see too much of a problem. Most major
RDBMSs will cache frequently accessed data automatically, and you can
usually tell the DB to cache entire tables manually.

Feb 5 '08 #10

I think you might be surprised at how fast your DB server can do this
search. Just be sure it's optimized. If your site were Google or Yahoo it
might be different, but 50+ users isn't really that much.

Feb 5 '08 #11
