Bytes | Software Development & Data Engineering Community
100K item data binding: Is asynchronous data binding possible?

Hi,

Problem: How can I databind (or otherwise load) a SQL Server query's result of
115,000 rows into a ComboBox quickly? In a matter of seconds, that is...

Scenario: I am rebuilding my company's Access 97 VBA database app. It pulls
115,000 items (account names) from SQL Server and binds the data to a single
Access 97 ComboBox control. My C# version needs to work exactly like this one.
Our executive employees want to be able to select from the entire list of
115,000 items in the Combo, as they do in the Access 97 app. They prefer not
to use filters to populate the ComboBox with fewer items, as the extra steps
"slow down" their production.

My attempt using C#/.NET: Currently I can databind the 115,000 objects to the
ComboBox, and as you can already guess, it takes at least a minute or two to
finish the process.

What to do now?: Is "asynchronous databinding" possible? I see tons of
articles on asynchronous programming, including progress-bar reporting, but
none of them explain how databinding fits into the picture to speed up load
times. Or is that not even how it works...

What are my alternatives for getting these 115,000 items into the ComboBox in
a matter of seconds, as the Access 97 program does? Does anyone even know why
it's so quick to databind in Access 97 but so slow in .NET?

Any help is appreciated. Thanks in advance.

RR

Jan 31 '08 #1
10 5842
Not sure it applies to ComboBox, but for similar controls (such as grids),
VirtualMode is an option.

For ComboBox, perhaps an AutoComplete source might be more user-friendly than
a mammoth list?
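If it helps, here's a rough sketch of that AutoComplete idea (WinForms;
`accountNames` is a hypothetical string[] you'd have loaded from the database
once, not an API the framework gives you):

```csharp
// Rough sketch, not tested against a 115,000-row list.
// accountNames is assumed to be a string[] already pulled from SQL Server.
AutoCompleteStringCollection source = new AutoCompleteStringCollection();
source.AddRange(accountNames);

comboBox1.AutoCompleteMode = AutoCompleteMode.SuggestAppend;
comboBox1.AutoCompleteSource = AutoCompleteSource.CustomSource;
comboBox1.AutoCompleteCustomSource = source;
```

The user types a few characters and gets a short suggestion list instead of
scrolling a 115,000-row dropdown.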

Marc
Feb 1 '08 #2
Have you tried using BeginUpdate() and EndUpdate()? I'm not sure these work
with data-bound combo boxes.

Have you tried adding the items to the combo box manually? The following
code executed in about 5 seconds on my laptop:

comboBox1.BeginUpdate();
try
{
    comboBox1.Items.Clear();
    for (int i = 0; i < 115000; i++)
        comboBox1.Items.Add(i.ToString());
}
finally
{
    comboBox1.EndUpdate();
}


"R Reyes" <RR****@discussions.microsoft.com> wrote in message
news:2D**********************************@microsoft.com...
[original post quoted in full; snipped]
Feb 1 '08 #3
Interesting - I wouldn't have thought a for loop would work so quickly. I
will give this a shot and let you know how it went.

Thank you, Scott.

"Scott Roberts" wrote:
[quoted text snipped]

Feb 1 '08 #4
Not as slow, but not so fast either. I unbound the ComboBox and added the
items manually using the .Add() function. My test ran for about 25-30
seconds, down from 50-55 with my earlier databinding attempt.

As soon as I commented out the line
"cbxAccountID.Items.Add(strAccountData);" (the manual add), it ran in a few
seconds. When I re-enable that line of code, the time goes back up to 25-30
seconds, as expected.

What I discovered: I'm not looping through 115,000 plain items but through a
DataTable/DataSet filled with 115,000 rows. Maybe that is why it takes
longer? To investigate further, I did what you did and used a for loop with a
max count of 115,000. That brought the time down to around 20-25 seconds
instead of 25-30.

What I'm thinking of doing now is finding a way to convert the
DataTable/DataSet into an array or list (I don't know which is faster, or
whether there is something even faster than these) and then loading THAT into
the ComboBox.

Other than that, I'm not sure how you got 115,000 items to load in a couple
of seconds. My computer is not slow either: 3MB RAM and 3GHz...very curious.
Maybe I will also try removing all other code related to that ComboBox; it's
possible it's linked to some other processes. I'll do that now.

Thanks

"R Reyes" wrote:
[quoted text snipped]

Feb 1 '08 #5
> What I am thinking of doing now is maybe finding a way to convert the
> DataTable/DataSet into an array or list (I don't know which is faster? Or
> maybe there is something even faster than these?) and then loading THAT
> type of data into the ComboBox.

Well, you can start by not putting the data into a DataTable/DataSet in the
first place. Use a DataReader instead. Although I think you've already
established that data access isn't the problem.
> Other than that, I'm not sure how you got 115,000 to load in a couple of
> seconds. My computer is not slow either 3MB RAM and 3GHz...very curious.
> Maybe I will also try removing all related code with that ComboBox, it's
> possible it could be linked to some other processes. I'll do that now.

It wasn't a couple of seconds. I said 5, but I was guess-timating. I put a
timer on it and it's really 8 or 9 (my guess-timating is evidently not so
good). It was, however, significantly faster than 50-55, and that was my
point. I have a table with just over 285K rows and I can get it to load in
about 24 seconds. Unless I throw more hardware at it, I think that's about as
good as I'm going to get.

FWIW, here's my code:

DateTime startTime = DateTime.Now;

using (SqlConnection cn = new SqlConnection(Sbs.Common.Framework.Data.ConnectionManager.DefaultAdaptor.ConnectionString))
{
    SqlCommand cmd = new SqlCommand("select * from vh", cn);
    cn.Open();
    SqlDataReader reader = cmd.ExecuteReader();
    Month.BeginUpdate();
    try
    {
        Month.Items.Clear();
        while (reader.Read())
            Month.Items.Add(reader.GetString(1));
    }
    finally
    {
        reader.Close();
        Month.EndUpdate();
    }
}

DateTime endTime = DateTime.Now;
TimeSpan ts = endTime - startTime;
// TotalSeconds, not Seconds: Seconds wraps at 60 and would under-report.
MessageBox.Show("Elapsed Time: " + ts.TotalSeconds.ToString("0") + " seconds");

Feb 1 '08 #6
> Any help is appreciated. Thanks in advance.

3MB RAM is a tad low... I got 1G on my laptop

//CY
Feb 1 '08 #7
Oops, sorry, I meant 3GB of RAM, but even then it still runs at about the
speed I said earlier. Also, the normal users of this application in the
company will only have about 128/256MB...maybe 512 at the most on some
computers.

I've tried another solution which works well, but it's probably not the best
way to do things. What I'm developing now is a function that lets the user
type 2-3+ letters into the ComboBox, and a new query is executed on every
letter (including the ones after the first 2-3), pulling up anywhere from
100-5k records as opposed to a huge 112,000. Queries will end up being run
every few seconds by 50+ users, so this is probably not a good idea, though
it looks like it will work once I'm done coding...

The question is, should I scrap this inefficient but "working" idea? "Make
the client happy" or "make them unhappy and force them to do it with
filters"? Keep in mind they WILL DEFINITELY think less of me if I don't do it
their way...they are not computer people and wouldn't understand (or probably
care) about efficiency on the backend, no matter what I say. They are
incredibly stubborn and do not want change. I'm sure some of you programmers
know how that can be.

So, would doing it my "working" way be so bad for the server/data/etc. that
it's just not worth it in the long run?
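For the curious, the per-keystroke query I'm describing looks roughly like
this (a sketch only; "Accounts" and "AccountName" are placeholder names, not
my real schema):

```csharp
// Sketch of the type-ahead lookup, capped with TOP so a short prefix
// can't drag back the whole table. Table/column names are placeholders.
string prefix = cbxAccountID.Text;
using (SqlConnection cn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT TOP 5000 AccountName FROM Accounts " +
    "WHERE AccountName LIKE @prefix + '%' ORDER BY AccountName", cn))
{
    cmd.Parameters.AddWithValue("@prefix", prefix);
    cn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        cbxAccountID.BeginUpdate();
        try
        {
            cbxAccountID.Items.Clear();
            while (reader.Read())
                cbxAccountID.Items.Add(reader.GetString(0));
        }
        finally
        {
            cbxAccountID.EndUpdate();
        }
    }
}
```

With an index on the name column, a prefix LIKE such as this can be satisfied
by an index seek, so the per-keystroke cost on the server should stay small.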

"ch*******@gmail.com" wrote:
[quoted text snipped]

Feb 5 '08 #8

"R Reyes" <RR****@discussions.microsoft.com> wrote in message
news:EB**********************************@microsoft.com...
[quoted text snipped]

Is the table static or dynamic? What I mean is, are records routinely
added/edited/removed from this table containing 112,000 records that they
want to "search"? If the table is static, you could just cache the data in
the client. If the data is dynamic, you'll probably be stuck with executing
SQL, but even then, if your server is fairly "beefy" and your table is
indexed properly I don't think you'll see too much of a problem. Most major
RDBMSs will cache frequently accessed data automatically, and you can
usually tell the DB to cache entire tables manually.
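To illustrate the cache-in-the-client option (a sketch only; LoadAccountNames()
is a hypothetical helper that runs the full query once, and the LINQ assumes
.NET 3.5):

```csharp
// Sketch: load the list once, then filter in memory on every keystroke.
// LoadAccountNames() is a hypothetical one-time load from the database.
List<string> allAccounts = LoadAccountNames();

// Prefix-filtering ~115,000 in-memory strings typically takes milliseconds,
// so no server round-trip is needed per keystroke.
string prefix = comboBox1.Text;
string[] matches = allAccounts
    .Where(a => a.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
    .Take(5000)
    .ToArray();

comboBox1.BeginUpdate();
try
{
    comboBox1.Items.Clear();
    comboBox1.Items.AddRange(matches);
}
finally
{
    comboBox1.EndUpdate();
}
```

The trade-off is staleness: with a dynamic table you'd have to refresh the
cached list periodically, which is why this fits the static case best.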

Feb 5 '08 #9
Yes, records will be added/removed/edited ALL the time.

The list will actually just get bigger and bigger (think of it as a mailing
list), which is why I figure searching by the first three characters works
great - returning only 500-2k rows.

"Scott Roberts" wrote:
[quoted text snipped]

Feb 5 '08 #10

This thread has been closed and replies have been disabled. Please start a new discussion.
