Bytes IT Community

How to "append" a datatable?

Hello!

I have some very slow "where like" queries against a MySQL database, so I am
running into performance problems. To address this I thought I would split the
query across threads: I hold 10 connections to the database, and each
connection runs its part of the query in its own thread, like this:

conn1: select fname from root where p_key between 1 and 20000 and fname
like '%somestring%';
conn2: select fname from root where p_key between 20001 and 40000 and fname
like '%somestring%';
and so on ...

The root table currently contains about 200,000 rows, and it is growing.

Now the problem is: how can I combine the 10 DataTable objects into one
object to bind to my grid?

THX!
Christian Maier

PS Any other solution for my performance issue is welcome!
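The combining step the poster asks about can be sketched with DataTable.Merge (a sketch assuming .NET 2.0, where DataTable.Merge was introduced; `results` and `myGrid` are illustrative names, not from the thread):

```csharp
// results: one filled DataTable per worker thread, all with the same schema.
DataTable combined = results[0].Clone();   // empty table with the shared schema
foreach (DataTable partial in results)
{
    combined.Merge(partial);               // copies the partial rows into combined
}
myGrid.DataSource = combined;              // one table to bind to the grid
```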

Apr 5 '06 #1
6 Replies


Do you present the whole data set (all 200,000 records) to the user at once?
Don't you page it?

If you page it, why not just fetch the data (100 rows at a time) to populate the
current datagrid? There are some tutorials on that.
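That paging approach can be sketched with MySQL's LIMIT clause (a sketch assuming MySQL Connector/Net; `connectionString`, the page variables, and `myGrid` are illustrative):

```csharp
int pageSize = 100;
int pageIndex = 0;  // zero-based page the grid is currently showing

// LIMIT offset, row_count fetches just one page of matches.
string sql = "SELECT fname FROM root WHERE fname LIKE '%somestring%' " +
             "LIMIT " + (pageIndex * pageSize) + ", " + pageSize;

using (MySqlConnection conn = new MySqlConnection(connectionString))
using (MySqlDataAdapter adapter = new MySqlDataAdapter(sql, conn))
{
    DataTable page = new DataTable();
    adapter.Fill(page);          // only pageSize rows are pulled to the client
    myGrid.DataSource = page;
}
```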
--
Thank you in Advance.

Bruno Alexandre
(a Portuguese in Denmark)

"Christian Maier" <to*******@freesurf.fr> wrote in message
news:11*********************@g10g2000cwb.googlegroups.com...
> I have huge unperformant "where like" querys to a mysql DB. [...]
> Now the problem is: How can I put the 10 datatable objects to one huge
> object to bind to my grid object? [...]

Apr 5 '06 #2

> Do you present the whole data set (all 200,000 records) to the user at once?
> Don't you page it?

The query runs over 200,000 records, but the result should be no more than
about 500 hits. So the problem is not getting huge amounts of data into my
datagrid; the problem is the query itself. Maybe each thread returns only
about 10 hits, but I do not know how to "union" all the DataTables returned
from my threads.

christian

Apr 5 '06 #3

Hi Christian,

"Christian Maier" <to*******@freesurf.fr> wrote in message
news:11*********************@g10g2000cwb.googlegroups.com...
> I have huge unperformant "where like" querys to a mysql DB. [...] I hold
> 10 connections to my database and each connection queries in its own
> thread [...]


In my opinion you have to find out first what causes this performance problem!

Is it:
- your app (the way it handles the resulting DataTables, etc.)
- the query
- the database connection (drivers, network, etc.)
- the database itself (old or buggy, heavy load, etc.)

If you know what causes it, you'll have a much better chance of solving it.

If something is wrong with the query, the database, or the database connection,
then I don't think querying it with multiple threads will make anything faster.

Just ideas....

Greets,
Kurt
Apr 5 '06 #4

I have found that if you don't invoke DataTable.Clear, the DataAdapter will
append data to the table on each Fill. Interestingly, however, if the same
records come back, the existing rows get refreshed rather than duplicated. If
this doesn't work for you, try creating the DataSet as a file in the project
and loading the data into it; that's how I do it.
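The append-on-Fill behaviour described above can be sketched like this (a sketch assuming MySQL Connector/Net; `rangeQueries`, `connectionString`, and `myGrid` are illustrative names):

```csharp
// One DataTable accumulates the rows from every per-range query.
DataTable combined = new DataTable();

foreach (string rangeSql in rangeQueries)  // the 10 "BETWEEN ... AND ... LIKE" statements
{
    using (MySqlConnection conn = new MySqlConnection(connectionString))
    using (MySqlDataAdapter adapter = new MySqlDataAdapter(rangeSql, conn))
    {
        // Without combined.Clear(), each Fill appends its rows to the table.
        adapter.Fill(combined);
    }
}
myGrid.DataSource = combined;
```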

"Christian Maier" <to*******@freesurf.fr> wrote in message
news:11*********************@g10g2000cwb.googlegroups.com...
> Now the problem is: How can I put the 10 datatable objects to one huge
> object to bind to my grid object? [...]

Apr 6 '06 #5


+Vice wrote:
> I have found that if you don't invoke Table.Clear then the DataAdapter will
> append data to it. [...]


Cool, thank you for the hint!

Christian

Apr 6 '06 #6

> In my opinion you have to find out what causes this performance problem!

Yes, I did!

> - your app (the way it handles the resulting datatables, etc.)
> - the query
> - the database connection (drivers, network, etc.)
> - the database itself (old or buggy, heavy load, etc.)

It is the query: if you select * where foo like '%something%', every index on
foo is useless! Only if you query where foo like 'something%' can the query be
tuned with an index.

When I select * where p_key between 1 and 10000 and foo like '%something%',
the optimizer goes through the primary key. But you are right, the boost is
only in the milliseconds range.

I tried to build a hash table, but as you can imagine the database is not
allowed to allocate enough memory for that.

I am thinking about selecting the whole data into a text file and doing an
InStr() to find the lines I need, but the file will be at least 50 MB, and I
do not know whether that might be faster.
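The index point above can be sketched directly in MySQL (a sketch assuming a plain B-tree index on fname; EXPLAIN shows whether the index can actually be used):

```sql
CREATE INDEX idx_fname ON root (fname);

-- Prefix match: the B-tree index on fname can be used.
EXPLAIN SELECT fname FROM root WHERE fname LIKE 'somestring%';

-- Leading wildcard: the index cannot help, so MySQL scans the rows.
EXPLAIN SELECT fname FROM root WHERE fname LIKE '%somestring%';
```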

thx!
Christian

Apr 6 '06 #7
