Bytes IT Community

DB2 Tools/tuning to help performance over high network latency sites?

Hello again all..
We have a giant application from a giant software vendor that has very
poor SQL.
It's a PLM CAD application that makes a call to the DB for every CAD
node in the assembly.
So for an assembly with many parts, from remote locations, the
performance goes out the window.
Locally it all works fine because network latency is <1ms, but at some
of our remote sites (which are growing) latency can be as high as
80-100ms.
We've opened many PMRs with the software vendor, but they're largely
unwilling to help, saying our expectations are too high. Nice..
Anyhow, what we notice when we put traces/network sniffers on to try
and figure out where the slowdown is coming from, are many many calls
to the DB with small, simple SQL. Each call is initiated in its own
bubble and is thus subjected to the 80ms wait. The bandwidth is fine;
it's the latency that is killing it.
We've done many things to try and improve the load times, but with
little success.

Local assembly load time: 12s
Current Remote Assembly load time: 59s

The customer would like to see times in the 20s to be happy.
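For context, the arithmetic behind those numbers: when every query is a separate synchronous round trip, total time is roughly (number of queries) x (round trip + server time). A back-of-envelope sketch in Python; the query count and per-query server time below are assumptions for illustration, not measured values:

```python
# Back-of-envelope model: total load time when each query is a
# separate synchronous round trip over a high-latency link.

def load_time(num_queries: int, rtt_s: float, server_time_s: float) -> float:
    """Each query pays one network round trip plus server execution time."""
    return num_queries * (rtt_s + server_time_s)

# Hypothetical numbers: ~600 small queries per assembly load,
# ~20ms of server-side work per query (both assumed).
queries = 600
server_time = 0.02

local = load_time(queries, 0.001, server_time)   # <1ms LAN round trip
remote = load_time(queries, 0.080, server_time)  # 80ms WAN round trip

print(f"local:  {local:.1f}s")   # roughly the observed 12s
print(f"remote: {remote:.1f}s")  # roughly the observed 59s
```

With assumed numbers in that ballpark, the model reproduces the observed 12s local vs ~60s remote split, and shows why only reducing round trips (not bandwidth or server tuning) moves the remote number.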

What I'm wondering is, are there any tools or tricks out there that
could help us? Something to collect/intercept SQL and send a batch of
statements at once, say at a few-second interval, instead of bombarding
the server with hundreds of individual calls? Tuning? Anything we can
do DB2-wise to help it?

Any help is much appreciated.
Thanks

Ken

Sep 26 '07 #1
4 Replies


73********@centurytel.net wrote:
[original post snipped]
I don't think it's that simple.
There are means to do batching but they have to be initiated by the
application.
The problem is that, fundamentally, the app is unlikely to send a new SQL
statement before it has received the results of the previous one. So I
doubt that problem is solvable transparently to the application.

Do you have an IBM rep whom you can engage? Perhaps throwing around a
bit more weight will help to get your app vendor's attention.
If the app vendor needs help tuning, it wouldn't be the first time
we've helped with that...

Cheers
Serge
--
Serge Rielau
DB2 Solutions Development
IBM Toronto Lab
Sep 26 '07 #2

On Sep 26, 11:38 am, Serge Rielau <srie...@ca.ibm.com> wrote:
[quoted reply snipped]
I am an IBM rep. Well, I'm a tech working with many sales reps, and
no, we can't bend the app vendor. We've been trying. I've shown them
where a simple operation takes 837 queries to the DB when I could
get the same results in fewer than 10.

I've been looking into DB2 Q Replication; it seems like that might be
able to help us. I've looked at SQL replication before, but the
snapshots are too heavy to run constantly on the database master, so
it ends up not really being real time. The new Q Replication stuff
looks very interesting. Do you know why they say Q Replication is not
so good in a hub-and-spoke setup? We have one master location and 6
remote sites we'd like to use this at. Remote sites mostly read, but
would also need to write. The only issue that really jumps out at me
is the number of failure points in the whole setup: WebSphere, its
database, MQ Series queues, app server, DB server... it adds a lot of
complexity, but if it gives us the response we need, it might be worth
it.

Sep 26 '07 #3

73********@centurytel.net wrote:
[snip]
*sigh*
I've been looking into DB2 Q Replication,
ST or email me and I'll hook you up with Q-Rep skills.

Cheers
Serge
--
Serge Rielau
DB2 Solutions Development
IBM Toronto Lab
Sep 26 '07 #4

Ken,

You might also take a look at GRIDSCALE. I'm an ITS at xkoto, and I
can provide you with technical information on the product. It can also
be configured for local reads with remote writes. This may be simpler
to implement and maintain than Q-Rep in a hub-and-spoke architecture.

Paul LaPointe,
Systems Engineer,
xkoto Inc.
E: pa***********@xkoto.com

W: www.xkoto.com
xkoto is a winner of the 2007 Deloitte Fast 50 Companies-to-Watch
award http://www.xkoto.com/pressrelease/070920-deloitte.php
On Sep 26, 3:47 pm, Serge Rielau <srie...@ca.ibm.com> wrote:
[quoted reply snipped]

Sep 28 '07 #5
