
Application Performance advice please?

Hello all,

I've been recruited to assist in diagnosing and fixing a performance problem
on an application we have running on SQL Server 7.
The application itself is third-party software, so we can't get at the source
code. It's a client management system in which consultants all over the
country track their client meetings, results, action plans, etc., and it has
apparently been problematic for a long time now. I came into this
investigation in mid-stream, but here's the situation as I understand it:

We have users reporting it's slow, with no discernible pattern with respect
to which part of the application they're using or the particular time of day.
I am told that it doesn't appear to be a bandwidth or machine resource
problem. They apparently added two app servers a year or so ago, which
temporarily improved performance. We're using a nominal percentage of CPU
and memory.

There are three large tables (approx. 8 million rows) that are queried often,
as users click to see their calendar of appointments or review past meetings
with a client, etc. The activity on these tables is over 90% reads (SELECTs)
with about 10% INSERTs/UPDATEs. We have attempted to run the Index Tuning
Wizard twice, but so far it just seems to hang (it could be that the workload
file is too big?). So, what we're doing now is isolating the SELECT statements
that take a long time to run and manually comparing them to the indexes that
exist on these large tables. Since we can't alter the SQL source code,
we're trying to alter the indexes to improve performance.
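
(For context, something along these lines in Query Analyzer will list the
existing indexes we're comparing against the WHERE/JOIN columns of the slow
SELECTs; the table name below is a placeholder, not the real one.)

-- Indexes on one of the large tables (placeholder name)
EXEC sp_helpindex 'ClientMeetings'

-- Or pull index names for all user tables from the system tables (SQL Server 7)
SELECT o.name AS table_name, i.name AS index_name
FROM sysobjects o
JOIN sysindexes i ON i.id = o.id
WHERE o.type = 'U'
  AND i.indid BETWEEN 1 AND 250          -- skip the heap entry (0) and text/image (255)
  AND i.name NOT LIKE '[_]WA[_]Sys%'     -- skip auto-created statistics
ORDER BY o.name, i.indid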

What I would like to know is, is there a good way to get benchmark
measurements so we can explicitly measure any performance changes? Also, do
you think we're going about this the right way, or is there some other avenue
we could be looking at to improve performance?

I recognize that performance questions are tricky to post/answer in a
newsgroup, because usually you need more information than is provided. The
problem is that this is a high-profile investigation (they're hauling us into
meetings every two days to report our progress) and I need to be able to
convincingly state that we have either improved performance by X%, or that it
is the application itself that's the problem and we're stuck with it.

Any thoughts would be deeply appreciated.

Thanks and best regards,

Steve
Jul 20 '05 #1

"Steve_CA" <st********@yah oo.com> wrote in message
news:35******** ************@ne ws20.bellglobal .com...

Your first stop should probably be Profiler, where you can gather lots of
information about the TSQL being executed on the server, along with
durations, I/O cost, query plans, etc. And of course Perfmon for checking if
MSSQL is hitting I/O or CPU limits - if you've just joined the investigation,
you should probably satisfy yourself about that, especially if you're now the
person more or less responsible for resolving the situation.
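
For the benchmarking question specifically, a crude but repeatable approach
is to take one of the slow statements captured in the trace and run it in
Query Analyzer with the statistics options switched on, before and after each
index change. A minimal sketch - the SELECT below is a made-up placeholder,
not a real query from your app:

SET STATISTICS IO ON     -- logical/physical reads per table
SET STATISTICS TIME ON   -- parse/compile and execution times
GO

-- paste a captured slow statement here; this one is just a placeholder
SELECT *
FROM dbo.ClientMeetings
WHERE ConsultantID = 1234
  AND MeetingDate BETWEEN '20040101' AND '20041231'
GO

SET STATISTICS IO OFF
SET STATISTICS TIME OFF
GO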

If it's a third-party app, then it's going to be awkward to find a
resolution, as you say. Indexes are probably the only thing you can change,
and even then you might find you've invalidated your support agreement by
doing so. But certainly, gathering information and establishing where the
bottleneck is (if there is one) should be the first step.

Simon
Jul 20 '05 #2
On Mon, 11 Oct 2004 08:34:12 -0400, Steve_CA wrote:
We have users reporting it's slow, with no discernible pattern with respect
to which part of the application they're using or the particular time of day.


Hi Steve,

In addition to Simon's suggestion, I'd look into possible locking problems
as well. Just add locking events to the Profiler trace already suggested
by Simon.
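
A quick spot check while users are actually complaining: the standard system
procedures will show whether sessions are waiting on each other. Something
like this should work as-is on 7.0:

EXEC sp_who    -- the "blk" column shows the spid that is blocking a session
EXEC sp_lock   -- lists the locks currently held or requested, by spid

-- or query sysprocesses directly for sessions that are currently blocked
SELECT spid, blocked, waittime, lastwaittype, cmd
FROM master.dbo.sysprocesses
WHERE blocked <> 0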

Best, Hugo
--

(Remove _NO_ and _SPAM_ to get my e-mail address)
Jul 20 '05 #3
Can you capture some of the queries that run slowly? I imagine that a
number of them will consistently perform slowly. If so, try running
these through Query Analyzer (QA) and look at the execution plan. It should
indicate where the performance hit is. You may find that the cause is table
scans, etc.
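
If you'd rather script that than click through the graphical plan in QA,
SHOWPLAN_TEXT returns the estimated plan without executing the statement;
the SELECT below is just a placeholder, not one of your real queries:

SET SHOWPLAN_TEXT ON
GO

-- paste one of the captured slow SELECTs here (placeholder shown)
SELECT * FROM dbo.ClientMeetings WHERE ConsultantID = 1234
GO

SET SHOWPLAN_TEXT OFF
GO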

Also, you should run the index analysis against these 'common' queries
to see if it comes up with any suggestions.

If it's locking/blocking then
http://www.sommarskog.se/sqlutil/aba_lockinfo.html will likely be of
some help

Finally, a bit of a long shot, as it sounds similar to a problem we
had.

Are any views used? Can these be improved by swapping to tables
populated by stored procedures? You can keep the naming the same but
swap them over. To give you an example, we had 4 views which were
referenced by a final view (all were complex). Our app needed to look
at the data, but I didn't want to change the code. I swapped the view
to a table which was generated from a copy of the original view by an
SP. This saved a lot of time (1 minute at startup down to 5 seconds),
with the SP being run every 10 minutes on the server. This, combined with
better indexing, is saving them an hour a day. OK, it's a
quick overview, but I'd see if you can get away with any little tricks
like this (especially if in certain circumstances you only need to
read the data). Might be worth a try.

Ryan


Jul 20 '05 #4
Thank you all,

Ryan, your suggestion regarding views is very relevant... one of the main
culprits is a commonly used view, and I thought we were dead in the water
with version 7 (I know you can create indexes on views in 2000).

"Ryan" <ry********@hot mail.com> wrote in message
news:78******** *************** **@posting.goog le.com...
Can you capture some of the queries that run slowly ? I imagine that a
number of them will consistently perform slowly. If so, try running
these through QA and look at the execution plan. It should indicate
where the performance hit is. You may find that this is table scans
etc...

Also, you should run the index analysis against these 'common' queries
to see if it comes up with any suggestions.

If it's locking/blocking then
http://www.sommarskog.se/sqlutil/aba_lockinfo.html will likely be of
some help

Finally, a bit of a long shot as it sounds similar to a problem we
had.

Are any views used ? Can these be improved by swapping to tables
populated by stored procedures ? You can keep the naming the same but
swap them over. To give you an example, we had 4 views which were
referenced by a final view (all were complex). Our app needed to look
at the data, but I didn't want to change the code. I swapped the view
to a table which was generated by a copy of the original view from an
SP. This saved a lot of time (1 min from startup down to 5 seconds)
with the SP being run every 10 mins on the server. This combined with
better indexing and it's saving an hour a day for them. OK, it's a
quick overview, but I'd see if you can get away with any little tricks
like this (especially if in certain circumstances you only need to
read the data). Might be worth a try.

Ryan

Hugo Kornelis <hugo@pe_NO_rFa ct.in_SPAM_fo> wrote in message

news:<gh******* *************** **********@4ax. com>...
On Mon, 11 Oct 2004 08:34:12 -0400, Steve_CA wrote:
We have users reporting it's slow, with no discernable pattern with respectto what part of the application they're using or now particular time of
day.
Hi Steve,

In addition to Simon's suggestion, I'd look into possible locking problems as well. Just add locking events to the Profiler trace already suggested
by Simon.

Best, Hugo

Jul 20 '05 #5
Steve,

Got your e-mail (replied directly as well). Essentially, that's what I
meant. Posting in NG in case it helps someone else later.

The scenario that we had with this was that we had a view which gathered
the data from 4 other views. Each of the 4 underlying views was a
little complex and took a while to run. Combined in the ‘top level’
view, the performance was slow. The application ran a simple summary
query of the top level view when opening. This took about 50 seconds to
run, which was unacceptable to the users.

Imagine… (obviously change the names to something more meaningful)

myView1, myView2, myView3, myView4 all feed into myTopView. The
application looks to query the object myTopView.

I renamed the view from myTopView to myTopViewFull.

I created a table called myTopViewTable, with the same structure as the
result set returned by the original view (myTopView).

Then I created a view called myTopView (same name as my original top
level view) which pointed to the table myTopViewTable instead of the 4
lower level views. Simple ‘select *’ will do it.

All I had to do was a very simple SP which truncated the table
myTopViewTable and inserted the data from myTopViewFull. This still takes
a while to run, but the hit is taken only on the server, so to the users it
appears to run much quicker. Instead of a bunch of users all doing the same
thing at a 50-second cost to each of them, we only have a 50-second cost on
the server every 10 minutes.

For our needs, it doesn’t matter for this data if we update it every 10
minutes, but the difference to the users was quite noticeable. The time
to open the application dropped from 50 seconds plus to consistently
lower than 5 seconds. I did add some indexes to the main tables
(referenced by the view) to try and speed things up and it helped a
little. I found more performance improvements in the application though
as a result.

The main benefit of this is shifting the perceived cost of the work from the
users to the server. You can also try NOLOCK table hints on the SELECT
statements to reduce locking/blocking issues, since you are taking the data
into a table directly anyway.

Also, because the application only ever references the object name, and a
table and a view cannot share a name (so you simply swap one out for the
other), you can sometimes get a bit of a performance advantage this way when
a view runs far too slowly. It's crude, but it works provided you apply it
correctly. No changes to the application should be needed.
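
To make the steps concrete, here is a rough, untested sketch of the swap
using the placeholder names above; the refresh SP would be scheduled as a
job (e.g. every 10 minutes):

-- 1. Keep the original (slow) view under a new name
EXEC sp_rename 'myTopView', 'myTopViewFull'
GO

-- 2. Materialise its current result set into a table (gives structure and data)
--    (on SQL 7 the 'select into/bulkcopy' database option may need to be on)
SELECT * INTO myTopViewTable FROM myTopViewFull
GO

-- 3. Re-create the name the application queries, now pointing at the table
CREATE VIEW myTopView AS SELECT * FROM myTopViewTable
GO

-- 4. Simple refresh procedure to run on a schedule
CREATE PROCEDURE dbo.RefreshMyTopView
AS
BEGIN
    TRUNCATE TABLE myTopViewTable
    INSERT INTO myTopViewTable
    SELECT * FROM myTopViewFull
END
GO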

A simple test: dump the data from the view into a table, then try a query
that you know takes a while against the view and against the table, and see
if there is a performance improvement. Oh, and you could try indexes on
the new table, provided you don't take a hit re-building them.

Indexed views are possibly another option though.

Hope that helps

Ryan

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!
Jul 20 '05 #6
