
UPDATE STATISTICS necessary to improve performance (?)

Dear Sql Server experts:

First off, I am no sql server expert :)

A few months ago I put a database into a production environment.
Recently, it was brought to my attention that a particular query that
executed quite quickly in our dev environment was painfully slow in
production. I analyzed the plan on the production server (it
looked good), and then tried quite a few tips that I'd gleaned from
reading newsgroups. Nothing worked. Then on a whim I performed an
UPDATE STATISTICS on a few of the tables that were being queried. The
query immediately went from executing in 61 seconds to under 1 second.
I checked to make sure that statistics were being "auto updated" and
they were.

Why did I need to run UPDATE STATISTICS? Will I need to again?
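
For reference, the statements in question look roughly like this (SQL Server
2000 syntax; "MyDatabase" and "dbo.BigTable" are placeholder names, not the
real ones):

    -- Check that auto create/update statistics are enabled for the database.
    SELECT DATABASEPROPERTYEX('MyDatabase', 'IsAutoCreateStatistics') AS AutoCreate,
           DATABASEPROPERTYEX('MyDatabase', 'IsAutoUpdateStatistics') AS AutoUpdate

    -- Refresh the statistics on one table with the default sampling rate.
    UPDATE STATISTICS dbo.BigTable

    -- See when the statistics behind each index were last updated.
    SELECT name, STATS_DATE(id, indid) AS LastUpdated
    FROM sysindexes
    WHERE id = OBJECT_ID('dbo.BigTable')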

A little more background info:
The database started empty, and has grown quite rapidly in the last
few months. One particular table grows at a rate of about 300,000
records per month. I get fast query times due to a few well placed
indexes.

A quick question:
If I add an index, do statistics get automatically updated for this
new index immediately?

Thanks in advance for any help,

Felix
Jul 20 '05 #1
In article <80**************************@posting.google.com>,
fe*************************@yahoo.com says...
> A few months ago I put a database into a production environment.
> Recently, it was brought to my attention that a particular query that
> executed quite quickly in our dev environment was painfully slow in
> production. I analyzed the plan on the production server (it
> looked good), and then tried quite a few tips that I'd gleaned from
> reading newsgroups. Nothing worked. Then on a whim I performed an
> UPDATE STATISTICS on a few of the tables that were being queried. The
> query immediately went from executing in 61 seconds to under 1 second.
> I checked to make sure that statistics were being "auto updated" and
> they were.


I've seen this type of thing many times: developers are not exposed to
the production system long enough to build a maintenance plan, and the
DBA doesn't get enough time to monitor the DB to build a maintenance
plan specific to the database.

In most cases, I create generic maintenance plans that will auto-select
20% of the tables per night and reindex them, and do the same with marking
stored procedures for recompile.

If you do something like this, or if you just reindex/recompile them all
on a weekend, you should be able to maintain your performance.

Just so you know, this is a problem in Oracle too - I had a team of
developers build an inventory system that worked great for almost a year, but
no one really noticed it getting slower every day until the reports
started failing. They tried for three days to fix it before calling me.
My first clue was that there was no scheduled maintenance plan, the second
that it had worked for almost a year, and the third that there were about 20K
inserts into a single table per day with no deletes.... Reindexing two tables
returned it to the same level of performance as when it was developed.

Also, when you mark a sproc for recompile, the first time it executes it
will be SLOW. While tables are being reindexed they will also be slow, or
could even be locked, so you want to do this during off-hours.
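
A rough sketch of that kind of nightly job, assuming SQL Server 2000 and
leaving out the 20%-per-night selection logic (this simply walks every user
table):

    DECLARE @tbl sysname

    DECLARE tbls CURSOR FOR
        SELECT name FROM sysobjects WHERE xtype = 'U'

    OPEN tbls
    FETCH NEXT FROM tbls INTO @tbl
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Rebuild all indexes on the table (the table may be locked meanwhile).
        DBCC DBREINDEX (@tbl)
        -- Mark dependent stored procedures for recompilation on next use.
        EXEC sp_recompile @tbl
        FETCH NEXT FROM tbls INTO @tbl
    END
    CLOSE tbls
    DEALLOCATE tbls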

--
sp*********@rrohio.com
(Remove 999 to reply to me)
Jul 20 '05 #2
Felix (fe*************************@yahoo.com) writes:
> A few months ago I put a database into a production environment.
> Recently, it was brought to my attention that a particular query that
> executed quite quickly in our dev environment was painfully slow in
> production. I analyzed the plan on the production server (it
> looked good), and then tried quite a few tips that I'd gleaned from
> reading newsgroups. Nothing worked. Then on a whim I performed an
> UPDATE STATISTICS on a few of the tables that were being queried. The
> query immediately went from executing in 61 seconds to under 1 second.
> I checked to make sure that statistics were being "auto updated" and
> they were.
>
> Why did I need to run UPDATE STATISTICS? Will I need to again?

If you only performed UPDATE STATISTICS it seems a little funny, given
that auto-update statistics is on. If you added WITH FULLSCAN, then there
is a possible explanation.

Whether you will need to do it again, I cannot tell, but as Leythos
discusses, the most critical part is when you start empty and the volume
grows. Once you are over a certain level, a lot of execution plans
become moot.

If the query was parameterized - including auto-parameterized or in
a stored procedure - there is another possible explanation: that the
cached plan was for an atypical parameter. Recall that when SQL Server
builds a query plan for a procedure, it uses the current values of
the parameters to build the plan. If the value used happens to be an
atypical one, then a bad plan may linger and be used for common values.

If this is the case, an sp_recompile on a table involved in the query
will do.
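
For reference, the two statements mentioned above look like this
("dbo.BigTable" is a placeholder table name):

    -- Read every row when rebuilding the statistics, instead of a sample.
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN

    -- Flag all cached plans that reference the table for recompilation
    -- the next time they are used.
    EXEC sp_recompile 'dbo.BigTable'
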
> A quick question:
> If I add an index, do statistics get automatically updated for this
> new index immediately?


Yes, as I recall, SQL Server creates statistics when it builds an
index.
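
If you want to verify that, something like this lists the statistics on a
table and shows the histogram behind one index ("dbo.BigTable" and the index
name are placeholders):

    -- List all statistics on the table, including those created with indexes.
    EXEC sp_helpstats 'dbo.BigTable', 'ALL'

    -- Inspect the histogram behind one particular index.
    DBCC SHOW_STATISTICS ('dbo.BigTable', 'IX_BigTable_SomeColumn')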
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #3
The query was not parameterized, nor in a stored procedure.

And I did not run UPDATE STATISTICS WITH FULLSCAN.

What do you mean by "Once you are over a certain level, a lot of
execution plans become moot"?

Anyway, I remain puzzled.

Thanks,
Felix
Jul 20 '05 #4
Felix (fe*************************@yahoo.com) writes:
> The query was not parameterized, nor in a stored procedure.

SQL Server can do autoparameterization, although I believe this
mainly happens for simple queries.

> What do you mean by "Once you are over a certain level, a lot of
> execution plans become moot."?


Say that you have a query which involves two tables. One table is a
Products table, and one is an Orders table. When you start the system,
you have a lot of products, but you have no orders. But after running
the system for six months, you have ten times more orders than products,
and this ratio is only going to increase. Thus, all plans that are built
on the assumption that the Orders table is smaller are no longer of
interest.

--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #5

Ah, thanks. That makes a lot of sense. But since the stats are being
auto updated (in the default case and in my case), this should not be
a problem, right?
Jul 20 '05 #6
Felix (fe*************************@yahoo.com) writes:
> Ah, thanks. That makes a lot of sense. But since the stats are being
> auto updated (in the default case and in my case), this should not be
> a problem, right?


Right. For this typical case that I outlined, SQL Server handles it,
and you are not likely to see any problems.

But then there might be more sensitive cases. Say that you have a complex
query with many tables involved. A minor error in the statistics of one of
the innermost tables in the query can lead to gross errors in the
estimates. And, of course, there are cases where even with completely
accurate statistics the optimizer will go astray.
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #7
In article <80*************************@posting.google.com>,
felix666007_n**************@yahoo.com says...
> Ah, thanks. That makes a lot of sense. But since the stats are being
> auto updated (in the default case and in my case), this should not be
> a problem, right?


A good example of AUTO for things is standard MS drive fragmentation -
if you leave it to the OS your drive will stay fragmented, and the same goes
if you use DiskKeeper: there will still be some fragmentation. If you take the
drive offline you can fully defragment it.

This same thought holds true for stats - I've seen hundreds of servers
that benefited from having tables manually reindexed, stored procs
recompiled, etc....

While the automation works well, it leaves a lot to be desired. You can
write your own scripts to automate the process of reindexing,
recompiling, etc., and then schedule them on a nightly basis (do about
20% of the objects each evening if possible).
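
One way to decide which tables are actually worth the effort on a given night
is to check fragmentation first ("BigTable" is a placeholder table name):

    -- Quick fragmentation report for one table (SQL Server 2000).
    DBCC SHOWCONTIG ('BigTable') WITH FAST

    -- A low scan density or high logical scan fragmentation in the output
    -- suggests the table is a good candidate for DBCC DBREINDEX that night.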

In one system, where they insert about 500,000 new records a day, the
system had become overly slow; a nightly reindex on the common insert tables
brought performance back to the level of the development days.
--
sp*********@rrohio.com
(Remove 999 to reply to me)
Jul 20 '05 #8
Erland Sommarskog (so****@algonet.se) writes:
> Felix (fe*************************@yahoo.com) writes:
>> Ah, thanks. That makes a lot of sense. But since the stats are being
>> auto updated (in the default case and in my case), this should not be
>> a problem, right?
>
> Right. For this typical case that I outlined, SQL Server handles it,
> and you are not likely to see any problems.
>
> But then there might be more sensitive cases. Say that you have a complex
> query with many tables involved. A minor error in the statistics of one of
> the innermost tables in the query can lead to gross errors in the
> estimates. And, of course, there are cases where even with completely
> accurate statistics the optimizer will go astray.


And, oh, just because the statistics are updated does not mean that the
query plan is. I had a case recently where this happened. It's not really
a usual situation, because I'm running a long job that converts data
from a competitor's system to our system. I'm running eight years' worth
of transactions in one long stored procedure which runs for days. The
job runs by handling each business day there has been over the years,
one at a time. Anyway, there was one procedure in the job which for a
long time ran with nothing to do, as there was no data to convert, but
from one day on there is data every day. Over time the procedure went from
a few seconds to half a minute. With an sp_recompile on the table that
this procedure loads, the procedure quickly fell back to a few seconds.
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #9
Leythos (vo**@nowhere.com) writes:
> This same thought holds true for stats - I've seen hundreds of servers
> that benefited from having tables manually reindexed, stored procs
> recompiled, etc....


But reindexing is another thing. There is no autoreindex, so of course
running reindex manually may have some effect.
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #10
