Bytes | Software Development & Data Engineering Community
Statistics IO logical reads sometimes 15 million and other times down to 90 thousand?

I am running a query in SQL 2000 SP4, Windows 2000 Server,
that is not being shared with any other users or any other SQL
connections. The db involves a lot of tables, JOINs, LEFT JOINs,
UNIONs, etc. OK, it's not pretty code, and my job is to make it
better.

But for now, one thing I would like to understand with your help is
why the same SP, on the same server, with everything the same and
without me changing anything at all in SQL Server (configuration,
code, ...), runs in Query Analyzer in 1:05 minutes, and I see one
table get hit with 15 million logical reads:

Table 'TABLE1'. Scan count 2070, logical reads 15516368,
physical reads 147, read-ahead reads 0.

This 'TABLE1' has about 400,000 records

The second time I ran it, right afterwards, in Query Analyzer:

Table 'TABLE1'. Scan count 2070, logical reads 15516368,
physical reads 0, read-ahead reads 0.

I can see the physical reads are now 0, which is understandable:
SQL Server is now fetching the data from memory.
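For anyone reading along, the numbers above come from turning on
per-session I/O statistics in Query Analyzer. A minimal sketch, where
dbo.MyBigProc is a placeholder for the actual stored procedure (which
isn't named in this thread):

```sql
-- Enable per-table I/O and timing statistics for this session.
SET STATISTICS IO ON
SET STATISTICS TIME ON
GO

-- dbo.MyBigProc is a placeholder for the real stored procedure.
EXEC dbo.MyBigProc
GO

-- The Messages tab then reports one line per table touched, e.g.:
-- Table 'TABLE1'. Scan count 2070, logical reads 15516368,
-- physical reads 147, read-ahead reads 0.
```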

But now, the third time I ran it:

Table 'TABLE1'. Scan count 28, logical reads 87784,
physical reads 0, read-ahead reads 0.

The scan count went down from 2070 to 28. I don't actually know what
the scan count is. Did it scan the table 28 times?
The logical reads went down from 15 million to 87,784, with a
2-second execution time!

Does anybody have any idea why these numbers changed?

The problem is that I tried various repeats of my test: I rebooted
the SQL Server, dropped the database, restored it, and ran the same
exact query, and it took 3-5 seconds with 87,784 reads vs. 15 million.

Why don't I see 15 million now?

Well, I kept working during the day, and I happened to run into
another stretch of 15 million reads. A few runs kept going at the
pace of 15 million reads in over 1 minute, and eventually the numbers
went back down to 87,784 and 2 seconds.

Is it the way I'm using the computer? Maybe I was opening too many
applications and SQL Server was fighting for memory? Would that
explain the 15 million reads?
I went and changed my SQL Server to use a fixed memory of 100 MB,
restarted it, and tested the same query again, but it continued to
show 87,784 reads with a 2-second execution time.

I opened all kinds of applications, redid the same test, and I was
never able to see 15 million reads again.

Can someone suggest what might be causing this, and whether there is
a way to make the 15 million reads show up again?

By the way, with the limited info you have here about the database I
am using: is 87,784 reads a terrible number of reads, average, or
normal, when the largest of the many tables involved in this SP has
400,000 records?

I am guessing it is a terrible number; am I correct?
I would appreciate your help.

Thank you

Mar 4 '06 #1
I can add that I have 1.5 GB of RAM.
Everything is on the local server, and I think 13 rows are being
returned.
Mar 4 '06 #2
serge (se****@nospam.ehmail.com) writes:
I can see now the physical reads being 0 as it is
understandable that SQL is now fetching the data from
memory.

But now the third time I ran:
Table 'TABLE1'. Scan count 28, logical reads 87784,
physical reads 0, read-ahead reads 0.

The Scan count went down from 2070 to 28. I don't
know what the Scan count is actually. It scanned the
table 28 times?
Or at least accessed. Say that you run a query like:

SELECT SUM(OD.UnitPrice * OD.Quantity)
FROM Orders O
JOIN [Order Details] OD ON O.OrderID = OD.OrderID
WHERE O.CustomerID = N'VINET'

This gives the following statistics IO:

Table 'Order Details'. Scan count 5, logical reads 10, physical reads 0,
read-ahead reads 0.
Table 'Orders'. Scan count 1, logical reads 2, physical reads 0,
read-ahead reads 0.

There are five orders for VINET in Northwind..Orders. SQL Server
seeks the index on Orders.CustomerID, and for every match it looks up
the order in Order Details. Not by a scan, but by seeking the index.
But the output from STATISTICS IO does not make that distinction.

Note that this is what happens with nested-loops joins. Had the join
been implemented as a merge or a hash join, the scan count would be 1
for both tables - but then it would also truly have been a scan.
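One way to see this for yourself (a sketch against Northwind; the
exact counts depend on the plan the optimizer actually picks) is to
force the join strategy with a hint and compare the STATISTICS IO
output:

```sql
-- Forcing a hash join: [Order Details] should now report Scan count 1
-- (one true scan of the table), instead of one "scan" per qualifying
-- order row as in the nested-loops plan above.
SELECT SUM(OD.UnitPrice * OD.Quantity)
FROM Orders O
JOIN [Order Details] OD ON O.OrderID = OD.OrderID
WHERE O.CustomerID = N'VINET'
OPTION (HASH JOIN)
```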
The logical reads went down to 87,784 reads from 15
million and 2 seconds execution time!

Anybody has any ideas why this number change?
My guess is auto-statistics. When you run queries that scan a lot of
data, SQL Server takes the opportunity to sample statistics about the
data. Once those statistics are available, the optimizer may find a
better way to implement the query.

If you were to create a new table and move the data over to that
table, my prediction is that you would see the same result: first 15
million reads a few times, and then a reduction to 87,000 reads. If
you also study the query plan, you will see that it changes.
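To capture the two plans for comparison in Query Analyzer, the text
showplan works well (note that SET SHOWPLAN_TEXT must be the only
statement in its batch):

```sql
SET SHOWPLAN_TEXT ON
GO
-- Run the query here; SQL Server returns the estimated plan as text
-- instead of executing the statement.
GO
SET SHOWPLAN_TEXT OFF
GO
```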
Is it my way of using the computer? Maybe i was opening
too many applications, SQL was fighting for memory?


No, that has nothing to do with it.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Mar 4 '06 #3
Thanks Erland, I did learn a few more things today after trying the
things you said. Plus I also remembered one of my other posts where
you suggested to compare the execution plan in text format.
This is something I'll be doing while I continue my understanding
of the execution plan in my query.
If you were to create a new table and move the data over to that
table, my prediction is that you would see the same result: first 15
million reads a few times, and then a reduction to 87,000 reads. If
you also study the query plan, you will see that it changes.
I did test this, but I started getting reads of 100,000 instead of
15 million. I can't say I did a clean job, though. I did a

SELECT *
INTO TABLE2
FROM TABLE1

then dropped the foreign keys of TABLE1, dropped TABLE1, and renamed
TABLE2 to TABLE1.

Then I re-ran my query and got 100,000 reads.

Anyhow, what I found afterwards is, like you said, statistics related.
My guess is auto-statistics. When you run queries that scan a lot of
data, SQL Server takes the opportunity to sample statistics about the
data. Once those statistics are available, the optimizer may find a
better way to implement the query.


This time I restored the same db on my second SQL instance on my
machine. I ran the query dozens of times and kept getting 15 million
reads in 1+ minute.
I left the machine idle for a few hours, came back, re-ran the query,
and got the same 15 million reads...
Then I immediately ran

UPDATE STATISTICS dbo.TABLE1 WITH FULLSCAN

and re-ran the query, and it took 2 seconds and 87 thousand reads!

OK, I can tell you that yesterday, on my first SQL instance, I ran
the Database Maintenance Plan wizard and chose to update the stats
using 50% sample data. And that explains why my queries were running
in 2 seconds. But I still can't understand why my queries kept
running in 2 seconds if I was dropping the database and restoring it
brand new! I thought the stats info would be stored inside the
database. It's almost like either the stats info, or maybe the
execution plan, is being stored in the master database of my SQL
Server??
I just did another test on my second SQL Server instance. I dropped
the db, restored it again, and ran my query: it took 10 seconds to
execute. I ran it again, and it took 2 seconds and 87 thousand reads.
So it looks to me like something is being stored in the master
database of my SQL Server instance; otherwise, why is it not taking
15 million reads anymore?
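For what it's worth, statistics are stored inside the user database
itself (on SQL Server 2000 they live in sysindexes), so they are
carried along by backup and restore, not kept in master. One way to
check what a restore brought back (the statistics name below is a
placeholder for one that actually exists on TABLE1):

```sql
-- Show the header, density and histogram for one statistics object.
-- The "Updated" column in the first result set tells you when the
-- statistics were last built; a date older than the restore means
-- they came back with the database.
DBCC SHOW_STATISTICS ('dbo.TABLE1', IX_TABLE1)
```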

Thank you


Mar 5 '06 #4
Even when I restored the same db using a different db name and
different physical file names, I ran my query and the first time it
took 8 seconds, then 2, all with no 15 million reads!

The only common thing would be the logical file name, which I did not
change each time I restored this same db.

It's almost like every time we restore a database, we should
immediately re-index it and update all the statistics afterwards?

I am sure now that if I restore this same db on another SQL Server
instance, I will keep getting 15 million reads until I update the
statistics of one single table. Then that SQL Server instance will
never take 15 million reads again, even if I drop the db and restore
it again. I don't know why.
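If that theory holds, a minimal post-restore step along those lines
might look like this (the database name is a placeholder):

```sql
USE MyRestoredDb    -- placeholder for the restored database's name
GO

-- Refresh statistics on every table that needs it...
EXEC sp_updatestats
GO

-- ...or hit the one suspect table with a full scan for best accuracy.
UPDATE STATISTICS dbo.TABLE1 WITH FULLSCAN
GO
```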

Mar 5 '06 #5

This thread has been closed and replies have been disabled. Please start a new discussion.
