Bytes | Software Development & Data Engineering Community

Query too slow


Excuse me in advance for my poor English.

I've got this stored procedure

****************************************************************************

declare @Azienda as varchar(3), @Utente as varchar(20),
@DataDa as datetime, @DataA as datetime,
@AreaDa as varchar(3), @AreaA as varchar(3),
@LineaDa as varchar(3), @LineaA as varchar(3),
@TipoDa as varchar(3), @TipoA as varchar(3),
@FamigliaDa as varchar(3), @FamigliaA as varchar(3),
@ProdottoDa as varchar(20), @ProdottoA as varchar(20),
@AgenteDa as varchar(4), @AgenteA as varchar(4),
@NazioneDa as varchar(50), @NazioneA as varchar(50),
@ZonaDa as Varchar(3), @ZonaA as Varchar(3),
@ProvinciaDa as varchar(2), @ProvinciaA as varchar(2),
@ClienteDa as Varchar(12), @ClienteA as Varchar(12),
@DestinDa as varchar (5), @DestinA as varchar (5),
@TipoDestinDa as varchar(1), @TipoDestinA as varchar(1),
@FlagProdNoTarget as varchar(5),
@GrAcqDa as varchar(10), @GrAcqA as varchar(10),
@TipoCliDa as varchar(3), @TipoCliA as varchar(3),
@SettMercDa as varchar(3), @SettMercA as varchar(3)
Set @Azienda = '900'
Set @Utente = 'Eugenio'
Set @DataDa = '2004-01-01'
Set @DataA = '2004-01-10'
Set @AreaDa = 'UNI'
Set @AreaA = 'UNI'
Set @LineaDa = ''
Set @LineaA = 'ZZZ'
Set @TipoDa = ''
Set @TipoA = 'ZZZ'
Set @FamigliaDa = ''
Set @FamigliaA = 'ZZZ'
Set @ProdottoDa = ''
Set @ProdottoA = 'ZZZZZZZZZZZZZZZZZZZZ'
Set @AgenteDa = ''
Set @AgenteA = 'ZZZZ'
Set @NazioneDa = ''
Set @NazioneA = 'ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ'
Set @ZonaDa = ''
Set @ZonaA = 'ZZZ'
Set @ProvinciaDa = ''
Set @ProvinciaA = 'ZZ'
Set @ClienteDa = ''
Set @ClienteA = 'ZZZZZZZZZZZZ'
Set @DestinDa = ''
Set @DestinA = 'ZZZZZ'
Set @TipoDestinDa = ''
Set @TipoDestinA = 'Z'
Set @FlagProdNoTarget = 'Vero'
Set @GrAcqDa = ''
Set @GrAcqA = 'ZZZZZZZZZZ'
Set @TipoCliDa = ''
Set @TipoCliA = 'ZZZ'
Set @SettMercDa = ''
Set @SettMercA = 'ZZZ'

Select WSDFR.AreaCommerciale,
WSDFR.Agente,
WSDFR.NazDestin,
WSDFR.ZonaDestin,
WSDFR.ProvDestin,
WSDFR.Cliente,
WSDFR.DescrCliente,
WSDFR.GruppoAcq,
WSDFR.TipoCli,
WSDFR.SettMerc,
WSDFR.CDestin,
WSDFR.DescrDestin,
WSDFR.TipoDestin,
WSDFR.EsclStatis,
WSDFR.EsclTarget,
WSDFR.ValoreNetto,
WSDFR.TpDocum,
WSDFR.VCambioITL,
WSDFR.VCambioEUR,
WSDFR.MeseFatt,
WSDFR.Posizione

From W_St_DocFatt_Righe WSDFR
inner join UniP_Prodotti UPP on
WSDFR.prodotto=UPP.Cod
Where WSDFR.Dtdocum between @DataDa and @DataA and
WSDFR.AreaCommerciale between @AreaDa and @AreaA and
WSDFR.LineaProdotto between @LineaDa and @LineaA and
WSDFR.TipoProdotto between @TipoDa and @TipoA and
WSDFR.FamigliaProdotto between @FamigliaDa and @FamigliaA and
WSDFR.Prodotto between @ProdottoDa and @ProdottoA and
WSDFR.Agente between @AgenteDa and @AgenteA
****************************************************************************

"W_St_DocFatt_Righe" is a view.

This query runs on my SQL 7 server and takes about 10 seconds.
The same query exists on another SQL 7 server and, until last week, it also
took about 10 seconds.
The configurations of both servers are the same; only the hardware is different.

Now, on the second server, this query takes about 30 minutes to extract the
same data, but nobody has changed anything.

If I execute this query without the WHERE clause, it shows me the results in 7
seconds.
The query still takes about the same time if the WHERE clause is
Where WSDFR.Dtdocum between @DataDa and @DataA and
WSDFR.AreaCommerciale between @AreaDa and @AreaA and
WSDFR.LineaProdotto between @LineaDa and @LineaA and
--WSDFR.TipoProdotto between @TipoDa and @TipoA and
--WSDFR.FamigliaProdotto between @FamigliaDa and @FamigliaA and
WSDFR.Prodotto between @ProdottoDa and @ProdottoA and
WSDFR.Agente between @AgenteDa and @AgenteA

or

Where WSDFR.Dtdocum between @DataDa and @DataA and
WSDFR.AreaCommerciale between @AreaDa and @AreaA and
--WSDFR.LineaProdotto between @LineaDa and @LineaA and
--WSDFR.TipoProdotto between @TipoDa and @TipoA and
WSDFR.FamigliaProdotto between @FamigliaDa and @FamigliaA and
WSDFR.Prodotto between @ProdottoDa and @ProdottoA and
WSDFR.Agente between @AgenteDa and @AgenteA

or

Where WSDFR.Dtdocum between @DataDa and @DataA and
WSDFR.AreaCommerciale between @AreaDa and @AreaA and
--WSDFR.LineaProdotto between @LineaDa and @LineaA and
--WSDFR.TipoProdotto between @TipoDa and @TipoA and
--WSDFR.FamigliaProdotto between @FamigliaDa and @FamigliaA and
--WSDFR.Prodotto between @ProdottoDa and @ProdottoA and
WSDFR.Agente between @AgenteDa and @AgenteA
It is a real puzzle!

What is happening?

Has anyone had a similar problem and found the right solution?

Thanks in advance.
Bye
Eugenio

Jul 20 '05 #1
Eugenio (Ci**@Eugenio.it) writes:
[quoted text snipped]


Without knowledge of the underlying tables in the view and their
indexes, I can only answer in general terms.

SQL Server uses a cost-based optimizer. This optimizer evaluates a number
of possible query plans, and estimates which plan will give the best
performance. As the basis for its decisions it uses statistics about the
tables, which describe the distribution of the data in the various columns.
By default these statistics are updated automatically, usually in
conjunction with SQL Server querying the tables.

There are plenty of possibilities for the optimizer to go wrong. For instance,
if the statistics are somewhat skewed, a small error in the first table
when computing the cost of a certain join order may become a large error
in the last, and incorrectly lead to the wrong plan. There are also
systematic errors; the optimizer does not know about correlation between
columns, for instance that OrderDate and InvoiceDate tend to follow each
other.

Because of this, it can well happen that a query that executed well
yesterday suddenly executes much slower, because you've exceeded some
threshold which causes SQL Server to pick a bad plan.

The first thing you should do is verify that the involved tables have the
same set of indexes on both servers. You could also try UPDATE STATISTICS
WITH FULLSCAN on the problematic server; this may remove inaccuracies in
the statistics. You should also use DBCC SHOWCONTIG on the tables to
observe fragmentation, and DBCC DBREINDEX to defragment the tables.
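As a sketch, those maintenance steps look like this. UniP_Prodotti is taken from your query, but the other table name is only a placeholder for whichever base tables underlie the view:

```sql
-- Rebuild statistics with a full scan instead of sampling.
-- Replace the table names with the actual base tables of the view.
UPDATE STATISTICS UniP_Prodotti WITH FULLSCAN
UPDATE STATISTICS P_FatturaT WITH FULLSCAN

-- Check fragmentation: in the output, "Scan Density" should be close
-- to 100% and "Logical Scan Fragmentation" should be low.
DBCC SHOWCONTIG ('P_FatturaT')

-- Rebuild all indexes on the table ('' = every index, 0 = keep the
-- original fill factor) to remove the fragmentation.
DBCC DBREINDEX ('P_FatturaT', '', 0)
```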

--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #2

"Erland Sommarskog" <so****@algonet.se> wrote in message
news:Xn**********************@127.0.0.1...
[quoted text snipped]

Thanks Erland
I'll try it immediately.
Bye
Eugenio
Jul 20 '05 #3
[quoted text snipped]


I did that, but it changed nothing.
Can you give me other suggestions about this?
Thanks a lot
Eugenio

Jul 20 '05 #4
Eugenio (Ci**@Eugenio.it) writes:
I did that, but it changed nothing.
Can you give me other suggestions about this?


No. Not without more information. Please post the following:

o CREATE TABLE and CREATE INDEX statements for the involved tables.
o The output when you run the procedure with SET STATISTICS PROFILE ON
  on *both* servers. Since the output of the query plan is very wide,
  please put this output in an attachment. (In a text file.)
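For reference, a minimal way to capture that output in Query Analyzer (run the same batch on both servers and save the result grid to a text file; the query here is abbreviated from the original posting):

```sql
-- Emit the actual execution plan, with per-operator row counts,
-- after the query has run.
SET STATISTICS PROFILE ON
GO

SELECT WSDFR.AreaCommerciale, WSDFR.Agente   -- etc.
FROM W_St_DocFatt_Righe WSDFR
INNER JOIN UniP_Prodotti UPP ON WSDFR.prodotto = UPP.Cod
WHERE WSDFR.Dtdocum BETWEEN '2004-01-01' AND '2004-01-10'
GO

SET STATISTICS PROFILE OFF
GO
```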
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #5
On Tue, 25 May 2004 09:01:31 +0200, Eugenio wrote:
I did that, but it changed nothing.
Can you give me other suggestions about this?


Recompile the view so that the stored plan will take into account the
updated statistics.

Use the index analysis tools in Query Analyzer (the Index Tuning Wizard)
to get suggestions about indexes to add.

Shouldn't all those Set @var... statements be combined into a single
SELECT statement?

Does it make sense to have a
WHERE ... WSDFR.AreaCommerciale between @AreaDa and @AreaA
subclause when your SET statements set both variables to the same value? Why not
WHERE ... WSDFR.AreaCommerciale = 'UNI'
?
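As a sketch of both suggestions (variable list and values abbreviated from the original script):

```sql
DECLARE @Azienda varchar(3), @Utente varchar(20),
        @DataDa datetime, @DataA datetime

-- One SELECT can assign several variables in a single statement,
-- instead of one SET per variable.
SELECT @Azienda = '900',
       @Utente  = 'Eugenio',
       @DataDa  = '2004-01-01',
       @DataA   = '2004-01-10'

-- And when both range bounds are the same constant, an equality
-- predicate is simpler and easier for the optimizer to estimate:
-- WHERE WSDFR.AreaCommerciale = 'UNI'
```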
Jul 20 '05 #6

"Ross Presser" <rp******@imtek.com> wrote in message
news:gr***************************@40tude.net...
On Tue, 25 May 2004 09:01:31 +0200, Eugenio wrote:
I did that, but it changed nothing.
Can you give me other suggestions about this?

Recompile the view so that the stored plan will take into account the
updated statistics.

Use the Query Plan Optimizer in Query Analyzer to get suggestions about
indexes to add.


I did that, but it changed nothing.

Shouldn't all those Set @var... statements be in a single SELECT statement?

Does it make sense to have a
WHERE ... WSDFR.AreaCommerciale between @AreaDa and @AreaA
subclause, when your set statement sets them to the same value? Why not
WHERE ... WSDFR.AreaCommerciale = 'UNI'
?


No, because those SET statements are part of a stored procedure; it is the
query itself that performs badly.
I extracted it to test it with Query Analyzer.
An Access program calls this SP, passing several parameters.

Thanks

Eugenio
Jul 20 '05 #7
[quoted text snipped]


Hi, Erland

I have sent the text files you asked for to your mailbox, because I can't
post a reply with an attachment to the newsgroup. I don't know why, but
it's impossible.

Thanks
Eugenio

Jul 20 '05 #8
Eugenio (Ci**@Eugenio.it) writes:
I have sent the text files you asked for to your mailbox, because I can't
post a reply with an attachment to the newsgroup. I don't know why, but
it's impossible.


Got the mail. Since the tables and the plans are complex, it may take a
few days before I get the occasion to look at it. In the meantime,
if you cannot post it as an attachment, you could put it on a web site
and post a URL, in case someone else is interested in taking a stab at it.
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #9

[quoted text snipped]


These are the URLs where you can find my text files.

Tables and views structure: http://www.unicars.it/eugenio/Tables&Views.sql

Statistics profile of the server where the query works:
http://www.unicars.it/eugenio/SpeedyStatProf

Statistics profile of the server where the query doesn't work:
http://www.unicars.it/eugenio/SpeedyStatProf
Thanks
Eugenio
Jul 20 '05 #10
Eugenio (Ci**@Eugenio.it) writes:
I have sent the text files you asked for to your mailbox, because I can't
post a reply with an attachment to the newsgroup. I don't know why, but
it's impossible.


I had a look at your tables, and I am afraid that your set of tables and
views is far too huge for me to dig into it all. But I can share
some observations. Before I go on, I would like to stress that there is a
limit to how much help you can get in the newsgroups. Also, help here is
based on the idea that you do some of the work yourself. You don't always
get the answer - sometimes you only get the information so that you can
find out on your own. If you are not prepared to investigate on your own,
you are not likely to get your issue sorted out. This applies not least in
a complex case like this one.

1) The query you posted was this one (abbreviated)

declare @Azienda as varchar(3), @Utente as varchar(20),
@DataDa as datetime, @DataA as datetime,
...

Set @Azienda = '900'
Set @Utente = 'Eugenio'
Set @DataDa = '2004-01-01'
Set @DataA = '2004-01-10'
Set @AreaDa = 'UNI'
Set @AreaA = 'UNI'
Set @LineaDa = ''
Set @LineaA = 'ZZZ'
Set @TipoDa = ''
...

Select WSDFR.AreaCommerciale,
WSDFR.Agente,
WSDFR.NazDestin,
...
From W_St_DocFatt_Righe WSDFR
inner join UniP_Prodotti UPP on
WSDFR.prodotto=UPP.Cod
Where WSDFR.Dtdocum between @DataDa and @DataA and
WSDFR.AreaCommerciale between @AreaDa and @AreaA and
WSDFR.LineaProdotto between @LineaDa and @LineaA and
WSDFR.TipoProdotto between @TipoDa and @TipoA and
WSDFR.FamigliaProdotto between @FamigliaDa and @FamigliaA and
WSDFR.Prodotto between @ProdottoDa and @ProdottoA and
WSDFR.Agente between @AgenteDa and @AgenteA

It was not clear to me whether this is the actual query you run in
production, or a simplification of the stored procedure where the
variables are parameters in real life. This is very important,
because when SQL Server builds a query plan it uses the values of
the input parameters as a hint, but it is completely blind to the values
of local variables and assumes default values. This means that with
variables, you will get the same plan no matter what you put into them.
If you are using parameters, there is a chance that adding WITH RECOMPILE
to the procedure definition will get you the best plan each time.
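A sketch of what that could look like; the procedure name is hypothetical and the parameter list is heavily abbreviated from the real one:

```sql
-- WITH RECOMPILE forces a fresh plan on every execution, so the
-- optimizer can use the actual parameter values each time instead
-- of reusing a plan built for very different values.
CREATE PROCEDURE Statistiche_Fatturato   -- hypothetical name
    @DataDa datetime,
    @DataA  datetime,
    @AreaDa varchar(3),
    @AreaA  varchar(3)
WITH RECOMPILE
AS
SELECT WSDFR.AreaCommerciale, WSDFR.Agente   -- etc.
FROM W_St_DocFatt_Righe WSDFR
INNER JOIN UniP_Prodotti UPP ON WSDFR.prodotto = UPP.Cod
WHERE WSDFR.Dtdocum BETWEEN @DataDa AND @DataA
  AND WSDFR.AreaCommerciale BETWEEN @AreaDa AND @AreaA
```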

2) It seems that you are doing some kind of dynamic search where the
   user can specify many different conditions. This is a tricky
   case to handle, and often you get the best performance with dynamic SQL.
   However, dynamic SQL requires the users to have direct permissions
   on the involved views and tables, so it may not be an acceptable
   solution. In any case, at http://www.sommarskog.se/dyn-search.html
   I have an article which discusses this topic in detail.
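The idea, very much abbreviated (the article covers it properly), is to build the WHERE clause only from the conditions the user actually supplied and run it with sp_executesql:

```sql
-- Sample values standing in for the procedure's parameters.
DECLARE @DataDa datetime, @DataA datetime, @AreaDa varchar(3)
SELECT @DataDa = '2004-01-01', @DataA = '2004-01-10', @AreaDa = 'UNI'

DECLARE @sql nvarchar(2000)
SELECT @sql = N'SELECT WSDFR.AreaCommerciale, WSDFR.Agente
  FROM W_St_DocFatt_Righe WSDFR
  INNER JOIN UniP_Prodotti UPP ON WSDFR.prodotto = UPP.Cod
  WHERE WSDFR.Dtdocum BETWEEN @DataDa AND @DataA'

-- Append a condition only when the user actually restricted on it,
-- so the optimizer sees a simpler query with fewer predicates.
IF @AreaDa IS NOT NULL
   SELECT @sql = @sql + N' AND WSDFR.AreaCommerciale = @AreaDa'

EXEC sp_executesql @sql,
     N'@DataDa datetime, @DataA datetime, @AreaDa varchar(3)',
     @DataDa, @DataA, @AreaDa
```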

3) When looking at your data model, it appears unfinished. There are
   some foreign keys defined, but judging from the queries there should,
   for instance, be a foreign key from P_FatturaT.Tipodoc to
   TB_TipologiaDoc.Tipo. Adding foreign keys may not improve performance
   by itself, but a thorough study of the data model may lead to a more
   precise and effective design.

4) The inner view, W_St_DocFatt_TestDoc, has a GROUP BY which only serves
   as a DISTINCT. Maybe it is needed, maybe it is not; since the key
   information was incomplete, I could not deduce that. But there certainly
   is a cost for getting the distinct values, so if you can get that one
   out, you may win a lot. Also, because of the GROUP BY, you cannot make
   this view an indexed view, which could increase the speed of the
   query radically.

5) None of the tables has a clustered index. Is this really a conscious
   decision? If so, I think you should review it. If you changed the
   indexes to cluster on the most common search conditions, that could
   do wonders for performance, not least with dynamic SQL.
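For instance, if most searches restrict on the document date, clustering the underlying fact table on that column could turn the date range in the WHERE clause into a range scan over adjacent pages. Table and index names here are only illustrative; I don't know your actual base table:

```sql
-- Store the rows physically ordered by document date, so that
-- "Dtdocum BETWEEN @DataDa AND @DataA" reads one contiguous range
-- of pages instead of scanning the whole table.
CREATE CLUSTERED INDEX IX_Dtdocum ON P_FatturaT (Dtdocum)
```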

6) There is a lot of NOLOCK. This may be good for performance, but it
is not good for correctness. If there is indeed traffic going on
in the database when you run your query, you can get incorrect or
inconsistent results back. If you make your view indexed, you should
remove the locking hints, because it's not clear what they would mean.

I would also like to repeat what I said in my original posting:

SQL Server uses a cost-based optimizer. This optimizer evaluates a number
of possible query plans, and estimates which plan will give the best
performance. As the basis for its decisions it uses statistics about the
tables, which describe the distribution of the data in the various columns.
By default these statistics are updated automatically, usually in
conjunction with SQL Server querying the tables.

There are plenty of possibilities for the optimizer to go wrong. For instance,
if the statistics are somewhat skewed, a small error in the first table
when computing the cost of a certain join order may become a large error
in the last, and incorrectly lead to the wrong plan. There are also
systematic errors; the optimizer does not know about correlation between
columns, for instance that OrderDate and InvoiceDate tend to follow each
other.

Because of this, it can well happen that a query that executed well
yesterday suddenly executes much slower, because you've exceeded some
threshold which causes SQL Server to pick a bad plan.

I repeat this because, given the very complex nature of your query,
there are very many ways for the optimizer to go astray.

I hope these observations give you some input for your
further investigations.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #11
Many thanks for such a detailed answer.

Regards
Eugenio
Jul 20 '05 #12


