
Optimal configuration for report generator

I am working with a report generator that is based on SQL Server 2000 and
uses ASP as the UI. Basically we have a set of reports that end users can
execute through a web browser. In general the model works fine, but we are
running into some scaling issues.

What I'm trying to determine is the optimal configuration for this
system. It is currently a 2.4 GHz Pentium with a large RAID array and
1 GB of RAM. We have been using the "fixed" memory configuration,
allocating 864 MB to SQL Server. This is on a Windows Server 2003 box.
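
For reference, the fixed allocation was set with something like the
following; this is a sketch that assumes the settings were applied
through sp_configure rather than Enterprise Manager (values are in MB):

-- Expose the advanced options, which include the memory settings
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
-- "Fixed" memory: pin min and max server memory to the same value
EXEC sp_configure 'min server memory', 864
EXEC sp_configure 'max server memory', 864
RECONFIGURE
GO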

This works fine when a "small" query or two is executed, but performance
suffers terribly when several users try to run reports in parallel. A single
query might take 10 minutes to run if nothing else is happening on the box,
but if additional users log on and run reports, it's almost impossible to
predict when the queries will finish.

I am also looking at the effect of database size on performance, testing
against databases with 1 month, 3 months, and 12 months of data, and
running the same query against two databases in parallel. With the original
configuration, the results were all over the place: sometimes the 12-month
database outperformed the smaller databases, while other times there was
little difference. It seems that once the system starts paging, and paging
heavily, it's over; the system never "recovers", and queries that
previously ran in a few minutes now take hours.

I added 3 GB more memory to the system and modified boot.ini to include the
/3GB switch. Now when I run the same tests, the results are much more
consistent, as the system rarely has to swap. Then again, I've never seen
SQL Server go past 1.7 GB in Task Manager, which makes me think anything
beyond, say, 2.5 GB of memory is a waste?
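
For reference, the relevant boot.ini entry ends up looking something
like this (the ARC path is machine-specific and shown only for
illustration):

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB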

Things we are trying to determine are:

- in the SQL Server memory configuration, is Fixed better than Dynamic? We
have read that Dynamic is not good at returning memory to the OS once it
has been allocated.

- What else can we do to optimize performance for this application? It
seems to me that if the indexes are properly designed, database size
shouldn't have much impact on performance, but this appears to be true
only up to a point. Comparing the execution plans between, say, a 12-month
and a 3-month database, the plans are sometimes dramatically different. I
assume this is the optimizer deciding that going directly to the base
tables, rather than using an index, will result in better performance,
when in reality this doesn't always appear to be true.

- Are there other SQL Server switches I should be tweaking? Is there some
number of simultaneous queries that this configuration should be limited to?

- What about other editions of SQL Server (e.g. Enterprise, Data Center,
etc.): would these buy us anything?

Thanks for any advice,

-Gary

Jul 23 '05 #1
2 Replies


Gary,
I suppose I would have to have more information. Performance and tuning
is a step-by-step process (one is always robbing Peter to pay Paul, so
to speak).
How big is the database in question?
What is the RAID configuration (0, 1, 5, 10)?
What do the queries that are running look like? Have you looked at the
execution plan to make sure the indexes are being used? (A nested
iteration join with a table scan can cause massive slowdowns.)
How big are the tables?

It might be the case that you are suffering from spinlock contention
(context switching on the processor), in which case you might need
more processors; you can run a profile on processor performance to get
the proper information. However, I would strongly suggest looking at
the logical design of the database, the query itself, the statistics
on the objects involved, and the indexes. The optimizer could be
ignoring your indexes because the statistics are out of whack or the
weight of the search is too high.
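
A minimal sketch of checking and refreshing those statistics follows;
the table dbo.ReportFacts, the index IX_ReportFacts_Date, and the date
column are hypothetical stand-ins for your own schema:

-- Refresh statistics on one table with a full scan of the data
UPDATE STATISTICS dbo.ReportFacts WITH FULLSCAN
GO
-- Or refresh statistics for every table in the current database
EXEC sp_updatestats
GO
-- Inspect the histogram behind one index's statistics
DBCC SHOW_STATISTICS ('dbo.ReportFacts', 'IX_ReportFacts_Date')
GO
-- Show the plan for a report query without executing it
SET SHOWPLAN_TEXT ON
GO
SELECT COUNT(*) FROM dbo.ReportFacts WHERE ReportDate >= '20050101'
GO
SET SHOWPLAN_TEXT OFF
GO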

There are just too many factors that could be responsible for the
slowness for me to answer without knowing them.
There is a decent performance and tuning section in the manual, if you
want to give it a look.

hth.

hans nelsen

Jul 23 '05 #2

Here are my two cents. Mileage may vary, past results do not guarantee
future performance, etc.

---- in the SQL Server memory configuration, is Fixed better than
Dynamic? We have read that Dynamic is not good at returning memory to
the OS once it's been allocated

CHOOSE DYNAMIC. MSSQL is meant to function on its own box; it will
take over all the resources. If you're running MSFT Small Business
Server, then you might want fixed.
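
A sketch of switching back to dynamic management from T-SQL; the
ceiling below is simply the SQL Server 2000 default rather than a
tuned value:

-- Expose the advanced options, which include the memory settings
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
-- Dynamic memory: floor at 0 MB, ceiling back at the default
EXEC sp_configure 'min server memory', 0
EXEC sp_configure 'max server memory', 2147483647
RECONFIGURE
GO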

---- What else can we do to optimize the performance for this
application? It seems to me if the indexes are properly designed, the
database size shouldn't have that much impact on performance, but this
appears to be true only to a point. In comparing the execution plans
between say a 12 month and a 3 month database, the plans are sometimes
dramatically different. I assume this is due to the optimizer deciding
that going directly to the base tables
and not using an index will result in better performance, when in
reality, this doesn't always appear to be true.

COMMON PROBLEMS. The most common problems are user locks. Unlike
Oracle, MSSQL is lock crazy; it places locks on everything. Ensure
that read locks are not competing against write locks. Also ensure
that your indexes are being regularly defragmented, with DBCC
DBREINDEX or DBCC INDEXDEFRAG. Looking at the execution plan in Query
Analyzer should tell you whether the indexes are actually being used.
Auto create statistics and auto update statistics should also be
turned on.
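
A sketch of those maintenance steps, again using hypothetical names
(table dbo.ReportFacts, index IX_ReportFacts_Date, database Reports):

-- Check fragmentation first; scan density near 100% is healthy
DBCC SHOWCONTIG ('dbo.ReportFacts') WITH FAST
GO
-- Offline rebuild of all indexes on the table, with a 90% fill factor
DBCC DBREINDEX ('dbo.ReportFacts', '', 90)
GO
-- Or the gentler online alternative, one index at a time (0 = current database)
DBCC INDEXDEFRAG (0, 'dbo.ReportFacts', 'IX_ReportFacts_Date')
GO
-- Turn on the statistics options for the reporting database
EXEC sp_dboption 'Reports', 'auto create statistics', 'true'
EXEC sp_dboption 'Reports', 'auto update statistics', 'true'
GO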

---- Are there other SQL Server switches I should be tweaking? Is there
some number of simultaneous queries that this configuration should be
limited to?

Look at this free article from Microsoft TechNet.
http://www.microsoft.com/technet/pro.../c02ppcsq.mspx
Here is a quote from it:
Caution: Setting fixed memory incorrectly can cause serious performance
problems on SQL Server. Use fixed memory only in circumstances when you
need to ensure that an exact amount of memory is available for SQL
Server.
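
As an illustration of the kind of switch the article discusses, here
is a sketch of two server-wide options that can bound runaway report
queries; the values are only examples, not recommendations:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
-- Refuse queries whose estimated cost exceeds roughly 300 seconds
EXEC sp_configure 'query governor cost limit', 300
RECONFIGURE
GO
-- Keep a single large report from monopolizing every processor
EXEC sp_configure 'max degree of parallelism', 2
RECONFIGURE
GO
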
---- What about other versions of SQL Server (e.g. Enterprise, Data
Center, etc) would these buy us anything?
No. Standard works just fine for 99% of users. Enterprise does get
you Reporting Services, indexed views, and log shipping.

Jul 23 '05 #3
