SQL Server 2000 Hardware Recommendations

I'm on the IS team of a medium-sized non-profit with international
reach. We're trying to make some decisions regarding our Web server
and database server as we expand our web site to have more dynamic
content. Currently the database server houses all data pertinent to
the organization (membership data, events, products, etc) in one
database (~2.2 GB) as well as the web site content in a separate
database (~40 MB). The web site pulls from both databases but hits the
content database more often.

In a nutshell, our database server appears to be struggling during
performance testing of the new Web site. We are trying to determine
whether we simply need new hardware, or if there are things we can do
to help MS SQL make better use of the resources we have. The hardware
is a COMPAQ ML370, 1266mhz Pentium III, 1gb RAM, RAID 5 with 3 HD
(10,000rpm) and a COMPAQ Smart Array 5i SCSI controller. The OS is
Windows 2000 (standard) running Microsoft SQL 2000, SP 3a. The Web
server is a 2.8ghz Pentium IV with 2.5gb RAM, RAID 5 with 3 HD (15,000
rpm) running Windows 2000 standard and IIS 5.0. While stress testing
our web site under a moderate load (simulating approximately 20
simultaneous users), the database server processor tends to max out
and stay that way for the duration of the test. Memory and disk access
appear to remain fairly stable -- there isn't a lot of paging going
on, and the disk queue doesn't escalate much if any. The Web server
shows spikes in processor use, but appears to be coping well. However,
under a heavy load, a sql-heavy page can take as long as 90 seconds to
load! We've been assuming that the network is not the issue, since the
servers communicate over a gigabit backbone. While we've identified
aspects of the ASP code that we can optimize, the database server
seems to be a large part of the problem.

We've reviewed our SQL configuration settings, and they appear to
align with the best practices, which in our case are the default
settings for SQL 2000. We have rebuilt our indexes, and have
defragmented the hard disks on both the database and Web servers.
This, along with changes to the structure of the Web pages themselves,
has led to improvements, but the processor on the database server
seems to be groaning under the strain, and pages are still taking an
unacceptable amount of time to load.

What else should we be looking at? Are there steps we could take to
minimize the load generated by client/server and Web-related traffic,
or specific performance counters that would help us to identify the
problem? Do we just need to look at getting some new hardware? If new
hardware is unavoidable, is there anyone running a similar environment
who could suggest what minimum requirements we should be looking for?
Any suggestions would be much appreciated!
Jul 20 '05 #1
In article <89**************************@posting.google.com>,
fo******@comcast.net says...
[snip]

Answered in the IIS group - you should really learn how to post to
multiple groups; you'll get better feedback.

--
sp*********@rrohio.com
(Remove 999 to reply to me)
Jul 20 '05 #2
[posted and mailed, please reply in news]

Lorax (fo******@comcast.net) writes:
I'm on the IS team of a medium-sized non-profit with international
reach. We're trying to make some decisions regarding our Web server
and database server as we expand our web site to have more dynamic
content. Currently the database server houses all data pertinent to
the organization (membership data, events, products, etc) in one
database (~2.2 GB) as well as the web site content in a separate
database (~40 MB). The web site pulls from both databases but hits the
content database more often.

In a nutshell, our database server appears to be struggling during
performance testing of the new Web site. We are trying to determine
whether we simply need new hardware, or if there are things we can do
to help MS SQL make better use of the resources we have. The hardware
is a COMPAQ ML370, 1266mhz Pentium III, 1gb RAM, RAID 5 with 3 HD
(10,000rpm) and a COMPAQ Smart Array 5i SCSI controller.


While it is not exactly a terrifying server, my gut feeling is that it
should be enough for the moderate load you are mentioning.

You say that you have rebuilt indexes etc, but how much have you looked
at the code itself? It is not unlikely that performance could be
improved by tuning a couple of common but inefficient queries. The best
way to find these is to run with Profiler and set up a filter that
catches long durations.
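
For example, if you save such a trace to a file, something along these
lines will list the worst offenders afterwards (the file path is just a
placeholder; in SQL Server 2000 the Duration column is in milliseconds):

-- Load a saved Profiler trace file and list the slowest statements
SELECT TextData, Duration, CPU, Reads, Writes
FROM ::fn_trace_gettable('C:\traces\webload.trc', default)
WHERE Duration > 1000
ORDER BY Duration DESC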

Of course, in the end, throwing hardware at the problem may still be a
cheaper way to go, depending on the findings. It could be a case of
adding a single index et voilà! Or it could be that you need to make
a major architectural change in the application.

But you should certainly start with analysing where the bottlenecks are.
You could save some bucks that way.
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #3

"Lorax" <fo******@comca st.net> wrote in message
news:89******** *************** ***@posting.goo gle.com...
I'm on the IS team of a medium-sized non-profit with international
reach. We're trying to make some decisions regarding our Web server
and database server as we expand our web site to have more dynamic
content. Currently the database server houses all data pertinent to
the organization (membership data, events, products, etc) in one
database (~2.2 GB) as well as the web site content in a separate
database (~40 MB). The web site pulls from both databases but hits the
content database more often.

In a nutshell, our database server appears to be struggling during
performance testing of the new Web site. We are trying to determine
whether we simply need new hardware, or if there are things we can do
to help MS SQL make better use of the resources we have. The hardware
is a COMPAQ ML370, 1266mhz Pentium III, 1gb RAM, RAID 5 with 3 HD
(10,000rpm) and a COMPAQ Smart Array 5i SCSI controller. The OS is
Windows 2000 (standard) running Microsoft SQL 2000, SP 3a. The Web
server is a 2.8ghz Pentium IV with 2.5gb RAM, RAID 5 with 3 HD (15,000
rpm) running Windows 2000 standard and IIS 5.0. While stress testing
our web site under a moderate load (simulating approximately 20
simultaneous users), the database server processor tends to max out
and stay that way for the duration of the test. Memory and disk access
appear to remain fairly stable -- there isn't a lot of paging going
on, and the disk queue doesn't escalate much if any. The Web server
shows spikes in processor use, but appears to be coping well. However,
under a heavy load, a sql-heavy page can take as long as 90 seconds to
load! We've been assuming that the network is not the issue, since the
servers communicate over a gigabit backbone. While we've identified
aspects of the ASP code that we can optimize, the database server
seems to be a large part of the problem.
OK, you most likely have a code problem. Even though you think you've
already found it, I'd keep looking at the code.

(Though given the specs, I'd probably swap the two: run SQL Server on
the faster box and the web server on the slower one.)

However, the SQL box should still be fast enough. With 1 GB of RAM,
most of the data will probably be in cache anyway. And the fact that
your disk queue is low is more evidence of that.
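
If you want to double-check that, the same counters Perfmon shows are
also exposed in master.dbo.sysperfinfo; a rough sketch (divide the hit
ratio counter by its base counter to get the actual percentage):

-- Buffer cache counters in SQL Server 2000
SELECT counter_name, cntr_value
FROM master.dbo.sysperfinfo
WHERE object_name LIKE '%Buffer Manager%'
  AND counter_name LIKE 'Buffer cache hit ratio%'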

Couple of things: What version of SQL Server are you running?

What's "SQL Heavy" mean? Lots of queries? Lots of inserts?

What else should we be looking at? Are there steps we could take to
minimize the load generated by client/server and Web-related traffic,
or specific performance counters that would help us to identify the
problem? Do we just need to look at getting some new hardware? If new
hardware is unavoidable, is there anyone running a similar environment
who could suggest what minimum requirements we should be looking for?
Any suggestions would be much appreciated!


Well, don't know enough to say if our environment is similar (I'd guess not)
but we run LOTS of queries through our servers (one handles millions a day)
on SQL Servers 4 years old.

Jul 20 '05 #4
Thank you both very much for your responses.

Greg, to answer your questions: our SQL version is Microsoft SQL Server
2000, Standard Edition, SP3a. Our site is SQL-heavy in the sense that
it pulls a lot of information from the database -- meeting information,
member information, library resources, etc. are all pulled out of the
database. It's fairly infrequent that we do inserts and updates via the
web; it's mostly just pulling a lot of information down using queries
embedded in ASP pages and displaying that information for the user. Our
new web site adds the element of actually having web page code stored
in the database.

Both you and the previous poster (Erland) seem to agree that queries
are the place to look for optimization. I think our history has
typically involved upgrading hardware when performance becomes an
issue. Do you happen to know of any good resources on optimizing
SQL queries?

I greatly appreciate your input. Thanks.

Gary

"Greg D. Moore \(Strider\)" <mo************ ****@greenms.co m> wrote in message news:<XN******* ************@tw ister.nyroc.rr. com>...
"Lorax" <fo******@comca st.net> wrote in message
news:89******** *************** ***@posting.goo gle.com...
I'm on the IS team of a medium-sized non-profit with international
reach. We're trying to make some decisions regarding our Web server
and database server as we expand our web site to have more dynamic
content.
<<edited for brevity>>
Memory and disk access
appear to remain fairly stable -- there isn't a lot of paging going
on, and the disk queue doesn't escalate much if any. The Web server
shows spikes in processor use, but appears to be coping well. However,
under a heavy load, a sql-heavy page can take as long as 90 seconds to
load! We've been assuming that the network is not the issue, since the
servers communicate over a gigabit backbone. While we've identified
aspects of the ASP code that we can optimize, the database server
seems to be a large part of the problem.


OK, you most likely have a code problem. Even though you think you've
already found it, I'd keep looking at the code.

(Though given the specs, I'd probably swap the two: run SQL Server on
the faster box and the web server on the slower one.)

However, the SQL box should still be fast enough. With 1 GB of RAM,
most of the data will probably be in cache anyway. And the fact that
your disk queue is low is more evidence of that.

Couple of things: What version of SQL Server are you running?

What's "SQL Heavy" mean? Lots of queries? Lots of inserts?

What else should we be looking at? Are there steps we could take to
minimize the load generated by client/server and Web-related traffic,
or specific performance counters that would help us to identify the
problem? Do we just need to look at getting some new hardware? If new
hardware is unavoidable, is there anyone running a similar environment
who could suggest what minimum requirements we should be looking for?
Any suggestions would be much appreciated!


Well, don't know enough to say if our environment is similar (I'd guess not)
but we run LOTS of queries through our servers (one handles millions a day)
on SQL Servers 4 years old.

Jul 20 '05 #5
Lorax (fo******@comcast.net) writes:
Both you and the previous poster (Erland) seem to agree that queries
are the place to look for optimization. I think our history has
typically involved upgrading hardware when performance becomes an
issue. Do you happen to know of any good resources on optimizing
SQL queries?


Kalen Delaney's "Inside SQL Server 2000" is certainly a good book.

http://www.sql-server-performance.com/ is a site that I have not explored
much myself, but the name sounds promising.

But the best way to learn to tune queries is to start working. What I
typically do when I hear a customer complain that "the database is
slow" is to start Profiler and set up a trace with a filter on the
Duration column to find long-running queries. I also include Recompile
events, which unfortunately do not have a duration in Profiler, but can
be a serious performance killer. (They can boost performance too!)

I might also use my own aba_lockinfo, which gives me locking and
blocking information, but also the current statement. If the same
statement reappears in several executions, I take that as an indication
that it is something to look at more closely. You can get aba_lockinfo
at http://www.sommarskog.se/sqlutil/aba_lockinfo.html.

Once you have found the slow queries, you need to find out how to make
them faster. This can be done by adding/dropping/modifying indexes,
adding supplementary information to help the optimizer, or even adding
query hints. If recompiles are the problem, replacing a temp table with
a table variable is worth a try. Admittedly, at this stage it does help
if you have some basic understanding of how indexes work and of the
difference between clustered and non-clustered indexes. Understanding
the different join methods that SQL Server can use also helps. This is
material that is covered by Kalen's book.
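
As a small, made-up illustration of those last two ideas (the table and
column names here are invented):

-- A nonclustered index to support a frequent lookup
CREATE NONCLUSTERED INDEX ix_events_startdate
    ON dbo.Events (StartDate, EventType)

-- A table variable instead of a temp table, which avoids the
-- recompiles that #temp tables can trigger in stored procedures
DECLARE @recent TABLE (EventID int PRIMARY KEY, StartDate datetime)

INSERT INTO @recent (EventID, StartDate)
SELECT EventID, StartDate
FROM dbo.Events
WHERE StartDate >= DATEADD(month, -1, GETDATE())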

Before you feel too overwhelmed by this and start looking for
consultants in the Yellow Pages, SQL Server does offer the Index Tuning
Wizard, which helps you find suitable indexes. For ITW to be useful,
you need a good workload sample for it to analyse. I have never used
ITW myself, mainly because the performance problems at our customer
sites are usually related to a certain function. (We're an ISV, so it's
our own system that I analyse.)

Finally, I should say that there is more than one way to overload a
SQL Server. Cursors and other iterative processing may not show up in
Profiler, because each individual statement takes hardly any time at
all, but together they add up to a lot. The same goes for inefficient
communication from the upper layers, where data is retrieved in small
pieces to drive business logic that would be better implemented in SQL
Server itself.
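
To make the cursor point concrete with an invented example: the
row-by-row version below issues one UPDATE per member, while the
set-based version does the same work in a single statement.

-- Iterative version: cheap per statement, expensive in total
DECLARE @id int
DECLARE lapsed_cur CURSOR FOR
    SELECT MemberID FROM dbo.Members WHERE ExpiryDate < GETDATE()
OPEN lapsed_cur
FETCH NEXT FROM lapsed_cur INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE dbo.Members SET Status = 'Lapsed' WHERE MemberID = @id
    FETCH NEXT FROM lapsed_cur INTO @id
END
CLOSE lapsed_cur
DEALLOCATE lapsed_cur

-- Set-based version: one statement, one pass
UPDATE dbo.Members
SET Status = 'Lapsed'
WHERE ExpiryDate < GETDATE()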

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #6
