On Mar 8, 2:17 pm, "Mark Rae" <m...@markNOSPAMrae.com> wrote:
"Aspiring .NET Programmer" <getsuma...@gmail.com> wrote in message news:11**********************@64g2000cwx.googlegroups.com...
Thank you guys!... Would the hardware configuration of the machine on
which IIS is running have an effect on the performance? Will a more
powerful machine with more memory and processing power perform
faster?
Generally speaking, yes, though bigger and faster hard disk(s) will
help too...
How do we determine what's good enough?
There's really no easy answer to this one - however, this wouldn't be a bad
place to start: http://www.microsoft.com/downloads/d...D=e2c0585a-062...
My app sometimes refers to SQL Server tables that have around 2 million
records in them. Even a simple statement like "select column1 from
table1, table2 where table1.id = table2.id and table1.id = XYZ" over two
such large tables causes the application to slow down drastically and
sometimes time out, since the result set was not fetched in time. Again,
will a more powerful machine give better performance? FYI: the same
SQL statement takes something like 2 minutes in Query Analyzer.
This raises the obvious question: are your tables properly indexed? *Storing*
2 million records will not faze SQL Server one bit. However, if you're
trying to *fetch* 2 million records from your database to your webserver,
then you really need to rethink your application...
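For what it's worth, a minimal sketch of the kind of indexing Mark is describing, using the table and column names from the query quoted above (table1, table2, id, column1); whether these indexes are needed depends on your actual schema - if id is already the primary key on each table, SQL Server will have built an index for it automatically:

```sql
-- Index the join/filter column on both sides of the join.
-- Skip these if id is already the PRIMARY KEY (that creates an index).
CREATE INDEX IX_table1_id ON table1 (id);
CREATE INDEX IX_table2_id ON table2 (id);

-- The original query, rewritten with explicit JOIN syntax and a
-- parameter; with the indexes above, a single-id lookup should be
-- an index seek rather than a scan of 2 million rows.
SELECT t1.column1
FROM table1 AS t1
INNER JOIN table2 AS t2 ON t1.id = t2.id
WHERE t1.id = @id;  -- @id: the "XYZ" value from the original query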
Thanks for your reply! I will take a look at that tool you suggested.
There are two extremes of the result set we expect from the tables:
1) Fetch just a few records out of the 2 million that satisfy a
certain criteria (where clause) after a join operation among large
tables, and display them in a DataGrid.
2) Get all the records from 2 tables with 2 million records each,
which have some relationship between them, without any where clause,
and populate the same DataGrid.
It's basically a search module where the user is given 5 parameters to
fill in their search criteria. If the user wants to just get all the
records, he doesn't provide any search criteria and clicks on Search.
This search needs to access these huge tables.
In both cases there is obviously a join operation on 2 large tables.
It takes around 2-3 minutes for such a query to run even in SQL Server
Query Analyzer or Enterprise Manager. How do we improve this in the
first place? If we just look at the first scenario for now, there isn't
a problem with loading a large amount of data into the webserver, since
we are only fetching a few records (10, for example) and only those
would be loaded into the webserver.
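For the second scenario ("get everything"), one common approach is to page the result set on the database server instead of pulling all 2 million joined rows into the webserver and the DataGrid. A rough sketch, assuming SQL Server 2005 or later (ROW_NUMBER is not available on SQL Server 2000) and the same table names as in the query above:

```sql
-- Return one page of the join (here rows 1-10) instead of all rows.
-- @PageNumber and @PageSize would come from the DataGrid's pager.
DECLARE @PageNumber INT, @PageSize INT;
SET @PageNumber = 1;
SET @PageSize = 10;

SELECT id, column1
FROM (
    SELECT t1.id, t1.column1,
           ROW_NUMBER() OVER (ORDER BY t1.id) AS RowNum
    FROM table1 AS t1
    INNER JOIN table2 AS t2 ON t1.id = t2.id
) AS numbered
WHERE RowNum BETWEEN (@PageNumber - 1) * @PageSize + 1
                 AND @PageNumber * @PageSize;
```

On SQL Server 2000 you can approximate the same thing with SELECT TOP plus a WHERE clause on the last key value of the previous page. Either way, the user still sees "all" the records, just 10 at a time, and no single request drags millions of rows across to IIS.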
Thanks for your time.
Sum.