Bytes IT Community

Stored Procedures on SQL 2000 compared to dotnet's Datatable.select

Hi,
Has anyone ever compared the performance of calling a DataTable's
Select method with a stored procedure doing the same thing?
My point is: which is better,
dataRows = DataTable.Select(filter), or
passing parameters to a stored procedure?
The DataTable holds about 500-700 rows at any given time.

Whichever approach I choose, the business logic will go into the
respective layer. With .NET in the picture, what would be a good approach
- keep the data in a DataTable and filter it there, or call
stored procedures, which has been the convention?
Can someone please suggest?
Jul 20 '05 #1
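For reference, the client-side option named in the question, DataTable.Select with a filter expression, works roughly like this (table, column, and sample data are made up for illustration):

```csharp
using System;
using System.Data;

class DataTableSelectDemo
{
    static void Main()
    {
        // Stand-in for the 500-700 rows the poster describes.
        DataTable widgets = new DataTable("Widgets");
        widgets.Columns.Add("Name", typeof(string));
        widgets.Columns.Add("Color", typeof(string));
        widgets.Columns.Add("Qty", typeof(int));
        widgets.Rows.Add("w1", "green", 10);
        widgets.Rows.Add("w2", "red", 3);
        widgets.Rows.Add("w3", "green", 7);

        // Select takes a filter expression (and optionally a sort order)
        // and returns the matching rows as a DataRow array.
        DataRow[] rows = widgets.Select("Color = 'green' AND Qty > 5", "Qty DESC");
        foreach (DataRow r in rows)
            Console.WriteLine("{0}: {1}", r["Name"], r["Qty"]);
    }
}
```

The filter expression is evaluated entirely in client memory, with no round trip to the server.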
3 Replies



"nandan" <ma****@hotmail.com> wrote in message
news:c7**************************@posting.google.com...
> Has anyone ever compared the performance of calling a DataTable's
> Select method with a stored procedure doing the same thing?
<snip>


In general, any sort of filtering will be faster on the server, where you
have an optimized database engine that can use indexes and caching to speed
up the operation. An exception might be where you return an initial
result set to the client, and the client then needs to filter within that
set, perhaps in several different ways. In that case it might be more
efficient to let the client filter the local DataTable instead of querying
the server for the same data every time, especially if bandwidth or
concurrency are issues. But I have to admit that I have very little
understanding of how DataTables work, so there may be other considerations
too - you might want to post this in an ADO or .NET group to get a better
answer.

Simon
Jul 20 '05 #2
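The exception described above - one result set cached on the client and then filtered several different ways - can be sketched with a DataView, whose RowFilter can be changed without touching the server (the data is made up for illustration):

```csharp
using System;
using System.Data;

class ClientRefilterDemo
{
    static void Main()
    {
        // One round trip would fill this table; after that, each new
        // user selection is served from local memory.
        DataTable widgets = new DataTable("Widgets");
        widgets.Columns.Add("Shape", typeof(string));
        widgets.Columns.Add("Color", typeof(string));
        widgets.Rows.Add("square", "green");
        widgets.Rows.Add("round", "green");
        widgets.Rows.Add("square", "red");

        DataView view = widgets.DefaultView;

        view.RowFilter = "Shape = 'square'";  // first selection
        Console.WriteLine("Square widgets: {0}", view.Count);

        view.RowFilter = "Color = 'green'";   // changed selection, no requery
        Console.WriteLine("Green widgets: {0}", view.Count);
    }
}
```

Each RowFilter change re-evaluates against the cached rows only, which is the bandwidth-saving case the reply describes.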

[posted and mailed, please reply in news]

nandan (ma****@hotmail.com) writes:
> Has anyone ever compared the performance of calling a DataTable's
> Select method with a stored procedure doing the same thing?
<snip>


I have not made any actual performance measurements, but it goes without
saying that if you get all rows from a table or a stored procedure and
only intend to display a subset, you should filter in SQL Server. This
will reduce network traffic and memory consumption on the client. It can
also make the query perform considerably faster.

However, if you are giving the user the ability to change what data
he sees on the fly, it's another matter. Say that the user first
wants to see square widgets, and then wants to see green widgets. Now
it makes more sense to get all widgets to the client, since otherwise you
would have to requery for each new selection the user makes.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #3
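The first point above - filter in SQL Server rather than shipping all rows to the client - amounts to something like the following sketch; the procedure name, schema, and connection string are hypothetical:

```csharp
using System.Data;
using System.Data.SqlClient;

class ServerSideFilterDemo
{
    /* Hypothetical procedure on the server, e.g.:
       CREATE PROCEDURE dbo.GetWidgetsByColor @Color nvarchar(20) AS
           SELECT Name, Shape, Color FROM dbo.Widgets WHERE Color = @Color
    */
    static DataTable GetWidgets(string connStr, string color)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("dbo.GetWidgetsByColor", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@Color", SqlDbType.NVarChar, 20).Value = color;

            // Only the matching subset comes back, so network traffic and
            // client memory scale with the result, not the whole table.
            DataTable result = new DataTable();
            new SqlDataAdapter(cmd).Fill(result);
            return result;
        }
    }
}
```

The parameterized call also lets SQL Server reuse the procedure's cached execution plan and any indexes on the filtered column.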

Erland Sommarskog <es****@sommarskog.se> wrote in message news:<Xn**********************@127.0.0.1>...
> I have not made any actual performance measurements, but it goes without
> saying that if you get all rows from a table or a stored procedure and
> only intend to display a subset, you should filter in SQL Server.
<snip>

Thank you very much for the information.
Jul 20 '05 #4

This discussion thread is closed.