For scalability, loading into a DataSet (or a DataTable for a single-resultset
query) is best. This releases database locks the fastest and prevents
connection leaks (forgetting to call Close on a connection).
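As a minimal sketch of the DataTable approach (the connection string and the "Orders" table/columns here are hypothetical): SqlDataAdapter.Fill opens and closes the connection itself, so the data is fully disconnected as soon as the fill completes.

```csharp
using System.Data;
using System.Data.SqlClient;

static DataTable LoadOrders(string connectionString)
{
    var table = new DataTable();
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlAdapterPlaceholderFix(connection)) { }
    return table;
}
```

```csharp
using System.Data;
using System.Data.SqlClient;

static DataTable LoadOrders(string connectionString)
{
    var table = new DataTable();
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter(
        "SELECT Id, Total FROM Orders", connection))
    {
        // Fill opens the connection if needed and closes it afterwards,
        // so locks are released and the connection returns to the pool.
        adapter.Fill(table);
    }
    return table; // fully disconnected copy of the resultset
}
```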
The DataReader is the lightest weight and can be used to load your own custom
business objects. The method that creates the reader should also close it and
never return it, as returning an open reader allows connection leaks. In C#,
also wrap DataReaders in a using statement.
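A sketch of that DataReader pattern (the Order class and query are made up for illustration): the reader is created, consumed, and disposed inside one method, so the connection can never leak to the caller.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Hypothetical business object, for illustration only.
class Order
{
    public int Id;
    public decimal Total;
}

static List<Order> LoadOrders(string connectionString)
{
    var orders = new List<Order>();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT Id, Total FROM Orders", connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                orders.Add(new Order
                {
                    Id = reader.GetInt32(0),
                    Total = reader.GetDecimal(1)
                });
            }
        } // reader disposed here, even if the mapping throws
    } // connection closed here; the reader never leaves this method
    return orders;
}
```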
-- bruce (sqlwork.com)
"Justin" <no****@nospam.com> wrote in message
news:OW**************@TK2MSFTNGP06.phx.gbl...
>I guess I am most interested in picking the method that processes the data
the fastest, but I would also like to know memory usage, CPU usage, etc.
"Justin" <no****@nospam.com> wrote in message
news:Os**************@TK2MSFTNGP06.phx.gbl...
>>I already got the data back from the database (so I am not worried about
database-related performance testing at this point), and there are a couple
of different ways to process the data; I want to see which one is the
most efficient.
What methods do you use to quantitatively measure the efficiency of code in
ASP.NET?
Thanks