"Bihn" wrote:
I was reading about the DataReader, which is said to be slimmer and faster than
the DataSet. Since the DataReader has to fetch the data from the database
every time it needs it, the data it gets should always be up to date. However,
both the IBuySpy and Duwamish samples and most, if not all, of the shopping
cart sample code I've seen use a DataSet to implement the operations for
ecommerce sites. So is the trip the DataReader has to make to the database too
much of a performance drag, or is the DataSet "the way" to implement an
ecommerce site? Will keeping the information up to date be a problem because
the DataSet is a disconnected data access object? If you have implemented an
ecommerce site, please give me your opinion as to which data access object is
more suitable for implementing an ecommerce site.
Thanks.
Hi, I've led development of a large-scale ecommerce site (millions of hits/day)
and think that DataSets are the way to go, for the following reasons (all IMO,
of course):
1. The raw performance superiority of the DataReader is unimportant in the
context of ecommerce internet apps, where many other factors determine
user-perceived responsiveness.
2. Typical ecommerce pages do not use thousands or even hundreds of rows of
data at a time (if designed correctly, i.e., paging done within stored
procedures or the like). So DataSets in the ecommerce world are usually small
enough not to cause alarm in terms of memory.
3. You (and all the developers who build code for your server) need to be
very careful about correctly handling DataReaders in order to avoid memory
leaks. One sloppy dude can ruin everyone's day on that server, and this kind
of error is a nightmare to nail down.
4. DataSets are incredibly easy to work with, especially with XML and/or XSLT.
And it's very easy to look at the contents of a DataSet during debugging.
5. "disconnectedness" isnt a problem. Just create the DataSet when you are
about to use it. But if you want to use it disconnected, thats cool too,
just cache it.
6. During development you can easily create "fake" DataSets from a data
access layer, without waiting for the DB schema to be finalized or the SPs to
be coded. This enables user-interface coding to proceed without waiting for
the "data team" to get their stuff together.
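On point 3, the "careful handling" mostly comes down to making sure the reader and its connection get closed on every code path, exceptions included. A minimal C# sketch of the pattern (the table, column, and connection string are invented for illustration):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public class ProductCatalog
{
    public static string GetProductName(string connectionString, int productId)
    {
        // "using" guarantees Dispose() runs even if an exception is thrown,
        // so the connection always goes back to the pool.
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Name FROM Products WHERE ProductId = @id", conn))
        {
            cmd.Parameters.Add("@id", SqlDbType.Int).Value = productId;
            conn.Open();
            // CloseConnection ties the connection's lifetime to the reader,
            // so closing the reader closes the connection too.
            using (SqlDataReader reader =
                cmd.ExecuteReader(CommandBehavior.CloseConnection))
            {
                return reader.Read() ? (string)reader["Name"] : null;
            }
        }
    }
}
```

One sloppy caller that skips this pattern and takes an exception mid-read will leak a pooled connection, which is exactly the class of bug described above.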
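On point 5, "just cache it" in ASP.NET typically means dropping the DataSet into the application cache with an expiration. A sketch, assuming a hypothetical LoadCatalogFromDb loader and a 5-minute staleness window:

```csharp
using System;
using System.Data;
using System.Web.Caching;

public class CatalogCache
{
    // Hypothetical loader; a real app would fill the DataSet from the DB.
    private static DataSet LoadCatalogFromDb()
    {
        return new DataSet("Catalog");
    }

    public static DataSet GetCatalog(Cache cache)
    {
        DataSet ds = (DataSet)cache["Catalog"];
        if (ds == null)
        {
            ds = LoadCatalogFromDb();
            // Absolute 5-minute expiration: catalog-style data that can be
            // slightly stale doesn't need a database hit on every request.
            cache.Insert("Catalog", ds, null,
                DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration);
        }
        return ds;
    }
}
```

This only works because the DataSet is disconnected; you couldn't cache a DataReader, since it holds an open connection.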
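On point 6, a "fake" DataSet is just one built by hand in memory with the agreed-upon shape, so the UI team can code against it before the real queries exist. A sketch (the schema here is invented for illustration):

```csharp
using System.Data;

public class FakeOrderData
{
    // Returns a hand-built DataSet shaped like the one the real data
    // access layer will eventually return from the database.
    public static DataSet GetOrders()
    {
        DataTable orders = new DataTable("Orders");
        orders.Columns.Add("OrderId", typeof(int));
        orders.Columns.Add("Customer", typeof(string));
        orders.Columns.Add("Total", typeof(decimal));

        orders.Rows.Add(new object[] { 1, "Alice", 19.95m });
        orders.Rows.Add(new object[] { 2, "Bob", 42.00m });

        DataSet ds = new DataSet("OrderData");
        ds.Tables.Add(orders);
        return ds;
    }
}
```

When the stored procedures are ready, the data access layer swaps in the real fill logic and the pages don't change.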
I can tell you that even with very, uh, "unoptimized" coding, we never had a
problem with performance (avg. 1 sec response time). In fact, some of the main
issues that caused outages were memory leaks from improperly handled
DataReaders. I think focusing on raw speed is wrong. Think of it this way:
1. Would you rather tell a client that your system has a lot of features
because it's easy/cheap to maintain and enhance,
2. or would you rather explain to the client that, yes, your ecommerce site
was down for the day (losing $$$), and yes, it's taking a long time for the
programmers to add the new features you want (spending $$$)... but hey, did
you notice that every page request is 0.1 seconds quicker because we're using
DataReaders?
HTH