Amadelle,
There's no single answer to that; it's too generic a question. The practical limit depends on the machine: a system with a small amount of RAM will run into trouble far sooner than one with gigabytes of it, all other things being equal. And whatever else is running at the same time and eating up RAM factors into it as well. Also keep in mind that row count alone isn't the measure — a row with many large columns costs far more memory than a narrow one.
    Generally speaking, though, I wouldn't get into the habit of passing
around recordsets that are too unwieldy (personally, I'd put that in the
tens of thousands of rows and above, but it depends on what I'm doing with
the data).
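One other point worth noting: a DataSet is a reference type, so passing it between methods only copies a reference, not the rows themselves. The memory cost is paid when the rows are loaded, not when the DataSet is handed around. A minimal sketch (the "Orders" table and its column are made up for illustration):

```csharp
using System;
using System.Data;

class Demo
{
    // Passing a DataSet only copies a reference (a few bytes),
    // no matter how many rows it holds.
    static int CountRows(DataSet ds)
    {
        return ds.Tables["Orders"].Rows.Count;
    }

    static void Main()
    {
        DataSet ds = new DataSet();
        DataTable orders = ds.Tables.Add("Orders");
        orders.Columns.Add("Id", typeof(int));

        // The memory cost is incurred here, when rows are created --
        // not later, when the DataSet is passed to other methods.
        for (int i = 0; i < 10000; i++)
            orders.Rows.Add(i);

        Console.WriteLine(CountRows(ds));
    }
}
```

So the question to ask isn't "how expensive is it to pass," but "how much data am I keeping alive, and for how long."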
Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
-
mv*@spam.guard.caspershouse.com
"Amadelle" <am******@yahoo.com> wrote in message
news:u5**************@tk2msftngp13.phx.gbl...
Hi all and thanks in advance,
I am thinking of passing a DataSet as a parameter to different
methods throughout my code. My question is: what is a reasonable size for a
DataSet in order to avoid performance deficiencies? How many rows of data
can a DataSet have before it becomes too large?
I appreciate your comments,
Amadelle