Datasets keep the complete result set and related data in memory, so with large
result sets you have to deal with unavoidable performance issues.
If you create data sets and tables programmatically, you must implement some
kind of database engine to persist data and page through big tables and result
sets. That doesn't make much sense considering the availability of various
database engines, both commercial and free. Any replacement will end up
implementing some kind of database engine anyway. XML won't solve your
performance issues and might make the situation worse, since parsing complete
files will clog your memory with lots of residuals. It will also double memory
consumption.
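To make the paging point concrete, here is a minimal sketch using the standard ADO.NET `SqlDataAdapter.Fill` overload that takes a start record and page size. The connection string and the "Orders" table are hypothetical placeholders; substitute your own schema.

```csharp
using System.Data;
using System.Data.SqlClient;

class PagingSketch
{
    // Load one page of rows instead of the whole result set, so only
    // pageSize rows are materialized in the DataTable at a time.
    static DataTable LoadPage(string connectionString, int pageIndex, int pageSize)
    {
        var ds = new DataSet();
        using (var conn = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter("SELECT * FROM Orders", conn))
        {
            adapter.Fill(ds, pageIndex * pageSize, pageSize, "Orders");
        }
        return ds.Tables["Orders"];
    }
}
```

Note that this overload still reads and discards the leading rows on the client, so for very large tables a query that pages on the server (e.g. with TOP or ROW_NUMBER) scales better.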
You're having trouble with 30K rows now. Consider what kind of issues you'll
have with 300K rows. Don't run into a brick wall.
The real question is: what kind of issues are you having with 30K rows? That
number is not that big. Any specific details?
"Brian Richards" <br*******@pharsight.com> wrote in message
news:uT**************@TK2MSFTNGP02.phx.gbl...
Can anyone suggest a replacement object for DataTable, 3rd party or otherwise?
We're running into performance issues with large datasets (>30K rows). Looking
for something that can do binary serialization, typed column/row data, primary
keys, filtering, etc. A SQL desktop engine or some other database is not an
option at this point.
Thanks
Briann