Bytes | Software Development & Data Engineering Community

Large Files with Strongly Typed Dataset

I have an XSD file that I use to generate a strongly typed dataset in my
project. In the past, I have used the ReadXml method to load XML files into
the generated class and then read the data through that class.

These files have now become big (on the order of 100 MB), and this completely
kills ReadXml performance. I have read multiple posts that tell you not to use
datasets for large files. I even tried calling BeginLoadData on each table
before starting the read.
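For reference, the usual pattern for speeding up a bulk load is BeginLoadData/EndLoadData on every table combined with turning off EnforceConstraints during the load, so indexes and constraint checks run once at the end rather than per row. A minimal sketch, where "OrdersDataSet" is a placeholder for your generated typed dataset and "data.xml" a placeholder path:

```csharp
using System.Data;

// Sketch only: "OrdersDataSet" stands in for the XSD-generated typed dataset.
OrdersDataSet ds = new OrdersDataSet();

ds.EnforceConstraints = false;              // defer constraint checking
foreach (DataTable table in ds.Tables)
    table.BeginLoadData();                  // suspend index maintenance and events

ds.ReadXml("data.xml");                     // still parses the whole document

foreach (DataTable table in ds.Tables)
    table.EndLoadData();
ds.EnforceConstraints = true;               // validate once, at the end
```

Note that setting EnforceConstraints back to true will throw if the loaded data violates any constraint, so it doubles as a final validation step. This helps, but it does not change the fact that ReadXml materializes every row in memory.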

However, I really need the strongly typed dataset functionality in my
program - it forms the core of my project. Is there a way I can load this
data into the dataset faster (by using a real database or something
similar)? I would not even mind the read being slow if I could somehow get
status updates while ReadXml is running.
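ReadXml has no progress callback, but it does accept a Stream, so one hedged option is to pass it a FileStream and poll that stream's position from a timer while the load runs. The type name "OrdersDataSet" and the path "data.xml" below are illustrative placeholders:

```csharp
using System;
using System.IO;

// Sketch: estimate ReadXml progress by watching the file position.
using (FileStream fs = new FileStream("data.xml", FileMode.Open, FileAccess.Read))
{
    long total = fs.Length;
    OrdersDataSet ds = new OrdersDataSet();

    var timer = new System.Timers.Timer(1000);   // report once per second
    timer.Elapsed += (sender, e) =>
        Console.WriteLine("Read {0:F1}% of file", 100.0 * fs.Position / total);
    timer.Start();

    ds.ReadXml(fs);                              // blocking load
    timer.Stop();
}
```

Reading fs.Position from the timer thread is not strictly thread-safe for FileStream, but for a coarse progress estimate it is usually adequate.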

If you have tried anything like this, please let me know.


Nov 12 '05 #1

This discussion thread is closed

Replies have been disabled for this discussion.

