Bytes IT Community

Loading large file in memory or not?

I'm writing a program that will query an XML file with the XmlTextReader
class (the question is not specifically about XML, however). This file is
quite large (up to about 20 MB), and under extreme conditions it will have
to be queried thousands of times (up to about 20,000 times). Small
fragments of this file are used to construct another file.

What is most economical in terms of system resources: opening and closing
the file and initializing and destroying the XmlTextReader thousands of
times, or loading the whole file into memory once and picking data from an
in-memory tree? Also, would it be more efficient to query a table in SQL
Server thousands of times?

Thanks,

Gustaf

Nov 15 '05 #1
2 Replies


"Gustaf Liljegren" <gu**************@bredband.net> wrote in message
news:uj**************@TK2MSFTNGP09.phx.gbl...
I'm writing a program that will query an XML file with the XmlTextReader
class (the question is not specifically about XML, however). This file is
very large (maybe 20 MB at most), and during extreme conditions, it'll
have to be queried thousands of times (maybe 20,000 times at most). Small
fragments of this file are used to construct another file.

What is most economical, in terms of system resources? Opening and closing
the file + initializing and destroying the XmlTextReader thousands of
times, or loading all the contents of the file in memory, and picking data
from a tree in memory? Also, would it be more efficient to query a table
in SQL Server thousands of times?

Thanks,

Gustaf


Do it in memory unless you're really strapped for memory. Continually
opening, parsing and closing a file will be really slow, and unless each
time you're only using extremely small fragments to construct your other
file, you'll probably end up reading in a lot of the original anyway. 20 MB
really isn't a lot of memory nowadays, and a tree representation of it in
memory won't be a lot larger. As for SQL, again, you'll trade off speed for
memory usage, but unless your machine really has no spare memory (or you
don't care about speed), just read it in once to a tree you can search
efficiently.
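The load-once approach described above can be sketched as follows. Python's
`xml.etree.ElementTree` is used here purely for illustration, since the
thread concerns .NET's XmlTextReader; the element names, the sample
document, and the `query` helper are all invented for this example:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample standing in for the 20 MB file.
SAMPLE = """
<catalog>
  <item id="a1"><name>alpha</name></item>
  <item id="b2"><name>beta</name></item>
</catalog>
"""

# Parse once: a single pass over the file, kept as a tree in memory,
# instead of re-opening and re-parsing it for every query.
root = ET.fromstring(SAMPLE)

# Build a dictionary index up front so the thousands of subsequent
# queries are O(1) lookups rather than repeated full-tree scans.
index = {item.get("id"): item for item in root.iter("item")}

def query(item_id):
    """Return the <name> text for an item, or None if it is absent."""
    item = index.get(item_id)
    return item.findtext("name") if item is not None else None

print(query("a1"))  # alpha
print(query("b2"))  # beta
```

The one-time indexing cost is what makes 20,000 queries cheap: each lookup
touches the dictionary, not the file or the parser.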

Steve
Nov 15 '05 #2

"Steve McLellan" <sj*@fixerlabs.com.NOSPAM> wrote:
Do it in memory unless you're really strapped for memory - continually
opening, parsing and closing a file will be really slow and unless each
time you're only using extremely small fragments to construct your other
file, you'll probably end up reading a lot of the original in anyway. 20MB
really isn't a lot of memory nowadays, and a tree representation of it in
memory won't be a lot larger.


Thank you very much. That's a convincing answer.

Gustaf

Nov 15 '05 #3

This discussion thread is closed. Replies have been disabled.