I'm writing a program that will query an XML file with the XmlTextReader
class (the question is not specifically about XML, however). This file is
fairly large (maybe 20 MB at most), and under extreme load it'll have
to be queried thousands of times (maybe 20,000 times at most). Small
fragments of this file are used to construct another file.
Which is more economical in terms of system resources: opening and closing
the file (and initializing and destroying the XmlTextReader) thousands of
times, or loading the entire file into memory once and picking data from an
in-memory tree? Also, would it be more efficient to query a table in SQL
Server thousands of times instead?
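To make the comparison concrete, here is a rough sketch of the two approaches I'm weighing. My program is in .NET, but the tradeoff isn't language-specific, so this uses Python's ElementTree as a stand-in; the data and function names are purely illustrative:

```python
import io
import xml.etree.ElementTree as ET

# A small stand-in document (the real file is ~20 MB).
xml_data = "<root>" + "".join(
    f"<item id='{i}'>v{i}</item>" for i in range(100)
) + "</root>"

# Approach 1: re-parse the document for every query,
# analogous to opening/closing the file and creating a
# fresh XmlTextReader each time.
def query_reparse(item_id):
    for _, elem in ET.iterparse(io.StringIO(xml_data)):
        if elem.tag == "item" and elem.get("id") == str(item_id):
            return elem.text
    return None

# Approach 2: parse once, keep the tree (here, plus a lookup
# dict) in memory, and answer every query from it.
tree = ET.fromstring(xml_data)
index = {e.get("id"): e.text for e in tree.iter("item")}

def query_in_memory(item_id):
    return index.get(str(item_id))

# Both return the same fragment; the question is which is
# cheaper when repeated ~20,000 times.
print(query_reparse(7))
print(query_in_memory(7))
```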
Thanks,
Gustaf