
Loading large file in memory or not?

I'm writing a program that will query an XML file with the XmlTextReader
class (the question is not specifically about XML, however). This file is
very large (maybe 20 MB at most), and during extreme conditions, it'll have
to be queried thousands of times (maybe 20,000 times at most). Small
fragments of this file are used to construct another file.

What is most economical in terms of system resources: opening and closing
the file plus initializing and destroying the XmlTextReader thousands of
times, or loading the whole file into memory and picking data from an
in-memory tree? Also, would it be more efficient to query a table in SQL
Server thousands of times?

Thanks,

Gustaf

Nov 15 '05 #1
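A minimal sketch of the two alternatives being weighed here, assuming the System.Xml classes mentioned in the post. The file path ("data.xml") and element name are invented for illustration; the real document structure isn't shown in the thread.

using System;
using System.Xml;

class QueryStrategies
{
    // Alternative 1: open, parse and close the file for every single query.
    // The file handle and the XmlTextReader are created and destroyed each time.
    static string ReadFirstMatchPerQuery(string path, string elementName)
    {
        XmlTextReader reader = new XmlTextReader(path);
        try
        {
            while (reader.Read())
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == elementName)
                    return reader.ReadString();   // text content of the first match
            }
            return null;
        }
        finally
        {
            reader.Close();
        }
    }

    // Alternative 2: parse the whole ~20 MB file into a DOM tree once
    // and keep it around, so later queries never touch the disk.
    static XmlDocument LoadOnce(string path)
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(path);
        return doc;
    }
}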
"Gustaf Liljegren" <gu************ **@bredband.net > wrote in message
news:uj******** ******@TK2MSFTN GP09.phx.gbl...
I'm writing a program that will query an XML file with the XmlTextReader
class (the question is not specifically about XML, however). This file is
very large (maybe 20 MB at most), and during extreme conditions, it'll have
to be queried thousands of times (maybe 20,000 times at most). Small
fragments of this file are used to construct another file.

What is most economical in terms of system resources: opening and closing
the file plus initializing and destroying the XmlTextReader thousands of
times, or loading the whole file into memory and picking data from an
in-memory tree? Also, would it be more efficient to query a table in SQL
Server thousands of times?

Thanks,

Gustaf


Do it in memory unless you're really strapped for memory. Continually
opening, parsing and closing the file will be really slow, and unless each
time you're only using extremely small fragments to construct your other
file, you'll probably end up reading a lot of the original anyway. 20 MB
really isn't a lot of memory nowadays, and a tree representation of it in
memory won't be much larger. As for SQL, again you'd trade speed for memory
usage; unless your machine really has no spare memory (or you don't care
about speed), just read the file in once into a tree you can search
efficiently.

Steve
Nov 15 '05 #2
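For illustration, a minimal sketch of the "read it in once to a tree" approach Steve describes, again using System.Xml. The XPath expression and the record/value names are invented, since the thread doesn't show the real document structure.

using System;
using System.IO;
using System.Xml;

class BuildOutputFromTree
{
    static void Main()
    {
        // One parse of the ~20 MB file into an in-memory DOM tree.
        XmlDocument doc = new XmlDocument();
        doc.Load("data.xml");

        // The thousands of subsequent queries hit the tree, not the disk.
        StreamWriter output = new StreamWriter("output.txt");
        try
        {
            for (int i = 0; i < 20000; i++)
            {
                XmlNode node = doc.SelectSingleNode(
                    "//record[@id='" + i + "']/value");   // hypothetical structure
                if (node != null)
                    output.WriteLine(node.InnerText);     // small fragment copied to the new file
            }
        }
        finally
        {
            output.Close();
        }
    }
}

For read-only querying, an XPathDocument with an XPathNavigator would likely be a bit lighter and faster than XmlDocument, but the principle is the same: pay the parse cost once.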
"Steve McLellan" <sj*@fixerlabs. com.NOSPAM> wrote:
Do it in memory unless you're really strapped for memory - continually
opening, parsing and closing a file will be really slow and unless each time you're only using extremely small fragments to construct your other file,
you'll probably end up reading a lot of the original in anyway. 20MB really isn't a lot of memory nowadays, and a tree representation of it in memory
won't be a lot larger.


Thank you very much. That's a convincing answer.

Gustaf

Nov 15 '05 #3

