Large XML files

We are looking at creating large XML files containing binary data
(encoded as base64) and passing them to transformers that will parse
and transform the data into different formats.

Basically, we have images that have associated metadata and we are
trying to develop a unified delivery mechanism. Our XML documents may
be as large as 1GB and contain up to 100,000 images.
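
For concreteness, a single entry in the file would look roughly like this
(the element names are just placeholders, nothing is finalized yet):

    <images>
      <image id="000001">
        <metadata>
          <title>...</title>
          <captureDate>...</captureDate>
        </metadata>
        <data encoding="base64">...base64-encoded image bytes...</data>
      </image>
      <!-- repeated for up to 100,000 images -->
    </images>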

My question is, has anyone done anything like this before?

What are the performance considerations?

Do the current parsers support this size of XML file?

Has anyone used fast infoset for this type of problem?

Is there a better way to deliver large sets of binary files (e.g. zip
files or something like that)?

Any input would be great. If there is a better board to post this on,
please let me know.

Thx,

Bret

Dec 20 '05 #1
jdev8080 wrote:
> Basically, we have images that have associated metadata and we are
> trying to develop a unified delivery mechanism. Our XML documents may
> be as large as 1GB and contain up to 100,000 images.
>
> My question is, has anyone done anything like this before?
Yes, Andrew Schorr told me that he processes files
of this size. After some experiments with Pyxie, he
now uses xgawk with the XML extension of GNU Awk.

http://home.vrweb.de/~juergen.kahrs/gawk/XML/
> What are the performance considerations?
Andrew stores each item in a separate XML file and
then concatenates all the XML files into one large file,
often larger than 1 GB. My own performance measurements
tell me that a modern PC should parse about 10 MB/s.
> Do the current parsers support this size of XML file?
Yes, but probably only SAX-like parsers.
DOM-like parsers have to store the complete file
in memory and are therefore limited by the amount
of available memory. In practice, no DOM parser to date is able
to read XML files larger than about 500 MB. If I am wrong
about this, I bet that someone will correct me.
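
To make the SAX point concrete, here is a minimal sketch in Java with the
standard javax.xml.parsers API (the <image> element name is made up); the
same streaming idea applies whether the handler is written in Java, C or
the gawk XML extension:

    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    // Counts <image> elements without ever holding the whole tree in memory.
    public class ImageCounter extends DefaultHandler {
        private long count = 0;

        public void startElement(String uri, String localName,
                                  String qName, Attributes attrs) {
            if ("image".equals(qName)) {
                count++;
            }
        }

        public static void main(String[] args) throws Exception {
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            ImageCounter handler = new ImageCounter();
            parser.parse(new java.io.File(args[0]), handler);
            System.out.println("images: " + handler.count);
        }
    }

The handler sees each element as it streams past, so memory use stays flat
no matter how large the file is.
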
> Is there a better way to deliver large sets of binary files (e.g. zip
> files or something like that)?


I store such files in .gz format. When reading them, it
is a good idea _not_ to unzip them to disk first. Use gzip to
produce a decompressed stream of data which is immediately
processed by the SAX parser:

gzip -dc large_file.xml.gz | parser ...

The advantage of this approach is that at any given instant,
only part of the file occupies space in memory. This is
extremely fast and your server can run a hundred such
processes on each CPU in parallel.
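
If the consuming program happens to be a Java SAX parser, the same idea
(never materializing the uncompressed file) can also be done inside the
process with GZIPInputStream; a rough sketch, with the file name and the
empty handler as placeholders:

    import java.io.FileInputStream;
    import java.util.zip.GZIPInputStream;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.helpers.DefaultHandler;

    // Decompresses on the fly and feeds the stream straight to the SAX
    // parser, so the uncompressed XML never has to hit the disk.
    public class GzipSaxDemo {
        public static void main(String[] args) throws Exception {
            GZIPInputStream in =
                new GZIPInputStream(new FileInputStream("large_file.xml.gz"));
            try {
                SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
                parser.parse(in, new DefaultHandler()); // replace with a real handler
            } finally {
                in.close();
            }
        }
    }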
Dec 20 '05 #2
You can also try VTD-XML (http://vtd-xml.sf.net), which uses about
1.3~1.5x the size of the XML file in memory. Currently it only supports
file sizes up to 1GB, so if you have 2GB of physical memory you can load
everything into memory and perform random access on it like DOM (with
DOM itself you would of course get an OutOfMemory exception). Support
for larger files is on the way.
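
The usage pattern looks roughly like this; I am writing the VTD-XML API
from memory, so treat the class and method names as approximate, and the
element names in the XPath are just placeholders:

    import com.ximpleware.AutoPilot;
    import com.ximpleware.VTDGen;
    import com.ximpleware.VTDNav;

    // Parse once, then random-access the document via XPath without
    // building a DOM tree.
    public class VtdDemo {
        public static void main(String[] args) throws Exception {
            VTDGen vg = new VTDGen();
            if (!vg.parseFile("large_file.xml", false)) { // false = namespaces off
                throw new RuntimeException("parse failed");
            }
            VTDNav vn = vg.getNav();
            AutoPilot ap = new AutoPilot(vn);
            ap.selectXPath("/images/image");
            long count = 0;
            while (ap.evalXPath() != -1) {
                count++; // the cursor (vn) is positioned on each match in turn
            }
            System.out.println("matched: " + count);
        }
    }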

"Jürgen Kahrs" <Ju************ *********@vr-web.de> wrote in message
news:40******** *****@individua l.net...
jdev8080 wrote:
Basically, we have images that have associated metadata and we are
trying to develop a unified delivery mechanism. Our XML documents may
be as large as 1GB and contain up to 100,000 images.

My question is, has anyone done anything like this before?


Yes, Andrew Schorr told me that he processes files
of this size. After some experiments with Pyxie, he
now uses xgawk with the XML extension of GNU Awk.

http://home.vrweb.de/~juergen.kahrs/gawk/XML/
What are the performance considerations?


Andrew stores each item in a separate XML file and
the concatenates all the XML files to one large file,
often large than 1 GB. My own performance measurements
tell me that a modern PC should parse about 10 MB/s.
Do the current parsers support this size of XML file?


Yes, but probably only SAX-like parsers.
DOM-like parsers have to store the complete file
in memory and are therefore limited by the amount
of memory. In reality, no DOM parsers to date is able
to read XML files larger than about 500 M. If I am wrong
about this, I bet that someone will correct me.
Is there a better way to deliver large sets of binary files (i.e. zip
files or something like that)?


I store such files in .gz format. When reading them, it
is a good idea _not_ to unzip them. Use gzip to produce
a stream of data which will be immediately processed by
the SAX parser:

gzip -c large_file.xml | parser ...

The advantage of this approach is that at each time instant,
only part of the file will occupy space in memory. This is
extremely fast and your server can run a hundred of such
processes on each CPU in parallel.

Jan 9 '06 #3
