Bytes | Software Development & Data Engineering Community

Garbage Collection Problems: Performance and Optimization for WebService XmlDocument XPath query


I wrote a webservice to output a report file. The fields of the report
are formatted based on information in an in-memory XmlDocument. As
each row of a SqlDataReader is looped through, a lookup is done and the
format information retrieved.

The performance was extremely poor -- producing about 1000 rows per minute.

However, when I used tracing/logging, my results were inconclusive.
First of all, based on the size of the data and the size of the
XmlDocument, I would have expected the whole process per record to be < 1ms.

I put a statement to record the time, to the millisecond, before each
call to the XmlDocument, and in the routine, before and after each XPath
query. Then I put a statement after each line was written to the text
stream.

What was odd was that I could see milliseconds being chewed up in the
code that contributed to the poor performance, but where the time was
chewed up was random! Sometimes the XmlDocument lookup took 0 ms,
sometimes 20-30 ms. Sometimes the clock would add milliseconds in the
loop that retrieved the record from the dataset.

Another thing that puzzled me is that as the program ran, performance
*degraded* -- the whole loop and all the individual processes ran slower
and slower!

To me, this indicates severe problems with MS .NET garbage collection
and memory management.
--
http://www.texeme.com/
Nov 22 '05 #1
You shouldn't be using an in-memory XML document and XPath when you care
about performance. XML, by its very nature, is slow. You should load the
information from the XML file into a normal class and use a hash-map for
the lookup.
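To make the suggestion concrete, here is a rough sketch of what I mean. It assumes the XML looks like the sample later in this thread (`<book id='...'>` nodes); the `FormatLookup` class name and file layout are made up for illustration. Parse the document once at startup, index the nodes into a Dictionary, and then each per-row lookup is a hash probe instead of an XPath scan:

```csharp
using System;
using System.Collections.Generic;
using System.Xml;

public class FormatLookup
{
    readonly Dictionary<string, string> formats = new Dictionary<string, string>();

    // Parse the XML once and index every <book> node by its id attribute,
    // so the per-row lookup is O(1) instead of an XPath scan of the document.
    public FormatLookup(string xmlPath)
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(xmlPath);
        foreach (XmlNode node in doc.SelectNodes("//book[@id]"))
            formats[node.Attributes["id"].Value] = node.InnerText;
    }

    public string Lookup(string id)
    {
        string text;
        return formats.TryGetValue(id, out text) ? text : null;
    }
}
```

You pay the XPath cost once at load time instead of once per report row.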

Jonathan

Nov 22 '05 #2
Jonathan Allen wrote:
You shouldn't be using an in-memory XML document and XPath when you care
about performance. XML, by its very nature, is slow. You should be loading
the information from the XML file into a normal class and use a hash-map for
the lookup.
I used a string array.

But what do you mean -- "by its very nature"? That is meaningless. An
XmlDocument object should essentially be a b-tree in code -- and hence
fast. And my tracing showed that it would sometimes be fast -- 0 ms --
and sometimes slow -- 20-30 ms. Why would it be random -- unless the
.NET memory model is severely flawed?

Why? Also, the performance of the code did not change when I moved it
from a single proc with 0.5 GB of memory and hyperthreading to a dual
proc with hyperthreading and 2 GB of memory.

The performance was /exactly/ the same! How can that be? Does .NET
have inherent limitations in terms of accessing system resources?!

--
http://www.texeme.com
Nov 22 '05 #3


Here's some sample code that shows exactly what I mean.

I've compiled this code, and run it against the attached xml file.

My results, from running on a P4 workstation, are below.

I have compiled this code both with .NET's compiler and with the mono
compiler for Windows ( www.go-mono.com ). The results are exactly the same.

What you see is that the same query, executed over and over again,
sometimes takes 0 ms and then randomly 15-16 ms.

Why would such a thing happen?

16
This is book9999
0
This is book9999
0
This is book9999
0
This is book9999
0
This is book9999
0
This is book9999

[... the same pattern repeats for the remainder of the 1000 iterations:
the elapsed time is 0 for most queries, with an isolated 15 or 16
appearing roughly every ten to thirty queries ...]

0
This is book9999

--
http://www.texeme.com/

using System;
using System.Xml;
using System.Xml.XPath;

namespace XMLSamps
{
    public class readwrite
    {
        static void Main(string[] args)
        {
            // Load the XML document named on the command line
            XmlDocument mydoc = new XmlDocument();
            mydoc.Load(args[0]);
            int beg = 0;

            // Use SelectSingleNode to get the book node where id='9999',
            // timing each query and writing out the elapsed ms and the text
            for (int i = 0; i < 1000; i++)
            {
                beg = DateTime.Now.Second * 1000 + DateTime.Now.Millisecond;
                XmlNode xmn = mydoc.SelectSingleNode("//book[@id='9999']");
                Console.WriteLine((DateTime.Now.Second * 1000 + DateTime.Now.Millisecond) - beg);
                Console.WriteLine(xmn.InnerText);
            }
        }

        static string getPath()
        {
            string path;
            path = System.IO.Path.GetDirectoryName(
                System.Reflection.Assembly.GetExecutingAssembly().GetName().CodeBase);
            return path;
        }
    }
}
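If you want to keep the XPath approach, one variation worth trying is the read-only XPathDocument with a precompiled expression. As far as I can tell, XmlNode.SelectSingleNode has to process the XPath string on every call, whereas XPathExpression.Compile does that work once. This is only a sketch of the same loop, not a claim that it fixes the 15-16 ms spikes:

```csharp
using System;
using System.Xml.XPath;

public class CompiledQuery
{
    public static void Run(string path)
    {
        // XPathDocument is a lighter, read-only store optimized for XPath,
        // compared to the editable XmlDocument DOM.
        XPathDocument doc = new XPathDocument(path);
        XPathNavigator nav = doc.CreateNavigator();

        // Compile the expression once, outside the loop.
        XPathExpression expr = XPathExpression.Compile("//book[@id='9999']");

        for (int i = 0; i < 1000; i++)
        {
            XPathNodeIterator it = nav.Select(expr);
            if (it.MoveNext())
                Console.WriteLine(it.Current.Value);
        }
    }
}
```

Note the query itself still has to scan for matching nodes; compiling only removes the per-call expression parsing.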
Nov 22 '05 #4
Thanks for the sample. One note: you might want to try using "Ticks" instead
of seconds and milliseconds. (It doesn't change the result, I just find it
helpful.)

Jonathan
Nov 22 '05 #5
> But what do you mean -- "by it's very nature" -- that is meaningless.

This is the best explanation I've read about why you should avoid using XML
as much as possible.

http://www.joelonsoftware.com/articl...000000319.html

That said, I would like to tell you the lesson I keep forgetting: "Don't
worry about performance until it becomes an issue". If using XML internally
is "fast enough", then don't go off and start building your own classes.
Concentrate on areas where making improvements will actually be noticeable
to the user.
And my tracing showed that it would sometimes be fast --
0ms and sometimes slow - 20-30ms. Why would it be random -- unless the
.Net memory model is severely flawed.
I think it is because you are running multiple applications. That 16 ms
could be the amount of time it takes Windows to check to see if any other
programs want to run.
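Another possibility for that 0/16 pattern: DateTime.Now only advances when the system clock ticks, and on Windows that tick is roughly every 15-16 ms by default. So anything finishing between ticks reads as 0 ms, and a query that happens to straddle a tick reads as 15-16 ms, no matter how fast it really was. If you can use the .NET 2.0 Stopwatch class (not available in 1.1, so this is only an option going forward), it reads the high-resolution performance counter instead:

```csharp
using System;
using System.Diagnostics;

public class TimerResolution
{
    public static double MeasureMs()
    {
        // Stopwatch uses the high-resolution performance counter where
        // available, so it can resolve times well under the ~15.6 ms
        // system-clock tick that DateTime.Now is limited to.
        Stopwatch sw = Stopwatch.StartNew();
        long sum = 0;
        for (int i = 0; i < 100000; i++)
            sum += i;   // some sub-millisecond work to time
        sw.Stop();
        return sw.Elapsed.TotalMilliseconds;
    }
}
```

With DateTime.Now the loop above would almost always report 0; Stopwatch reports a small nonzero fraction of a millisecond.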

Jonathan


Nov 22 '05 #6

