Bytes IT Community

Fastest possible serialization

_DD
I had once experimented with binary serialization of an ArrayList of
structs (each struct mostly contains strings). Strangely enough, it
did not run as fast as my custom XML storage (the latter was nothing
fancy, but did not use normal serialization). I was not pressed to
optimize runtime at the time, so I just went with XML.

The situation has shifted, and I have to figure out the *fastest
possible* way to save and reload that ArrayList. I remember seeing
something on optimizing serialization but could not locate it. Are
there recommended approaches? (again with speed as the major concern)
May 10 '06 #1
3 Replies


_DD,

The fastest possible serialization will come from creating a custom
routine which writes binary representations of the field values to a
stream. You will have to code the routine for each type, since you would
otherwise face the overhead of reflection, which is what serialization
"suffers" from now.
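
A minimal sketch of the kind of hand-rolled routine described above,
using BinaryWriter/BinaryReader (the Record struct and its field names
are hypothetical stand-ins for the poster's actual types):

```csharp
using System;
using System.Collections;
using System.IO;

// Hypothetical struct standing in for the poster's string-heavy record.
public struct Record
{
    public string Name;
    public string Value;

    public void Write(BinaryWriter writer)
    {
        // BinaryWriter length-prefixes strings, so no delimiters are needed.
        writer.Write(Name);
        writer.Write(Value);
    }

    public static Record Read(BinaryReader reader)
    {
        Record r;
        r.Name = reader.ReadString();
        r.Value = reader.ReadString();
        return r;
    }
}

public static class RecordStore
{
    public static void Save(ArrayList list, string path)
    {
        using (BinaryWriter writer = new BinaryWriter(File.Create(path)))
        {
            writer.Write(list.Count);              // record count header
            foreach (Record r in list)
                r.Write(writer);
        }
    }

    public static ArrayList Load(string path)
    {
        using (BinaryReader reader = new BinaryReader(File.OpenRead(path)))
        {
            int count = reader.ReadInt32();
            ArrayList list = new ArrayList(count); // pre-size to avoid regrowth
            for (int i = 0; i < count; i++)
                list.Add(Record.Read(reader));
            return list;
        }
    }
}
```

Since BinaryWriter length-prefixes each string, reading back is a
straight sequence of ReadString calls with no parsing, which is where
most of the win over XML comes from.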

Not that I have a problem with optimizations, but you have to weigh the
benefits of these optimizations. You will have to add code to every type
every time you add a field.

What are your performance goals? Going for "as fast as possible" isn't
really a good performance goal. Rather, you should have a benchmark and
try to hit that. You might find other areas of the application where you
can make optimizations which would not be so broad (you are talking about
a custom solution which requires an implementation for every type that
participates in it, which can be a maintenance nightmare, to say the
least) and which would still help you hit your performance goals.

Hope this helps.

--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

May 10 '06 #2

_DD
On Mon, 8 May 2006 23:29:41 -0400, "Nicholas Paldino [.NET/C# MVP]"
<mv*@spam.guard.caspershouse.com> wrote:
> The fastest possible serialization will be from creating a custom
> routine which will write binary representations of the field values to a
> stream. You will have to code the routines for each type, as you will face
> an overhead if you use reflection. This is what serialization "suffers"
> from now.
The first method was pretty much by the book, but I may have missed
something. It took about twice as long as parsing the equivalent XML
file.

Do you have a feel for how much of a performance hit is imposed by
reflection? I thought that serialization could be fairly fast,
especially compared to verbose XML. That was a shock.
> Not that I have a problem with optimizations, but you have to weigh the
> benefits of these optimizations. You will have to add code to every type
> every time you add a field.
>
> What are your performance goals? Going for "as fast as possible" isn't
> really a good performance goal. Rather, you should have a benchmark and
> try to hit that.
If there were a spec for this particular function, it would be: "fast
as hell" <g> Seriously, it is one of those things that will end up
'inline' with user interaction, and the most entertaining progress bar
in the world will not help. I'm clocking 2+ minutes loading the XML
file. Serialization took over 4 minutes.

Given the user 'eye-glaze' factor, I'd love to load the file in 15
seconds. I didn't think that would be out of the question. The
largest file is 60 megabytes, which can be drag-drop copied in about 5
seconds.
> Hope this helps.


Always, Nicholas!
May 10 '06 #3

_DD
On Mon, 8 May 2006 23:29:41 -0400, "Nicholas Paldino [.NET/C# MVP]"
<mv*@spam.guard.caspershouse.com> wrote:
> The fastest possible serialization will be from creating a custom
> routine which will write binary representations of the field values to a
> stream. You will have to code the routines for each type, as you will face
> an overhead if you use reflection. This is what serialization "suffers"
> from now.


PS: Aren't there some tricks for getting normal serialization to run
faster?
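
[One commonly cited trick is to implement ISerializable, so that
BinaryFormatter calls your GetObjectData/deserialization-constructor
pair instead of reflecting over the fields on every object. A sketch,
with hypothetical field and key names:]

```csharp
using System;
using System.Runtime.Serialization;

// Implementing ISerializable lets BinaryFormatter skip per-field
// reflection and call these two members directly. The struct, its
// fields, and the short key names are hypothetical.
[Serializable]
public struct Record : ISerializable
{
    public string Name;
    public string Value;

    // Deserialization constructor: BinaryFormatter calls this on load.
    private Record(SerializationInfo info, StreamingContext context)
    {
        Name = info.GetString("n");
        Value = info.GetString("v");
    }

    // Called on save; short keys keep the stream a little smaller.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("n", Name);
        info.AddValue("v", Value);
    }
}
```

[This avoids the reflection cost, but you still pay BinaryFormatter's
per-object bookkeeping, so the raw stream-writing approach Nicholas
describes is usually still faster.]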
May 10 '06 #4

This discussion thread is closed

Replies have been disabled for this discussion.