_DD,
The fastest possible serialization will come from a custom routine that
writes binary representations of the field values directly to a stream. You
will have to code the routines for each type, because otherwise you face
the reflection overhead that standard serialization "suffers" from now.
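To make that concrete, here is a minimal sketch of the kind of hand-rolled
routine I mean, using BinaryWriter/BinaryReader. The Person struct and its
fields are hypothetical stand-ins -- substitute your own struct and repeat
the pattern for each field:

```csharp
using System;
using System.Collections;
using System.IO;

// Hypothetical struct for illustration -- use your own fields here.
struct Person
{
    public string Name;
    public string City;

    // Hand-coded binary write: no reflection, just field-by-field output.
    public void Write(BinaryWriter w)
    {
        w.Write(Name);
        w.Write(City);
    }

    public static Person Read(BinaryReader r)
    {
        Person p;
        p.Name = r.ReadString();
        p.City = r.ReadString();
        return p;
    }
}

class BinaryStore
{
    public static void Save(ArrayList list, Stream s)
    {
        BinaryWriter w = new BinaryWriter(s);
        w.Write(list.Count);                  // length prefix for reload
        foreach (Person p in list)
            p.Write(w);
        w.Flush();
    }

    public static ArrayList Load(Stream s)
    {
        BinaryReader r = new BinaryReader(s);
        int count = r.ReadInt32();
        ArrayList list = new ArrayList(count);
        for (int i = 0; i < count; i++)
            list.Add(Person.Read(r));
        return list;
    }
}
```

Note the downside right in the sketch: every new field means another pair
of Write/Read lines in every type that participates.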
Not that I have a problem with optimizations, but you have to weigh the
benefits of these optimizations. You will have to add code to every type
every time you add a field.
What are your performance goals? Going for "as fast as possible" isn't
really a good performance goal. Rather, you should have a benchmark and try
to hit that. You might find other areas of the application where you can
make optimizations which would not be so broad (you are talking about a
custom solution which requires implementation on every type that
participates in it, which can be a maintenance nightmare, to say the least)
and help you hit your performance goals.
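By a benchmark I mean something as simple as timing each candidate against
a concrete budget. A sketch, assuming a hypothetical SaveWithXml stand-in
for your existing save path (Stopwatch is .NET 2.0; on 1.1 you could fall
back to Environment.TickCount):

```csharp
using System;
using System.Diagnostics;

// Declared delegate keeps this sketch usable on older frameworks.
delegate void SaveRoutine();

class Benchmark
{
    // Time N repetitions of whichever save path you are testing.
    public static long TimeIt(int iterations, SaveRoutine routine)
    {
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            routine();
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }

    static void SaveWithXml()
    {
        // hypothetical: your existing XML save would go here
    }

    static void Main()
    {
        const long budgetMs = 200;  // example target: 200 ms per 1000 saves
        long elapsed = TimeIt(1000, new SaveRoutine(SaveWithXml));
        Console.WriteLine("XML save: {0} ms (budget {1} ms)", elapsed, budgetMs);
    }
}
```

Once you have numbers like that against a budget, you know whether the
custom binary route is even worth its maintenance cost.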
Hope this helps.
--
- Nicholas Paldino [.NET/C# MVP]
-
mv*@spam.guard.caspershouse.com
"_DD" <_D*@nospam.com> wrote in message
news:vb********************************@4ax.com...
I had at one point experimented with binary serialization of an ArrayList
of structs (each struct mostly contains strings). Strangely enough, it
did not run as fast as custom XML storage (the latter was nothing fancy,
but did not use normal serialization). I was not pressed to optimize
runtime at the time, so I just went with XML.
The situation has shifted, and I have to figure out the *fastest
possible* way to save and reload that ArrayList. I remember seeing
something on optimizing serialization but could not locate it. Are
there recommended approaches? (again with speed as the major concern)