Bytes IT Community

System.OutOfMemoryException during serialization

Hi to all,

I am getting a System.OutOfMemoryException when calling the
Runtime.Serialization.Formatters.Binary.BinaryFormatter.Serialize(<stream>, <Obj>) method.
The type of <stream> is IO.MemoryStream.

=====Exception:
System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
   at System.IO.MemoryStream.set_Capacity(Int32 value)
   at System.IO.MemoryStream.EnsureCapacity(Int32 value)
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.BinaryWriter.Write(Double value)
   at System.Runtime.Serialization.Formatters.Binary.__BinaryWriter.WriteValue(InternalPrimitiveTypeE code, Object value)
   at System.Runtime.Serialization.Formatters.Binary.__BinaryWriter.WriteMember(NameInfo memberNameInfo, NameInfo typeNameInfo, Object value)
   at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.WriteKnownValueClass(NameInfo memberNameInfo, NameInfo typeNameInfo, Object data)
   at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.WriteMembers(NameInfo memberNameInfo, NameInfo memberTypeNameInfo, Object memberData, WriteObjectInfo objectInfo, NameInfo typeNameInfo, WriteObjectInfo memberObjectInfo)
   at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.WriteMemberSetup(WriteObjectInfo objectInfo, NameInfo memberNameInfo, NameInfo typeNameInfo, String memberName, Type memberType, Object memberData, WriteObjectInfo memberObjectInfo)
   at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.Write(WriteObjectInfo objectInfo, NameInfo memberNameInfo, NameInfo typeNameInfo, String[] memberNames, Type[] memberTypes, Object[] memberData, WriteObjectInfo[] memberObjectInfos)
   at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.Write(WriteObjectInfo objectInfo, NameInfo memberNameInfo, NameInfo typeNameInfo)
   at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.Serialize(Object graph, Header[] inHeaders, __BinaryWriter serWriter, Boolean fCheck)
   at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Serialize(Stream serializationStream, Object graph, Header[] headers, Boolean fCheck)
   at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Serialize(Stream serializationStream, Object graph)
   at SerializationBenchmark.Module1.Main() in C:\Develop\Projects\Visual Studio 2005\SerializationBenchmark\SerializationBenchmark\Module1.vb:line 41

=============
I built a benchmark application to reproduce the error (constructing
progressively larger objects to serialize until the error occurs).
To understand the problem, I collected the following information:
Serialization stream length (before the exception is thrown): 16,777,214 bytes
Allocated memory: 1,028,914,416 bytes
[to get the memory figure I called GC.GetTotalMemory(False)]

Please note my system is an x64 workstation with 6 GB RAM (running Windows
Server 2003 R2 SP2 Enterprise). When the exception was thrown, the total
memory load of the system was about 4,162 MB.

The application configuration targets specifically x86 platform.

Can anybody explain this behaviour and how to work around it? I cannot
understand why the exception occurs, since the length of the stream is lower
than Int32.MaxValue and I still have free RAM.
Any help will be appreciated.
Thank you

Jun 23 '07 #1
8 Replies



There is currently a 2 GB limit for all CLR objects, no matter what OS you
are running on, 32- or 64-bit.

There is also a practical limit, dependent on the current system conditions
(memory fragmentation), that restricts growth to the largest contiguous
block of memory available.

A trick is to set the maximum capacity beforehand, because if the CLR needs
to resize the object it needs twice the memory it currently holds.

I hope this helps a bit.
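The "set the maximum capacity beforehand" advice could look like the sketch below. All names are hypothetical, and the 32 MB capacity is an assumption; size it to the largest payload you expect to serialize.

```vbnet
' Sketch: pass the expected capacity to the MemoryStream constructor so the
' CLR never has to grow (and transiently double) the internal buffer.
Imports System.IO
Imports System.Runtime.Serialization.Formatters.Binary

Module PreSizedSerialization
    Sub Main()
        Dim data(999999) As Double                          ' example payload, ~8 MB
        Using stream As New MemoryStream(32 * 1024 * 1024)  ' reserve 32 MB up front
            Dim formatter As New BinaryFormatter()
            formatter.Serialize(stream, data)
            Console.WriteLine("Serialized {0:N0} bytes", stream.Length)
        End Using
    End Sub
End Module
```

With the capacity reserved up front, Write() never triggers EnsureCapacity's reallocation, so no second, doubled buffer is ever needed.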

"Piggy" wrote:
>[original question quoted in full; trimmed]
Jun 25 '07 #2


Had some more time to back up my statements :-)

http://msdn2.microsoft.com/en-us/lib...64(VS.80).aspx
http://blogs.msdn.com/joshwil/archiv...10/450202.aspx

Please note the following: even on a 64-bit system the per-object limit is
2 GB; however, if the system has enough RAM you can allocate multiple 2 GB
objects.

On a 32-bit system you cannot allocate a full 2 GB, so there certainly is an
advantage to having a 64-bit system and compiling in 64-bit mode.
HTH

Michel


"M. Posseth" wrote:
>[reply #2 and the original question quoted in full; trimmed]
Jun 25 '07 #3

> there is currently a 2 GB limit for all CLR objects, no matter what OS
> you are running on, 32- or 64-bit.

Not true. 64-bit .NET has 64-bit pointers, enabling you to use more memory.
That is the whole point of a 64-bit memory model.

> There is also a practical limit, dependent on the current system
> conditions (memory fragmentation), that restricts growth to the largest
> contiguous block of memory available.

True, though fragmentation is greatly reduced in 64-bit land. Current
systems max out at about 32 GB, which is a very small percentage of the
total addressable space. There will always be additional page table entries
to map, too.

I have never seen "Out of Memory" in the 64-bit world. However, swap files
are very slow.

> a trick is to set the maximum capacity beforehand, because if the CLR needs

In some Java VMs you can set the max size. I have never heard of this in the
.NET world. Have any links to back this up?

> to resize the object it needs twice the memory it currently holds

It would have to be greater than 2x; otherwise you would get 2 identical
objects, instead of an old one needing to be GC'd and a new, larger one.

> i hope this helps a bit

Nope, sorry.
Jun 25 '07 #4


Robert = Piggy???

Well, I am not active in these newsgroups to tell fairy tales.

I just shared my knowledge regarding this matter, and for your info I have
programmed on the 64-bit platform (coded with beta versions of Visual
Studio 2005 on a beta version of Windows 2003 x64, so I guess I was an
early adopter of the 64-bit platform).

And all of my 64-bit programs work; one of them reads 17 GB MySQL dump
files.
P.s.

I remember when I had a problem and posted in a newsgroup or forum, I was
very pleased when somebody tried to help me with my problems. Nowadays you
are being attacked when the answer is not what the person expected or
thinks is right.
Thank you very much
Michel Posseth



"Robert" <no@spam.com> wrote in message
news:ui**************@TK2MSFTNGP03.phx.gbl...
>[reply #3 quoted in full; trimmed]


Jun 25 '07 #5

Robert wrote:
> I have never seen "Out of Memory" in the 64 bit world.

Well, that would be easy to fix:

public int Test { get { return Test; } }

;)

> > to resize the object it needs twice the memory it currently holds
>
> it would have to be greater than 2x, otherwise you would get 2 identical
> objects, instead of an old one needing to be GC'd, and a new larger one.

That depends on your view. :)

The newly allocated array is twice the size of the previous one, so
together they are three times the previous size.
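That arithmetic can be checked with a small sketch. The 256-byte starting capacity and the pure-doubling growth policy are assumptions inferred from the EnsureCapacity frames in the stack trace above; treat the exact figures accordingly.

```vbnet
' Sketch: simulate capacity doubling. When the buffer grows from N to 2N
' bytes, the old N-byte buffer and the new 2N-byte buffer must coexist until
' the copy completes -- a transient need of 3N, matching the "three times".
Module GrowthDemo
    Sub Main()
        Dim oldCapacity As Integer = 256
        Dim target As Integer = 16777214            ' stream length from the post
        Do While oldCapacity < target
            Dim newCapacity As Integer = oldCapacity * 2
            ' transient peak here: oldCapacity + newCapacity = 3 * oldCapacity
            oldCapacity = newCapacity
        Loop
        Console.WriteLine("Final capacity: {0:N0}", oldCapacity)          ' 16,777,216
        Console.WriteLine("Transient peak: {0:N0}", oldCapacity * 3 \ 2)  ' old + new
    End Sub
End Module
```

Note how the observed 16,777,214-byte stream sits right at the 16,777,216-byte (2^24) capacity boundary, which is consistent with the failure happening on the very next doubling attempt.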

--
Göran Andersson
_____
http://www.guffa.com
Jun 25 '07 #6

>[reply #4 quoted in full; trimmed]
This is phrased poorly: "there is currently a 2GB limit for all CLR
objects". It should read "2GB limit for *each* CLR *value type*". I
interpreted the original "ALL" as just that: all your objects, taken
together, cannot exceed 2 GB.

I was not wanting to attack, but when somebody posts an "answer" that in my
experience is factually incorrect, then a follow-up post requesting
additional details is warranted... When posting, it pays to be precise.

I still have zero clue what was meant by "set the maximum capacity on
forehand". Perhaps you meant something along the lines of "Declare your
arrays at the module level, set the size to the maximum needed capacity at
initialization, and then re-use them instead of de-allocating them"?!?
That would greatly reduce memory fragmentation in the 32-bit world. But the
original statement was unclear, giving no real direction on how to
implement a fix, and googling the keywords in the original statement would
not give relevant results. I feel the keywords in my rewrite are more
likely to be useful.
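One way to read that re-use suggestion as code is sketched below. All names are hypothetical, and the 64 MB bound is an assumed maximum payload size.

```vbnet
' Sketch: allocate one stream at its maximum needed capacity during
' initialization, then rewind and re-use it for every serialization instead
' of de-allocating and re-growing a fresh buffer each time.
Imports System.IO
Imports System.Runtime.Serialization.Formatters.Binary

Module BufferReuse
    ' Sized once, up front, at the assumed maximum payload.
    Private ReadOnly SharedStream As New MemoryStream(64 * 1024 * 1024)

    Public Sub SerializeInto(ByVal graph As Object)
        SharedStream.SetLength(0)   ' rewind; the allocated capacity is retained
        Dim formatter As New BinaryFormatter()
        formatter.Serialize(SharedStream, graph)
    End Sub
End Module
```

Because the buffer is allocated once and never released, repeated serializations neither fragment the heap nor hit the transient doubling cost.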

NOT Piggy. Check the headers.

Sorry it came across as an attack. I was aiming for constructive criticism
of your writing style. Perhaps you are not a native English speaker?
i.e. "Ferry Tales" - regarding boats, perhaps? :)
"Robert" <no@spam.com> wrote in message
news:ui**************@TK2MSFTNGP03.phx.gbl...
>>[reply #3 quoted in full; trimmed]

Jun 26 '07 #7

> > I have never seen "Out of Memory" in the 64 bit world.
>
> Well, that would be easy to fix:
>
> public int Test { get { return Test; } }
>
> ;)

:)

1) This is a VB group; your example looks rather C#-ish.
2) I think, not sure though, that VB optimizes this out.
3) I think you would get a stack overflow, not an OOM.
4) I stand by "never seen System.OutOfMemoryException":
   a) Windows allocates VM space like mad
   b) Everything gets slow
   c) Everything gets paged out, even large parts of the OS
   d) Everything gets slower (clicking another app here takes MINUTES to
      switch)
   e) Windows crashes..

I actually learned #4 the hard way about 4 months ago.

Not an attack..
Jun 26 '07 #8

P: n/a
Robert,

MS describes it like this:

"there is a 2GB limit on the size of an object you can create while running
a 64-bit managed application on a 64-bit Windows operating system."

And yes, I have a good excuse for using wrong grammar, misspelled words, or
even strange sentences, as I am indeed not a native English speaker ;-)


regards

Michel


"Robert" wrote:
>[reply #6 quoted in full; trimmed]
Jun 26 '07 #9

This discussion thread is closed

Replies have been disabled for this discussion.