Bytes IT Community

Idea: GC and IDisposable

Hi,

I'm sure I'm simplifying things here, but how about if the GC did this to
objects that implement IDisposable:
1. Always Generation 1 (I think that is the correct name)
2. Get aggressive with them:
a. Nuke 'em on a GC.Collect call (or equivalent)
b. Call Dispose on the object.

Yes, I understand that the compiler and the GC don't know what IDisposable
is, but surely it is a worthwhile consideration? This garbage/resource
collection is particularly important in the CF world.

Hilton
Sep 24 '07 #1
47 Replies


Hilton,

This idea doesn't do much, really. It only serves to reward people for
not being aware of the implementation of their designs.

If you are not aware of the lifespans of the objects that you are using,
then that is an error, either in design, or implementation, or the
understanding of the underlying technologies and it should be fixed. If
something implements IDisposable, that should be a huge red flag that tells
the developer "hey, YOU (not the GC) need to pay extra attention to me, as I
am doing something that you want to pay attention to, and not just leave
lying around."
--
- Nicholas Paldino [.NET/C# MVP]
- mv*@spam.guard.caspershouse.com

Sep 24 '07 #2

Hilton wrote:
<snip>
The IDisposable interface is intended for the developer to be able to
control the life cycle of an object. That doesn't only mean to kill it
in a predictable way, but sometimes also to keep it alive in a
predictable way.

--
Göran Andersson
_____
http://www.guffa.com
Sep 24 '07 #3

Agreed. Implementing IDisposable is a flag to the consumer, not the GC.
The GC doesn't know or care about IDisposable objects. If it did, then you
would have the issues currently surrounding finalizers to contend with (and
that's the exact reason it's recommended that you implement IDisposable
instead of using a finalizer).

There's no way to "automate" this process without causing pain for those of
us who like performance or adding the requirement of multiple collection
cycles to release resources.

--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com
Sep 24 '07 #4

The IDisposable interface is intended for the developer to be able to
control the life cycle of an object. That doesn't only mean to kill it in
a predictable way, but sometimes also to keep it alive in a predictable
way.
Exactly. I sure wouldn't like all my Form and Control-derived objects being
disposed on every GC, yet I'm pretty sure they implement IDisposable.
Sep 24 '07 #5

On Sep 24, 4:48 pm, Göran Andersson <gu...@guffa.com> wrote:
The IDisposable interface is intended for the developer to be able to
control the life cycle of an object. That doesn't only mean to kill it
in a predictable way, but sometimes also to keep it alive in a
predictable way.
That was a very well thought out response.

Sep 25 '07 #6

"<ctacke/>" <ctacke[at]opennetcf[dot]com> wrote in message
news:e%****************@TK2MSFTNGP06.phx.gbl...
Agreed. Implementing IDisposable is a flag to the consumer, not the GC.
The GC doesn't know or care about IDisposable objects. If it did, then
you would have the issues currently surrounding finalizers to contend with
(and that's the exact reason it's recommended that you implement
IDisposable instead of using a finalizer).

There's no way to "automate" this process without causing pain for those
of us who like performance or adding the requirement of multiple
collection cycles to release resources.
Sure there is. How would you call Dispose? You'd call it on an object -
right? That means that it would not be GC'd (yet) since you are holding a
reference to it, so everything works just fine as it always has. Now when
this object can be GC'd (no refs etc), then 'my' logic kicks in, disposes
the object and frees the memory ***IF*** it hasn't already been disposed.
It's just a safety net, not a new paradigm.

Your code is not affected, you can continue to optimize, but the GC will
dispose of any objects that need to be disposed, but weren't. Sounds good
to me.

Hilton
Sep 25 '07 #7

"Göran Andersson" <gu***@guffa.com> wrote in message
news:eU**************@TK2MSFTNGP05.phx.gbl...
The IDisposable interface is intended for the developer to be able to
control the life cycle of an object. That doesn't only mean to kill it in
a predictable way, but sometimes also to keep it alive in a predictable
way.
Just holding a reference to it will keep it alive as it always has.

Hilton
Sep 25 '07 #8

"Nicholas Paldino [.NET/C# MVP]" <mv*@spam.guard.caspershouse.com> wrote in
message news:eB**************@TK2MSFTNGP06.phx.gbl...
Hilton,

This idea doesn't do much, really. It only serves to reward people for
not being aware of the implementation of their designs.

If you are not aware of the lifespans of the objects that you are
using, then that is an error, either in design, or implementation, or the
understanding of the underlying technologies and it should be fixed. If
something implements IDisposable, that should be a huge red flag that
tells the developer "hey, YOU (not the GC) need to pay extra attention to
me, as I am doing something that you want to pay attention to, and not
just leave lying around."
That's a great argument for using free/malloc instead of a GC. A GC is
there so that you don't have to worry about this stuff. All we've done is
reduce the set of objects we're watching from "all" to "some". I'm saying
that adding an "if (!Disposed) o.Dispose()" check would solve a lot of bugs
and memory leaks - we see them here all the time. I see it in the CF too.

Hilton
Sep 25 '07 #9

But now the GC has to track every object that you've called Dispose on. And
how does it know what resources it holds? Should it keep track of all
objects held within some class that called Dispose? What if those objects
have a Dispose method? Should the GC call that too? What if it's
already been called? Now we need to track that. How about its internal
objects? Track them too? It gets real ugly real fast. The metadata the GC
would have to hold would be large, and the time required to Collect would
lengthen substantially. It's a lot easier, cleaner and certainly faster to
have the one who should know what needs to be Disposed - the application -
handle it.

The GC already spends more time than I'd like doing stuff I can't control.
The last thing I need is it doing more.
--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com

Sep 25 '07 #10

On 24 Sep., 22:46, "Hilton" <nos...@nospam.com> wrote:
<snip>
I have to agree with the other posters here; I don't think this should
be something the GC handles by itself.
However, if you want your own classes to exhibit a similar behavior,
you can implement a finalizer that calls Dispose, and call
GC.SuppressFinalize(this) when Dispose is called manually.
That makes sure that Dispose does get called, even if it's in a non-
deterministic way. I believe having a finalizer still causes a small
overhead (even if it never runs), but I still do it with
classes that implement IDisposable to make sure clean-up happens at
some point.

hth,
Kevin Wienhold

Sep 25 '07 #11

"KWienhold" <he******@trashmail.net> wrote in message
news:11**********************@r29g2000hsg.googlegr oups.com...
>However, if you want your own classes to exhibit a similar behavior,
>you could implement a finalizer that calls Dispose and call
>GC.SuppressFinalize(this) if Dispose is called manually.

Whoa, wait. Finalize should NEVER call Dispose to clean up managed
resources, since the managed resource references may no longer exist!
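Doug's rule is the crux of the standard two-method dispose pattern: the finalizer path must touch only unmanaged state, because managed members may already have been finalized. A minimal sketch of that pattern (the `ResourceHolder` class and its `_handle`/`_buffer` members are hypothetical illustrations, not anything from the thread):

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;

public class ResourceHolder : IDisposable
{
    private IntPtr _handle = Marshal.AllocHGlobal(16);   // stand-in unmanaged resource
    private MemoryStream _buffer = new MemoryStream();   // stand-in managed resource
    private bool _disposed;

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this); // cleaned up explicitly, so skip finalization
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed) return;
        if (disposing)
        {
            // Managed members are only safe to touch on the explicit Dispose path.
            _buffer.Dispose();
        }
        // Unmanaged cleanup runs on both paths.
        if (_handle != IntPtr.Zero)
        {
            Marshal.FreeHGlobal(_handle);
            _handle = IntPtr.Zero;
        }
        _disposed = true;
    }

    // Finalizer path: never dereferences managed members.
    ~ResourceHolder() => Dispose(false);
}
```

The `disposing` flag is what encodes Doug's warning: `false` means "called from the finalizer, managed references are off limits."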

--
Doug Semler, MCPD
a.a. #705, BAAWA. EAC Guardian of the Horn of the IPU (pbuhh).
The answer is 42; DNRC o-
Gur Hfrarg unf orpbzr fb shyy bs penc gurfr qnlf, abbar rira
erpbtavmrf fvzcyr guvatf yvxr ebg13 nalzber. Fnq, vfa'g vg?

Sep 25 '07 #12

Hilton <no****@nospam.com> wrote:
If you are not aware of the lifespans of the objects that you are
using, then that is an error, either in design, or implementation, or the
understanding of the underlying technologies and it should be fixed. If
something implements IDisposable, that should be a huge red flag that
tells the developer "hey, YOU (not the GC) need to pay extra attention to
me, as I am doing something that you want to pay attention to, and not
just leave lying around."

That's a great argument for using free/malloc instead of a GC. A GC is
there so that you don't have to worry about this stuff.
No, a GC is there so that you don't have to worry about this stuff *for
memory resources*. Not for other resources.
All we've done is reduce the set of objects we're watching from "all" to
"some".
That's a pretty big improvement, IMO.
I'm saying that adding an "if (!Disposed) o.Dispose()" check would solve a
lot of bugs and memory leaks - we see them here all the time.
No, it wouldn't. You'd still have race conditions - you'd just see them
*slightly* less often, making them harder to find and fix. Any
IDisposable which directly has a handle on an unmanaged resource should
have a finalizer which calls Dispose anyway.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
Sep 25 '07 #13

Hilton wrote:
[...]
Your code is not affected, you can continue to optimize, but the GC will
dispose of any objects that need to be disposed, but weren't. Sounds good
to me.
I'm not sure I understand the point you're trying to make. How is what
you're suggesting different from the already-existing finalizer
paradigm? Other than the fact that the finalizer needs to explicitly be
written to call Dispose(), I mean.

Is that all it is? Or is there something different you're getting at?

Pete
Sep 25 '07 #14

Doug Semler wrote:
[...]
Whoa, wait. Finalize should NEVER call Dispose to clean up managed
resources, since the managed resource references may no longer exist!
I don't disagree. But I was just assuming that we were talking about
unmanaged resources here. After all, those are the ones that cause the
most trouble if not disposed properly.

Did I miss something?

Pete
Sep 25 '07 #15

On 25 Sep., 08:37, "Doug Semler" <dougsem...@gmail.com> wrote:
>Whoa, wait. Finalize should NEVER call Dispose to clean up managed
>resources, since the managed resource references may no longer exist!
I was thinking of unmanaged resources, since those are normally the
ones that need to be cleaned up in the Dispose.
The GC handles cleaning up of managed resources just fine as far as I
know, so I don't really understand why one would even use IDisposable
on a class containing only managed resources.

Kevin Wienhold

Sep 25 '07 #16

KWienhold wrote:
I was thinking of unmanaged resources, since those are normally the
ones that need to be cleaned up in the Dispose.
Me too. :)
The GC handles cleaning up of managed resources just fine as far as I
know, so I don't really understand why one would even use IDisposable
on a class containing only managed resources.
I suppose one reason might be if you had a class that you wanted to keep
a reference to, but you were only interested in some post-processing
state and wanted to go ahead and release whatever managed resources were
used to get to that state.

I can't think of a class off the top of my head that uses Dispose() in
that way, but it wouldn't surprise me if it exists.

But this isn't a case where you need the GC to do the Dispose() for you.
It's more an example of explicitly controlling the lifetime of resources.

Pete
Sep 25 '07 #17

On 25 Sep., 10:27, Peter Duniho <NpOeStPe...@NnOwSlPiAnMk.com> wrote:
I suppose one reason might be if you had a class that you wanted to keep
a reference to, but you were only interested in some post-processing
state and wanted to go ahead and release whatever managed resources were
used to get to that state.

I can't think of a class off the top of my head that uses Dispose() in
that way, but it wouldn't surprise me if it exists.

But this isn't a case where you need the GC to do the Dispose() for you.
It's more an example of explicitly controlling the lifetime of resources.

Pete
That might be a reason, but in my mind it would violate the intent of
IDisposable. Maybe I'm nitpicking here, but if I saw someone call
Dispose on an instance and then use that instance afterwards I'd be
seriously confused.
Disposing something tells me that the instance is left in an unknown
state and shouldn't be used.
I'd probably use some other, properly named method to provide this
kind of cleanup.

Kevin Wienhold

Sep 25 '07 #18

KWienhold <he******@trashmail.net> wrote:
That might be a reason, but in my mind it would violate the intent of
IDisposable. Maybe I'm nitpicking here, but if I saw someone call
Dispose on an instance and then use that instance afterwards I'd be
seriously confused.
Disposing something tells me that the instance is left in an unknown
state and shouldn't be used.
Not necessarily. From the docs for IDisposable:

<quote>
Use this method to close or release unmanaged resources such as files,
streams, and handles held by an instance of the class that implements
this interface. This method is, by convention, used for all tasks
associated with freeing resources held by an object, or preparing an
object for reuse.
</quote>

Note the "or preparing an object for reuse". I can't think of any
examples where that actually *is* the use of IDisposable, but it would
be legal according to the docs.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
Sep 25 '07 #19


"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP*********************@msnews.microsoft.com. ..
KWienhold <he******@trashmail.net> wrote:
>That might be a reason, but in my mind it would violate the intent of
IDisposable. Maybe I'm nitpicking here, but if I saw someone call
Dispose on an instance and then use that instance afterwards I'd be
seriously confused.
Disposing something tells me that the instance is left in an unknown
state and shouldn't be used.

Not necessarily. From the docs for IDisposable:

<quote>
Use this method to close or release unmanaged resources such as files,
streams, and handles held by an instance of the class that implements
this interface. This method is, by convention, used for all tasks
associated with freeing resources held by an object, or preparing an
object for reuse.
</quote>

Note the "or preparing an object for reuse". I can't think of any
examples where that actually *is* the use of IDisposable, but it would
be legal according to the docs.
Object pooling perhaps?
Sep 25 '07 #20

Laura T. <LT*****@yahoo.com> wrote:
Not necessarily. From the docs for IDisposable:

<quote>
Use this method to close or release unmanaged resources such as files,
streams, and handles held by an instance of the class that implements
this interface. This method is, by convention, used for all tasks
associated with freeing resources held by an object, or preparing an
object for reuse.
</quote>

Note the "or preparing an object for reuse". I can't think of any
examples where that actually *is* the use of IDisposable, but it would
be legal according to the docs.

Object pooling perhaps?
Yes, that could be a potential use - just one I happen not to have come
across :)

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
Sep 25 '07 #21

Hilton wrote:
"Göran Andersson" <gu***@guffa.com> wrote in message
news:eU**************@TK2MSFTNGP05.phx.gbl...
>The IDisposable interface is intended for the developer to be able to
control the life cycle of an object. That doesn't only mean to kill it in
a predictable way, but sometimes also to keep it alive in a predictable
way.

Just holding a reference to it will keep it alive as it always has.

Hilton
Just holding a reference is not enough. You have to actually use the
reference.

--
Göran Andersson
_____
http://www.guffa.com
Sep 25 '07 #22

On Sep 25, 4:06 am, Peter Duniho <NpOeStPe...@NnOwSlPiAnMk.com> wrote:
Doug Semler wrote:
[...]
Whoa, wait. Finalize should NEVER call Dispose to clean up managed
resources, since the managed resource references may no longer exist!

I don't disagree. But I was just assuming that we were talking about
unmanaged resources here. After all, those are the ones that cause the
most trouble if not disposed properly.

Did I miss something?
Huh... *we* may have known that, but some others lurking in the thread
may have interpreted the statement to mean that it is safe to
implement a finalizer that calls Dispose() without regard to
whether the Finalize() method was doing the calling.

I actually like the C++/CLI compiler's way of "automagically"
implementing the IDisposable pattern if you insert into your classes:

~ClassName()
{
    // Dispose managed objects
    this->!ClassName(); // call finalizer
}

!ClassName()
{
    // Clean up unmanaged stuff
}

The compiler is nice enough to implement IDisposable, Dispose(bool),
proper calling sequence, and just for good measure a
GC.SuppressFinalize in there on the explicit Dispose call <g>

Sep 25 '07 #23

>The GC handles cleaning up of managed resources just fine as far as I
>know, so I don't really understand why one would even use IDisposable
>on a class containing only managed resources.
For any Compact Framework object containing an Image or a Bitmap it's really
useful, as those classes hold native resources that you often want to get rid

Also maybe for controlling the lifetime of something in a using{...}
pattern? Maybe?
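The using pattern Chris mentions is exactly the deterministic-cleanup tool the thread keeps circling around; the compiler expands it into a try/finally that calls Dispose even when an exception is thrown. A small sketch:

```csharp
using System;
using System.Drawing;

class Demo
{
    static void Main()
    {
        // 'using' guarantees Dispose runs when the block exits, releasing
        // the bitmap's native bits immediately -- valuable on a CF device.
        using (Bitmap bmp = new Bitmap(64, 64))
        {
            bmp.SetPixel(0, 0, Color.Red); // draw with bmp...
        }
        // The compiler expands the block to roughly:
        //   Bitmap bmp = new Bitmap(64, 64);
        //   try { /* draw */ }
        //   finally { if (bmp != null) ((IDisposable)bmp).Dispose(); }
    }
}
```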
--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com

Sep 25 '07 #24

On Sep 25, 4:00 pm, "<ctacke/>" <ctacke[at]opennetcf[dot]com> wrote:
The GC handles cleaning up of managed resources just fine as far as I
know, so I don't really understand why one would even use IDisposable
on a class containing only managed resources.

For any Compact Framework object containing an Image or a Bitmap it's really
useful, as those classes hold native resources that you often want to get rid
of as soon as possible on a memory-constrained device.

Also maybe for controlling the lifetime of something in a using{...}
pattern? Maybe?
I would imagine that KWienhold meant that you don't (usually)
implement IDisposable on a class which only references managed
resources, directly or indirectly. If you've got a class which itself
"owns" some other IDisposables, that container class should also
implement IDisposable.

Jon
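Jon's ownership rule in sketch form: a class whose members are all managed still implements IDisposable when one of those members does, simply forwarding the call (the `LogWriter` class here is a hypothetical example, not from the thread):

```csharp
using System;
using System.IO;

// LogWriter "owns" a StreamWriter, so it implements IDisposable and
// forwards disposal -- even though it holds no unmanaged resource of its
// own and therefore needs no finalizer.
public class LogWriter : IDisposable
{
    private readonly StreamWriter _writer;

    public LogWriter(string path)
    {
        _writer = new StreamWriter(path);
    }

    public void Log(string message) => _writer.WriteLine(message);

    public void Dispose() => _writer.Dispose();
}
```

The point is transitivity: ownership of an IDisposable makes the container IDisposable, all the way up the object graph.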

Sep 25 '07 #25

Certainly, but again for any lurkers I want it to be clear that there are
good reasons for a class that contains only managed classes to still
implement IDisposable.

I must say, cross-posting in groups other than just the CF (where I and the
OP typically reside) brings in some fresh perspectives and provides some
great insights. We should do this more often.

--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com
Sep 25 '07 #26

Brian Gideon wrote:
On Sep 24, 4:48 pm, Göran Andersson <gu...@guffa.com> wrote:
>The IDisposable interface is intended for the developer to be able to
control the life cycle of an object. That doesn't only mean to kill it
in a predictable way, but sometimes also to keep it alive in a
predictable way.

That was a very well thought out response.
Thanks. :)

--
Göran Andersson
_____
http://www.guffa.com
Sep 25 '07 #27

On Sep 24, 10:35 pm, "Hilton" <nos...@nospam.com> wrote:
"Göran Andersson" <gu...@guffa.com> wrote in message

news:eU**************@TK2MSFTNGP05.phx.gbl...
The IDisposable interface is intended for the developer to be able to
control the life cycle of an object. That doesn't only mean to kill it in
a predictable way, but sometimes also to keep it alive in a predictable
way.

Just holding a reference to it will keep it alive as it always has.

Hilton
That's not really what he meant by "alive" though.

Sep 25 '07 #28

Hilton wrote:
"<ctacke/>" <ctacke[at]opennetcf[dot]com> wrote in message
news:e%****************@TK2MSFTNGP06.phx.gbl...
>Agreed. Implementing IDisposable is a flag to the consumer, not the GC.
The GC doesn't know or care about IDisposable objects. If it did, then
you would have the issues currently surrounding finalizers to contend with
(and that's the exact reason it's recommended that you implement
IDisposable instead of using a finalizer).

There's no way to "automate" this process without causing pain for those
of us who like performance or adding the requirement of multiple
collection cycles to release resources.

Sure there is. How would you call Dispose? You'd call it on an object -
right? That means that it would not be GC'd (yet) since you are holding a
reference to it, so everything works just fine as it always has.
Actually, holding a reference is not enough to keep an object alive. You
have to actually use the reference for it to count as an active
reference. As soon as the GC can determine that the reference will never
be used again, the object can be garbage collected.

With your suggestion, to keep an object alive you would have to call
GC.KeepAlive(obj) where you would now call obj.Dispose(). It's no less
work for the developer, and the GC would have to work a lot harder.
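Göran's liveness point can be made concrete with GC.KeepAlive: once the JIT sees that a local will never be read again, the object it references is eligible for collection even while the enclosing method is still running (in optimized Release builds; Debug builds typically extend lifetimes to the end of the method). The timer scenario is the classic illustration of this, sketched here:

```csharp
using System;
using System.Threading;

class Program
{
    static void Main()
    {
        var timer = new Timer(_ => Console.WriteLine("tick"), null, 0, 100);

        Thread.Sleep(500); // let some callbacks fire

        // Without this call, an optimizing JIT may treat 'timer' as dead
        // after its last use above, allowing collection (and finalization)
        // while callbacks are still pending. KeepAlive extends the
        // reference's lifetime to this exact point.
        GC.KeepAlive(timer);
        timer.Dispose();
    }
}
```

This is why "just hold a reference" is not quite enough: liveness is about future *use* of the reference, which is what a GC-driven auto-Dispose scheme would have to reason about.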
Now when
this object can be GC'd (no refs etc), then 'my' logic kicks in, disposes
the object and frees the memory ***IF*** it hasn't already been disposed.
It's just a safety net, not a new paradigm.

Your code is not affected, you can continue to optimize, but the GC will
dispose of any objects that need to be disposed, but weren't. Sounds good
to me.
A finalizer is usually used as a safety net in the Disposable pattern,
to at least try to free the unmanaged resources in case someone forgets
to call Dispose. The finalizer is never supposed to be used, however, as
it uses more resources than calling Dispose. In this pattern the Dispose
method calls GC.SuppressFinalize to remove the object from the
finalization list.

If the Dispose method is not called, the object remains in the
finalization list. When the GC is about to collect the object, it has to
put it in the FReachable queue instead, where a separate thread will
eventually execute the finalizer before the object can be collected. As
the object can't be collected right away, it might also have to be
promoted to the next GC generation, which means that the entire object
is moved in memory.

If you want to use the finalizer instead of the Disposable pattern, the
GC would have to become a lot more aggressive when running the
finalizers, starting a lot of threads with rather high priority to ensure
that it doesn't take too long before the finalizers are executed. This
of course means that the performance of an application would be much
less predictable, as all finalizers would have to run right away,
instead of when there is low pressure on the system.

--
Göran Andersson
_____
http://www.guffa.com
Sep 25 '07 #29


"Doug Semler" <do********@gmail.comwrote in message
news:11**********************@d55g2000hsg.googlegr oups.com...
On Sep 25, 4:06 am, Peter Duniho <NpOeStPe...@NnOwSlPiAnMk.comwrote:
>Doug Semler wrote:
[...]
Whoa, wait. Finalize should NEVER call Dispose to clean up managed
resources, since the maaged resource references may no longer exist!

I don't disagree. But I was just assuming that we were talking about
unmanaged resources here. After all, those are the ones that cause the
most trouble if not disposed properly.

Did I miss something?

Huh..*we* may have known that, but some others lurking in the thread
may have interpreted the statement to mean that it is safe to
implement a Finalizer that calls Dispose() without regard to
whether the Finalize() method was doing the calling.

I actually like the C++/CLI's compiler's way of "automagically"
implementing the IDisposable pattern if you insert into your classes:

~ClassName()
{
// Dispose managed objects
this->!ClassName(); // call Finalizer
}

!ClassName()
{
// Clean up unmanaged stuff
}

The compiler is nice enough to implement IDisposable, Dispose(bool),
proper calling sequence, and just for good measure a
GC.SuppressFinalize in there on the explicit Dispose call <g>
And automatically Dispose member objects (if declared without the tracking
handle modifier).
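For readers following in C#, the pattern the C++/CLI compiler generates from that destructor/finalizer pair is roughly equivalent to this sketch (not the exact emitted code):

```csharp
using System;

public class ClassName : IDisposable
{
    // ~ClassName() in C++/CLI becomes the Dispose path:
    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);   // added automatically by the compiler
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposing)
        {
            // body of ~ClassName(): dispose managed objects
        }
        // body of !ClassName(): clean up unmanaged stuff
    }

    // !ClassName() in C++/CLI becomes the finalizer path:
    ~ClassName()
    {
        Dispose(false);
    }
}
```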
Sep 25 '07 #30

P: n/a
Jon,

Your post raises an interesting point which I have often thought about. If
a class implements IDisposable, the developer then has a license to later
add the use of unmanaged resources, and vice versa. For example, one Bitmap
constructor uses managed resources and another uses unmanaged resources. But
ignore that for a minute, and let's say that Microsoft originally shipped
the Bitmap class with just one constructor that used managed resources and
therefore the class did not (need to) implement IDisposable. Then later on,
they decide that the Bitmap class should use unmanaged resources throughout
for performance (even the constructor that only used managed resources) -
they're screwed because existing apps won't know (or be able) to call
Dispose without being recompiled, and all apps that used the new .NET
would have this memory leak. Note: I'm just using a hypothetical Bitmap
class as an example here - let's not go off on a tangent speaking about the
actual Bitmap class. The point is that once a public library class does not
implement IDisposable to free unmanaged resources, it never can (and expect
the right things to happen).

Gee, I've just made a great argument in favor of the OP proposal. :)

Fire away.

Hilton

"Jon Skeet [C# MVP]" <sk***@pobox.comwrote in message
news:11**********************@g4g2000hsf.googlegroups.com...
On Sep 25, 4:00 pm, "<ctacke/>" <ctacke[at]opennetcf[dot]comwrote:
>The GC handles cleaning up of managed resources just fine as far as I
know, so I don't really understand why one would even use IDisposable
on a class containing only managed resources.

For any Compact Framework object containing an Image or a Bitmap it's
really useful, as those classes hold native resources that you often want
to get rid of as soon as possible on a memory-constrained device.

Also maybe for controlling the lifetime of something in a using{...}
pattern? Maybe?

I would imagine that KWienhold meant that you don't (usually)
implement IDisposable on a class which only references managed
resources, directly or indirectly. If you've got a class which itself
"owns" some other IDisposables, that container class should also
implement IDisposable.
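A sketch of that container case (LogSession and the file name are made up for illustration): a class with no unmanaged resources of its own, and therefore no finalizer, that still implements IDisposable to forward disposal to what it owns:

```csharp
using System;
using System.IO;

public class LogSession : IDisposable
{
    // An owned managed resource that is itself IDisposable.
    private readonly StreamWriter writer = new StreamWriter("log.txt");

    public void Write(string line)
    {
        writer.WriteLine(line);
    }

    public void Dispose()
    {
        // No finalizer needed: just forward to the owned object,
        // whose own wrapper handles the unmanaged side.
        writer.Dispose();
    }
}
```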

Jon

Sep 26 '07 #31

P: n/a
On Sep 26, 6:15 am, "Hilton" <nos...@nospam.comwrote:
Your post raises an interesting point which I have often thought about. If
a class implements IDisposable, the developer then has a license to later
add the use of unmanaged resources, and vice versa. For example, one Bitmap
construct uses managed resources and another uses unmanaged resources. But
ignore that for a minute, and let's say that Microsoft originally shipped
the Bitmap class with just one construct that used managed resources and
therefore the class did not (need to) implement IDisposable. Then later on,
they decide that the Bitmap class should use unmanaged resources throughout
for performance (even the construct that only used managed resources) -
they're screwed because exsisting apps won't know (or be able) to call
IDisposable without being recompiled and all apps that used the new .NET
would have this memory leak. Note: I'm just using a hypothetical Bitmap
class as an example here - let's not go off on a tangent speaking about the
actual Bitmap class.
That would be a breaking change to the class, as far as I'm concerned.
They would shy away from that, creating another class instead.

Changing classes to implement IDisposable retrospectively is a major
versioning change, IMO.
The point is that once a public library class does not
implement IDisposable to free unmanaged resources, it never can (and expect
the right things to happen).
Indeed.
Gee, I've just made a great argument in favor of the OP proposal. :)
I don't see how you have. Why does this fact support the proposal?

Jon

Sep 26 '07 #32

P: n/a
On Sep 26, 12:15 am, "Hilton" <nos...@nospam.comwrote:
The point is that once a public library class does not
implement IDisposable to free unmanaged resources, it never can (and expect
the right things to happen).
Nevermind the fact that adding *any* interface after the fact could be
a version-breaking change. That's why the Framework Design Guidelines
book recommends that you choose them wisely right from the get-go.

Sep 26 '07 #33

P: n/a
Jon wrote:
>Gee, I've just made a great argument in favor of the OP proposal. :)

I don't see how you have. Why does this fact support the proposal?
Because when the Bitmap gets GC'd, the GC will do a "if (o 'implements'
IDisposable && !o.Disposed) o.Dispose()", and even though this Bitmap object
won't be Disposed deterministically, it will be Disposed at least when/if an
OutOfMemory exception gets thrown.

Please recall, I'm not saying that my proposal is perfect and that we should
change our code to use if (it requires no code change in the apps). All I'm
saying is that if the GC add the "if" as stated above, it would act as some
level of safety net and catch undisposed objects.

I think we're kidding ourselves to think that you, me, and all the other
.NET engineers in the world will always write code to do the right thing. I
really like "using ()" and use it all the time, but there are some objects
whose lives cannot be bottled up in a few lines. One little bug and whammo,
a LOT of memory potentially gets leaked. I cannot even count the times
people have posted here about being out of memory only to find that they
didn't know to Dispose a Bitmap object. The purist in you will jump up and
down and say "well they should have, they screwed up, and they didn't call
Dispose() - their fault.". Now if only we all lived in a perfect world
where one perfect engineer was solely responsible for one perfect project...
However, multiple engineers work on multiple projects and one missing
Dispose() can cost a company millions. Then when we get to finding the
problem, we're back to finding the problem-causing 'malloc-free' we're all
so fond of. :|
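For reference, the "using ()" form mentioned above compiles down to try/finally, so Dispose runs even when an exception is thrown; a minimal sketch:

```csharp
using System.Drawing;   // assumes a reference to System.Drawing

class Demo
{
    static void Main()
    {
        using (Bitmap bmp = new Bitmap(100, 100))
        {
            // work with bmp here
        }   // bmp.Dispose() runs here, even if an exception was thrown
    }
}
```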

Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.

Hilton
Sep 26 '07 #34

P: n/a
Hilton <no****@nospam.comwrote:
Jon wrote:
Gee, I've just made a great argument in favor of the OP proposal. :)
I don't see how you have. Why does this fact support the proposal?

Because when the Bitmap gets GC'd, the GC will do a "if (o 'implements'
IDisposable && !o.Disposed) o.Dispose()", and even though this Bitmap object
won't be Disposed deterministically, it will be Disposed at least when/if an
OutOfMemory exception gets thrown.
That's already the case, because Bitmap has a finalizer (via Image)
which calls Dispose.
Please recall, I'm not saying that my proposal is perfect and that we should
change our code to use if (it requires no code change in the apps). All I'm
saying is that if the GC add the "if" as stated above, it would act as some
level of safety net and catch undisposed objects.
Assuming that all classes which *directly* hold unmanaged resources
(and there should be very few of those) implement a finalizer
appropriately, I don't think your proposal adds anything.
I think we're kidding ourselves to think that you, me, and all the other
.NET engineers in the world will always write code to do the right thing. I
really like "using ()" and use it all the time, but there are some objects
whose lives cannot be bottled up in a few lines. One little bug and whammo,
a LOT of memory potentially gets leaked.
No, memory doesn't potentially get leaked - the GC handles that.
Unmanaged resources should be handled by the class which directly holds
them, and I'd hope that any developer writing such a class (I can't
remember the last time I did it myself) knows enough to implement a
finalizer.
I cannot even count the times
people have posted here about being out of memory only to find that they
didn't know to Dispose a Bitmap object.
I can't remember seeing that, except for knowing that if you have run
out of Windows handles, GDI+ throws an OutOfMemoryException which gives
the wrong impression. In that situation the GC won't have been called
anyway, so your proposal does no good.
The purist in you will jump up and
down and say "well they should have, they screwed up, and they didn't call
Dispose() - their fault.". Now if only we all lived in a perfect world
where one perfect engineer was solely responsible for one perfect project...
However, multiple engineers work on multiple projects and one missing
Dispose() can cost a company millions. Then when we get to finding the
problem, we're back to finding the problem-causing 'malloc-free' we're all
so fond of. :|

Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.
It already exists in the form of a finalizer.

What do you believe your proposal adds to the current finalization
strategy?

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
Sep 26 '07 #35

P: n/a
So you're suggesting that the model of a collection moves from this (these
are all CF GC, so not generational):

- Walk roots
- Mark all objects with a root
- Sweep all objects not marked and without finalizers
- move all non-marked objects with finalizers to freachable queue and
re-root
- Optionally compact heap
- Optionally shrink heap
- spawn finalization thread

to this:

- Walk roots
- Mark all objects with a root
- Sweep all objects not marked and without finalizers
- Call Dispose on non-marked, disposable items without finalizers
- move all non-marked objects with finalizers to freachable queue and
re-root
- Optionally compact heap
- Optionally shrink heap
- spawn finalization thread

So collection would have to wait for all Dispose calls to complete. Serious
perf impact there that I couldn't accept.

Or this:
- Walk roots
- Mark all objects with a root
- Sweep all objects not marked and without finalizers
- move all non-marked IDisposable objects to some new "dispose queue" and
re-root
- move all non-marked objects with finalizers to freachable queue and
re-root
- Optionally compact heap
- Optionally shrink heap
- spawn Dispose thread
- spawn finalization thread

How do the Dispose and Finalizer thread make sure they don't have contention
or deadlocks when an object is IDisposable and has a Finalizer? Or would
all Dispose calls complete before the finalization thread starts? You
realize that this also means that anything that implements IDisposable will
require 2 collections to free (just like something with a Finalizer)?

With the current GC architecture:

If an object has unmanaged resources, it should have a finalizer that in
turn calls Dispose if necessary. So when the object is finalized, resources
are released. Your solution would not fix anything here as that finalizer
is already going to get called after the first GC following it losing all
roots. All managed resources will be released during the next collection
after that.

If the object has managed resources and no finalizer, the GC will collect
them on the first GC following it losing all roots.

If the object has native resources and no finalizer, it's a
bug/implementation error.

The only possible benefit of your architecture would be to allow both
managed and unmanaged resources to be released in a single GC cycle if the
object had a finalizer, provided that the object's finalizer and Dispose
methods are written properly to take advantage of that. You already can
have that advantage by implementing the finalizer/Dispose properly and just
calling Dispose in your app.

Failure to call Dispose will not cause a memory leak (unless the Disposable
class has a bug in it); it simply shifts the time at which resources are
freed from being app controlled to GC controlled. They still get released in
either case. If the failure to call Dispose in an app causes a company to
lose millions of dollars, I'd posit that said company needs to re-visit how
its architecture could be so unbelievably broken as to allow such a thing.
--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com


"Hilton" <no****@nospam.comwrote in message
news:lw****************@newssvr21.news.prodigy.net...
Jon wrote:
>>Gee, I've just made a great argument in favor of the OP proposal. :)

I don't see how you have. Why does this fact support the proposal?

Because when the Bitmap get's GC'd, the GC will do a "if (o 'implements'
IDisposable && !o.Disposed) o.Dispose ()" and even those this Bitmap
object won't be Disposed deterministically, it will be Disposed at least
when/if an OutOfMemory exception gets thrown.

Please recall, I'm not saying that my proposal is perfect and that we
should change our code to use if (it requires no code change in the apps).
All I'm saying is that if the GC add the "if" as stated above, it would
act as some level of safety net and catch undisposed objects.

I think we're kidding ourselves to think that you, me, and all the other
.NET engineers in the world will always write code to do the right thing.
I really like "using ()" and use it all the time, but there are some
objects whose lives cannot be bottled up in a few lines. One little bug
and whammo, a LOT of memory potentially gets leaked. I cannot even count
the times people have posted here about being out of memory only to find
that they didn't know to Dispose a Bitmap object. The purist in you will
jump up and down and say "well they should have, they screwed up, and they
didn't call Dispose() - their fault.". Now if only we all lived in a
perfect world where one perfect engineer was solely responsible for one
perfect project... However, multiple engineers work on multiple projects
and one missing Dispose() can cost a company millions. Then when we get
to finding the problem, we're back to finding the problem-causing
'malloc-free' we're all so fond of. :|

Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.

Hilton


Sep 26 '07 #36

P: n/a
2. GC runs in a separate thread.

Not true in the case of the CF. GC runs in the context of the thread that
made the allocation that caused GC to occur. All other managed threads in
the AppDomain are suspended during the entire collection cycle (hence my
major concerns about performance of this suggestion).
3. You could quite easily lock the GC thread. A poorly-designed program
could easily lock on a mutex or event, or pop up a dialog, freezing the GC
thread permenantly or semi-permenantly.
Again, see my comment above.
>
In short ... leave it as-is ... IDisposable and finalizers were fairly
well thought-out and have specific reasons for existing and specific (and
different) limitations on them.
Agreed. Disposition is not the GC's job and it shouldn't be looking at it.
--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com

Sep 26 '07 #37

P: n/a
>I cannot even count the times
>people have posted here about being out of memory only to find that they
didn't know to Dispose a Bitmap object.

I can't remember seeing that, except for knowing that if you have run
of out Windows handles, GDI+ throws an OutOfMemoryException which gives
the wrong impression. In that situation the GC won't have been called
anyway, so your proposal does no good.
He's talking about the CF Bitmap, explained in depth here:
http://blogs.msdn.com/scottholden/ar...22/713056.aspx
And here:
http://blog.opennetcf.org/ctacke/Per...021225b74.aspx

And as I stated in the second link, I think it's a bug in the Bitmap (and
I've voiced the concern personally with Scott - not sure if CF 3.5 fixed
it). Altering the GC to fix an implementation bug would be just plain
silly.
--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com

Sep 26 '07 #38

P: n/a
"<ctacke/>" <ctacke[at]opennetcf[dot]comwrote in message
news:Ot**************@TK2MSFTNGP05.phx.gbl...
>2. GC runs in a separate thread.

Not true in the case of the CF. GC runs in the context of the thread that
made the allocation that caused GC to occur. All other managed threads in
the AppDomain are suspended during the entire collection cycle (hence my
major concerns about performance of this suggestion).
Fair enough. Still, since GC runs in a separate thread on the desktop, it's
not unreasonable to think that MS might change this design in the future for
the CF.

Robert
Sep 26 '07 #39

P: n/a
>
Fair enough. Still, since GC runs in a separate thread on the desktop,
it's not unreasonable to think that MS might change this design in the
future for the CF.
It's a very reasonable assumption - in fact I hope the CF GC becomes more
like the desktop as versions progress. Your argument is still quite valid
in that it points out flaws with the proposal.
--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com

Sep 26 '07 #40

P: n/a
<"<ctacke/>" <ctacke[at]opennetcf[dot]com>wrote:
I cannot even count the times
people have posted here about being out of memory only to find that they
didn't know to Dispose a Bitmap object.
I can't remember seeing that, except for knowing that if you have run
of out Windows handles, GDI+ throws an OutOfMemoryException which gives
the wrong impression. In that situation the GC won't have been called
anyway, so your proposal does no good.

He's talking about the CF Bitmap, explained in depth here:
http://blogs.msdn.com/scottholden/ar...22/713056.aspx
And here:
http://blog.opennetcf.org/ctacke/Per...021225b74.aspx

And as I stated in the second link, I think it's a bug in the Bitmap (and
I've voiced the concern personally with Scott - not sure if CF 3.5 fixed
it). Altering the GC to fix an implementation bug would be just plain
silly.
Ah, right. I can't see how the OP's suggestion would help though, as it
sounds like the GC isn't running in this case anyway - it's failing to
allocate memory in the gwes.exe process.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
Sep 26 '07 #41

P: n/a

"Hilton" <no****@nospam.comha scritto nel messaggio
news:lw****************@newssvr21.news.prodigy.net...
Jon wrote:
>>Gee, I've just made a great argument in favor of the OP proposal. :)

I don't see how you have. Why does this fact support the proposal?

Because when the Bitmap gets GC'd, the GC will do a "if (o 'implements'
IDisposable && !o.Disposed) o.Dispose()", and even though this Bitmap
object won't be Disposed deterministically, it will be Disposed at least
when/if an OutOfMemory exception gets thrown.
First:

Are you aware that there is no Disposed property in the IDisposable
interface:

// Summary:
//     Defines a method to release allocated unmanaged resources.
[ComVisible(true)]
public interface IDisposable
{
    // Summary:
    //     Performs application-defined tasks associated with freeing,
    //     releasing, or resetting unmanaged resources.
    void Dispose();
}

It would mean that the runtime should take the step of recording whether
Dispose() was called.. a big overhead, let me say it, for nothing.
The CLR already tracks finalizers; now it should double-track?
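Today that bookkeeping is strictly per-class: if you want a Disposed flag, you maintain it yourself, e.g. (Connection is an illustrative name):

```csharp
using System;

public class Connection : IDisposable
{
    // The class tracks its own disposed state; the runtime does not.
    private bool disposed;

    public void Send(string message)
    {
        if (disposed)
            throw new ObjectDisposedException(GetType().FullName);
        // ... actually send ...
    }

    public void Dispose()
    {
        disposed = true;
    }
}
```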

Second:

What if Dispose() throws an exception? Was the object disposed or
not? Do we need to finalize or not?
Partially disposed objects are a nightmare.
Please recall, I'm not saying that my proposal is perfect and that we
should change our code to use if (it requires no code change in the apps).
All I'm saying is that if the GC add the "if" as stated above, it would
act as some level of safety net and catch undisposed objects.
The GC can't just add the "if", as I said before. To support it, yes, it
would require code changes.
I think we're kidding ourselves to think that you, me, and all the other
.NET engineers in the world will always write code to do the right thing.
I really like "using ()" and use it all the time, but there are some
objects whose lives cannot be bottled up in a few lines. One little bug
and whammo,
Well, that's the life of sw engineers. One little bug (no finalizer when
there should be one) and..
You could blow up a nuclear power plant.. just a little bug.
CLR offers very good protection. Better than any other runtime. But it still
can't protect you from yourself.
a LOT of memory potentially gets leaked. I cannot even count the times
Leaked? Not if you are talking managed resources. Not in any case if you
have a finalizer,
and by saying finalizer I don't mean ~class() { // Gone for lunch };
people have posted here about being out of memory only to find that they
didn't know to Dispose a Bitmap object. The purist in you will jump up
and down and say "well they should have, they screwed up, and they didn't
call Dispose() - their fault.". Now if only we all lived in a perfect
world where one perfect engineer was solely responsible for one perfect
project... However, multiple engineers work on multiple projects and one
missing Dispose() can cost a company millions. Then when we get to
finding the
No, one missed finalizer can cost millions. One missed Dispose() could cost
a few bytes in a short time.
problem, we're back to finding the problem-causing 'malloc-free' we're all
so fond of. :|
Not really. There is no 'free' in C#.
Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.

Hilton
Now, if you change the word Dispose to Finalize, you see that the framework
is already there.
The Dispose() is "user mode" and Finalize() is "kernel mode".. which one you
trust more?
You really need a double safety net?

Sep 26 '07 #42

P: n/a
Hilton wrote:
Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.
Adding another safety net doesn't make any application more robust. It
might only keep the application running a bit longer despite its lack
of robustness.

--
Göran Andersson
_____
http://www.guffa.com
Sep 26 '07 #43

P: n/a
Brian Gideon wrote:
On Sep 26, 12:15 am, "Hilton" <nos...@nospam.comwrote:
>The point is that once a public library class does not
implement IDisposable to free unmanaged resources, it never can (and expect
the right things to happen).

Nevermind the fact that adding *any* interface after the fact could be
a version breaking change. That's why the Framework Design Guidelines
book recommends that you to choose them wisely right from the get go.
And how can you decide today how you might optimize a class in 3 years' time
when Microsoft releases some new (unmanaged) technology that you could use?
In the extreme case, you'd have to implement IDisposable on practically
every class, or just get lucky.

Hilton
Sep 27 '07 #44

P: n/a
Jon Skeet wrote:
<"<ctacke/>" <ctacke[at]opennetcf[dot]com>wrote:
>I cannot even count the times
people have posted here about being out of memory only to find that they
didn't know to Dispose a Bitmap object.

I can't remember seeing that, except for knowing that if you have run
of out Windows handles, GDI+ throws an OutOfMemoryException which gives
the wrong impression. In that situation the GC won't have been called
anyway, so your proposal does no good.

He's talking about the CF Bitmap, explained in depth here:
http://blogs.msdn.com/scottholden/ar...22/713056.aspx
And here:
http://blog.opennetcf.org/ctacke/Per...021225b74.aspx

And as I stated in the second link, I think it's a bug in the Bitmap (and
I've voiced the concern personally with Scott - not sure if CF 3.5 fixed
it). Altering the GC to fix an implementation bug would be just plain
silly.

Ah, right. I can't see how the OP's suggestion would help though, as it
sounds like the GC isn't running in this case anyway - it's failing to
allocate memory in the gwes.exe process.
IIRC, the GC runs when an OutOfMemoryException is about to be thrown, so my
proposal would Dispose the Bitmaps perfectly and the app would never run out
of memory. Again, for the record, I'm not saying the app should expect
this. The engineer should call Dispose, etc etc etc.

BTW: The code is trivial to reproduce as per the link above: just keep
creating Bitmaps and you'll get the OOM exception.

Hilton
Sep 27 '07 #45

P: n/a
And how can you decide today how you might optimize a class in 3 years
time when Microsoft release some new (unmanaged) technology that you could
use? In the extreme case, you'd have to implement IDisposable on
practically every class, or just get lucky.
Or you subclass. That's what OOP is about. Adding/changing an interface is
the same in my book as changing the name of every method in the class along
with the class itself. It's a major breaking change.
--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com

Sep 27 '07 #46

P: n/a
>
IIRC, the GC runs when an OuotOfMemoryException is about to be thrown, so
my proposal would Dispose the Bitmaps perfectly and the app would never
run out of memory. Again, for the record, I'm not saying the app should
expect this. The engineer should call Dispose, etc etc etc.
The GC does run. The problem with a CF Bitmap is that the GC runs, but the
finalizer thread hasn't completed, so you can get a second OOM if you
attempt to allocate again before it's done. Scott and I both covered very
well why it happens and how to get around it without seeing an OOM. The GC
should *not* be trying to fix this issue.
BTW: The code is trivial to reproduce as per the link above just keep
creating Bitmaps and you'll get the OOM exception.
Due to a bug in the Bitmap, not a fault in the GC. You keep trying to fix a
bug with a GC "workaround."
--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com

Sep 27 '07 #47

P: n/a
"Hilton" <no****@nospam.comwrote in message
news:lw****************@newssvr21.news.prodigy.net ...
>
Please recall, I'm not saying that my proposal is perfect and that we
should change our code to use if (it requires no code change in the apps).
All I'm saying is that if the GC add the "if" as stated above, it would
act as some level of safety net and catch undisposed objects.
You are just inventing an automatic finalizer for Dispose.
Have you considered the runtime cost?
>
I think we're kidding ourselves to think that you, me, and all the other
.NET engineers in the world will always write code to do the right thing.
Nope.
I really like "using ()" and use it all the time, but there are some
objects whose lives cannot be bottled up in a few lines.
That is a truism for most code written.
One little bug and whammo, a LOT of memory potentially gets leaked.
You should try coding in Delphi and C++.
I cannot even count the times people have posted here about being out of
memory only to find that they didn't know to Dispose a Bitmap object. The
purist in you will jump up and down and say "well they should have, they
screwed up, and they didn't call Dispose() - their fault.".
No, not purists. Realists. And really, it's their fault! :)
Now if only we all lived in a perfect world where one perfect engineer was
solely responsible for one perfect project...
What would be a wonderful world.
However, multiple engineers work on multiple projects and one missing
Dispose() can cost a company millions.
A bad business decision can also cost a company millions.
Then when we get to finding the problem, we're back to finding the
problem-causing 'malloc-free' we're all so fond of. :|
That is why we use test-driven application development.
Also, remember it was Apollo 11 that landed on the moon, not Apollo 1.
Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.
Because for me it is the analogy of adding active armor to a car,
encouraging people to drive recklessly:
- Hey, if you crash into something, you just bounce off!

- Michael Starberg
Sep 27 '07 #48

This discussion thread is closed

Replies have been disabled for this discussion.