Hi folks.
So I'm in the middle of porting a large (previously COM-based) imaging
library to .Net. The clients of this library are VB programmers within the
company I work for. Their code will be getting ported to .Net at some point
in the near future too.
In the .Net world, many of my objects are forced to implement the
IDisposable+Finalizer pattern because they hold onto unmanaged resources or
represent resources where deterministic release is an absolute requirement.
For example, image data lives in memory blocks allocated with VirtualAlloc.
Image files are read and written using Stream-derived classes.
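Just to make it concrete, here is roughly the pattern each of my resource-holding
classes ends up following. The class name and the P/Invoke plumbing below are
made up for illustration -- the real classes are more involved -- but the shape
is the standard Dispose+Finalizer pattern:

    using System;
    using System.Runtime.InteropServices;

    public class ImageBuffer : IDisposable
    {
        // P/Invoke declarations for the Win32 allocator
        [DllImport("kernel32.dll", SetLastError = true)]
        private static extern IntPtr VirtualAlloc(IntPtr lpAddress, UIntPtr dwSize,
            uint flAllocationType, uint flProtect);

        [DllImport("kernel32.dll", SetLastError = true)]
        private static extern bool VirtualFree(IntPtr lpAddress, UIntPtr dwSize,
            uint dwFreeType);

        private const uint MEM_COMMIT     = 0x1000;
        private const uint MEM_RESERVE    = 0x2000;
        private const uint MEM_RELEASE    = 0x8000;
        private const uint PAGE_READWRITE = 0x04;

        private IntPtr buffer;   // the unmanaged image block
        private bool disposed;

        public ImageBuffer(int byteCount)
        {
            buffer = VirtualAlloc(IntPtr.Zero, new UIntPtr((uint)byteCount),
                MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
            if (buffer == IntPtr.Zero)
                throw new OutOfMemoryException("VirtualAlloc failed");
        }

        public void Dispose()
        {
            Dispose(true);
            GC.SuppressFinalize(this);   // no need to finalize once disposed
        }

        protected virtual void Dispose(bool disposing)
        {
            if (!disposed)
            {
                // only the unmanaged block to release here
                if (buffer != IntPtr.Zero)
                {
                    VirtualFree(buffer, UIntPtr.Zero, MEM_RELEASE);
                    buffer = IntPtr.Zero;
                }
                disposed = true;
            }
        }

        ~ImageBuffer()
        {
            // safety net -- only runs if the client never called Dispose
            Dispose(false);
        }
    }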
I'm just kinda pondering how the need for IDisposable is going to impact the
other guys in the company. This basically adds a programming requirement
that did not exist before in the environment that these guys are used to
(VB6+COM). For example, if the guy building VB-based image objects on top
of my classes forgets to call Dispose on my image objects at the right
times, several to several dozen MB of memory allocated with VirtualAlloc
will remain committed, and, since that memory is invisible to the GC, it
won't even help trigger appropriately timed collections. Yes, that memory
will get cleaned up when
my object is finalized, but for interactive operations where many images
come and go in a short period of time, this is a killer.
What I am thinking is that it is going to be a continual battle to get
these guys into the habit of calling Dispose and, hopefully, of using
try/finally (or whatever the equivalent is in VB.Net). Also, they will
need to implement
IDisposable on any of their own classes that hold refs to my objects.
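In other words, every bit of client code that touches one of my objects is
supposed to look something like the sketch below (using the hypothetical
ImageBuffer from above; method and sizes invented for the example). In C#
the using statement sugars it up, but as far as I can tell VB.Net in VS 2003
has no equivalent, so for them it's Try/Finally everywhere:

    void ProcessOneImage()
    {
        // explicit try/finally -- the habit the VB guys will have to learn
        ImageBuffer img = new ImageBuffer(4 * 1024 * 1024);
        try
        {
            // ... decode, filter, save, whatever ...
        }
        finally
        {
            img.Dispose();
        }

        // or, in C# only, the equivalent shorthand:
        using (ImageBuffer img2 = new ImageBuffer(4 * 1024 * 1024))
        {
            // ... work with img2 ...
        }   // Dispose is called here even if an exception is thrown
    }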
I think this is a disaster waiting to happen. VB guys are in general just
not used to this sort of programming model (please no flames, when I say "VB
programmers" I mean "the VB programmers at my company" -- who mostly have no
clue about system-level programming issues). They will resist dealing with
IDisposable and the right things just won't happen when their code is first
ported. "IDisposable-correctness" is a bit like "const-correctness" in
C++ -- since it tends to permeate and percolate upwards through the design,
it is a real pain to retrofit.
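That's what I mean by it percolating upwards: as soon as one of their classes
owns one of my objects, that class has to grow an IDisposable implementation
of its own, and so does whatever owns it, all the way up. Something like this
(again, names invented, shown in C# for brevity):

    // A client-side wrapper that owns one of my image objects.
    // Because it owns an IDisposable, it must become IDisposable itself.
    public class CustomerPhoto : IDisposable
    {
        private ImageBuffer buffer = new ImageBuffer(4 * 1024 * 1024);

        public void Dispose()
        {
            if (buffer != null)
            {
                buffer.Dispose();   // pass the call down to the resource holder
                buffer = null;
            }
        }
    }

    // ...and anything that owns a CustomerPhoto has to do the same, and so on.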
I am wondering if there is anything I can do to help with this problem. I
mean, it seems to me that the compiler and the CLR know when a reference is
being "released" -- why isn't there a mechanism whereby an object can
receive notifications that references have been added or removed to itself?
If I had that, I could consider implementing some kind of auto-Dispose. Sort
of like an "operator Referenced" and "operator Released" -- maybe this is
the same thing as trying to overload operator= (which can't be done in C#)?
// Hypothetical -- op_Referenced/op_Released don't actually exist; this is
// the kind of hook I wish the compiler/CLR provided.
class MyClass : IDisposable
{
    private int RefCount;

    void op_Referenced()
    {
        RefCount++;
    }

    void op_Released()
    {
        if (--RefCount == 0) Dispose();
    }

    public void Dispose()
    {
        // release the unmanaged resources here
    }
}

void somecode()
{
    MyClass p = new MyClass(); // compiler emits a call to p.op_Referenced()
    p = null;                  // compiler emits a call to p.op_Released()
}
Am I missing anything existing in VS .Net 2003 that might help me here?
Maybe VB.Net does this already and this isn't the issue I think it is gonna
be? Am I just nuts?