On Wed, 15 Oct 2008 00:12:52 -0700, Peter Morris
<mr*********@spamgmail.com> wrote:
Hi Pete
This morning I changed some code. I previously had a
Queue<IFileProcessor> which I checked in a thread and called Execute()
on each instance as I removed it. I changed this into a simple class
which has a SyncRoot which I can Wait and Pulse, and then some code to
process the queue.
[...]
Much happier now :-)
Cool...glad the info helped. The new version looks good.
One thing I noticed in your code, though: in your desire to move the actual
processing outside of the lock (a good idea), you've introduced an
inefficiency in the synchronization (not so good).
Specifically, in the scenario in which your thread is blocked waiting, if
something is enqueued, the very first thing that the thread does after
being woken up again is to release the lock and then try to acquire it
again. That's inefficient in and of itself, but it gets really bad if
there's another thread also trying to get the lock at the same time. The
thread you just woke up then has to go right back to the waiting state
until the lock gets released again.
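To make that concrete: since the original code was snipped above, this is
only a guess at its shape, but the pattern being described looks roughly
like this (SyncRoot, Queue, and IFileProcessor as in the rest of the
thread; the method name is mine):

private void ProcessQueueOriginal()
{
    while (true)
    {
        IFileProcessor processor = null;

        lock (SyncRoot)                    // acquire the lock...
        {
            if (Queue.Count == 0)
            {
                Monitor.Wait(SyncRoot);    // woken up holding the lock...
            }

            if (Queue.Count > 0)
            {
                processor = Queue.Dequeue();
            }
        }                                  // ...but it's released here

        if (processor != null)
        {
            processor.Execute();           // processing outside the lock (good)
        }

        // loop around and contend for the lock all over again (the
        // inefficiency: nothing useful happened between release and
        // reacquisition except the queue check)
    }
}

The wakeup from Monitor.Wait() and the dequeue happen under the lock, but
the thread then gives the lock up and has to win it back on the very next
iteration just to check the queue again.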
On a single-CPU system this winds up being not _quite_ so bad, because
there's a good chance your thread will get back to the reacquisition of
the lock before it uses up its quantum (in fact, in the code you posted,
it's practically guaranteed). But on a multi-core system, it's entirely
possible that even in that short period of time, some other thread can
grab the lock.
If these tasks don't come up very often, that's probably okay. The
inefficiency won't matter. But in a high-throughput system, lock
contention and context switching can be expensive.
If you think the inefficiency might actually be an issue, there are surely
even higher-performance implementations a true threading expert could come
up with, but at the very least, you might want to consider something more
like this:
private void ProcessQueue()
{
    lock (SyncRoot)
    {
        while (true)
        {
            if (Queue.Count > 0)
            {
                IFileProcessor processor = Queue.Dequeue();

                Monitor.Leave(SyncRoot);
                processor.Execute();
                Monitor.Enter(SyncRoot);

                if (processor.State == FileProcessState.Ready)
                {
                    Enqueue(processor);
                }
            }
            else
            {
                Monitor.Wait(SyncRoot);
            }
        }
    }
}
This allows the thread to retain the lock all the way from being woken up
until it goes to check the queue again.
You may be able to gain even more efficiency by emptying the queue all at
once before releasing the lock, putting each dequeued IFileProcessor
instance into a temporary array. That will tend to minimize the number of
times you actually have to release and then reacquire the lock. (Though,
again...this would only matter in a high-throughput situation). For
example, changing the if() block in the example above to:
if (Queue.Count > 0)
{
    IFileProcessor[] processors = Queue.ToArray();
    List<IFileProcessor> readyList =
        new List<IFileProcessor>(processors.Length);

    Queue.Clear();
    Monitor.Leave(SyncRoot);

    foreach (IFileProcessor processor in processors)
    {
        processor.Execute();
        if (processor.State == FileProcessState.Ready)
        {
            readyList.Add(processor);
        }
    }

    Monitor.Enter(SyncRoot);

    foreach (IFileProcessor processor in readyList)
    {
        Enqueue(processor);
    }
}
That way, the thread grabs as much of the current state as it can before
letting go of the lock. It also queues all the new "ready" things it
knows about all at once.
Finally, if you want to get really picky, at least from the code you
posted it appears that pushing the new "ready" items back through the
queue is wasteful. Especially with the above alternative, you could
easily maintain a local Queue to which all of the public Queue elements
are copied, and then just loop while unlocked until the local Queue is
empty, adding the "ready" items back into the local Queue as you find
them. (Of course, you'd only want to do that if a) you can guarantee that
doing so won't starve new items being added to the public queue during the
processing of the local queue, or b) that starving those items won't
matter...it would essentially boost the priority of the items that remain
ready after processing above any newly added items that haven't been
processed yet).
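Sketched out, that local-queue variation might look something like this
(again, just an illustration: the localQueue name and the overall
structure are mine, not from the posted code):

private void ProcessQueueLocal()
{
    Queue<IFileProcessor> localQueue = new Queue<IFileProcessor>();

    lock (SyncRoot)
    {
        while (true)
        {
            if (Queue.Count > 0)
            {
                // Grab everything currently in the shared queue...
                while (Queue.Count > 0)
                {
                    localQueue.Enqueue(Queue.Dequeue());
                }

                Monitor.Leave(SyncRoot);

                // ...and drain the local queue entirely while unlocked.
                // Items that come back "ready" go to the back of the
                // local queue, so they're handled ahead of anything
                // newly added to the shared queue in the meantime --
                // which is exactly the starvation caveat above.
                while (localQueue.Count > 0)
                {
                    IFileProcessor processor = localQueue.Dequeue();

                    processor.Execute();
                    if (processor.State == FileProcessState.Ready)
                    {
                        localQueue.Enqueue(processor);
                    }
                }

                Monitor.Enter(SyncRoot);
            }
            else
            {
                Monitor.Wait(SyncRoot);
            }
        }
    }
}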
Thanks for sharing...it's these practical "code in the wild" examples that
I think really help illustrate the different approaches available in C#
for specific tasks. There's nothing quite like a real-world case study.
:)
Pete