
RFC: Exceptions and control flow

Exceptions must not be used to control program flow. I intend to show that
this statement is flawed.

In some instances, exceptions may be used to control program flow in ways
that can lead to improved code readability and performance.

Consider an application that must eliminate duplicates in a list.

//SETUP CODE: initialize a list and provide a duplicate
using System.Collections;
ArrayList arr = new ArrayList(10);
for(int i = 0; i < 10;i++)
arr.Add(i.ToString());

//dupes
arr.Add(1.ToString());
//Exception to control program flow starts here
Hashtable ht = new Hashtable();
foreach(string str in new ArrayList(arr))
{
try
{
ht.Add(str, str);
}
catch(ArgumentException)
{
arr.Remove(str);
}
}
The algorithm essentially uses the unique-key constraint of the hashtable to
enforce uniqueness. Logically, the program flow is controlled by the
presence of exceptions.
This approach works well when there are few to moderate duplicates in
the list. The logic is also clear and concise, and it is efficient compared
with other contemporary techniques for removing duplicates from a list,
which typically involve duplicating the list and iterating over the copy.

Conclusion: It is not true that exceptions must not be used to control
program flow.

comments/queries/takers welcome.

--
Regards,
Alvin Bruney [Microsoft MVP ASP.NET]

[Shameless Author plug]
The Microsoft Office Web Components Black Book with .NET
Now Available @ http://www.lulu.com/owc
----------------------------------------------------------

Nov 16 '05 #1
9 Replies


Alvin... I have violated this rule when the code worked, was efficient,
and was easy to read. However, I believe the argument not to use exceptions
to control program flow is valid in certain inner loops that result in a
lot of stack unwinding. I suspect that you could also use Hashtable.Contains
to write readable code.
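
Something like this, perhaps (an untested sketch; Hashtable.Contains checks
keys, the same test as ContainsKey):

```csharp
using System.Collections;

ArrayList arr = new ArrayList(10);
for (int i = 0; i < 10; i++)
    arr.Add(i.ToString());
arr.Add(1.ToString()); // dupe

Hashtable ht = new Hashtable();
foreach (string str in new ArrayList(arr))
{
    if (ht.Contains(str))      // test instead of catching ArgumentException
        arr.Remove(str);
    else
        ht.Add(str, str);
}
```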

Regards,
Jeff
Conclusion: It is not true that exceptions must not be used to control
program flow.

Nov 16 '05 #2

Well, everything I have seen from MS says that the performance hit in
raising an error is greater than the performance hit for testing for the
error condition so I thought that testing your example would show the same
thing. It did not always hold true:

In the code I list below, I looped Alvin's code 1000 times using the
try/catch and 1000 times testing for the dupe before adding to the
HashTable. In a couple dozen tests, the try/catch always took very close to
2 seconds to complete the loop while testing for the dupe took very close to
10 milliseconds... In that case, the performance penalty of the exception
was very big.

Then I moved the addition of the dupe outside of the 1,000-iteration loop -
that way the error condition only occurs once out of the 1,000 times rather
than every time. In this case, the try/catch took about 5 milliseconds and
testing for the dupe took the same 10 milliseconds. Using the try/catch was
faster.

Granted, these tests are not exactly scientific but the differences were so
significant and repeated relatively consistently over dozens of times in a
short time period, I believe they're pretty representative of the real
world, as the real world exists on my PC anyway.

My conclusion:

1. If the error condition is very unlikely or is going to happen only a
very tiny percentage (0.1% in my test) of the time, the try/catch can be
faster than testing for a condition every time, even when the likelihood of
that condition existing is very low.

2. In all other cases, where the error condition is even 1% likelihood or
greater, then the exception has a significant performance hit.

DalePres
MCAD, MCDBA, MCSE


//SETUP CODE: initialize a list and provide a duplicate
ArrayList arr = new ArrayList(10);
for (int i = 0; i < 10; i++)
    arr.Add(i.ToString());

DateTime start;
DateTime end;

start = DateTime.Now;
Console.WriteLine(start.ToString("HH:mm:ss.fff"));
for (int x = 0; x < 1000; x++)
{
    //dupes
    arr.Add(1.ToString());

    //Exception to control program flow starts here
    Hashtable ht = new Hashtable();
    foreach (string str in new ArrayList(arr))
    {
        try
        {
            ht.Add(str, str);
        }
        catch (ArgumentException)
        {
            arr.Remove(str);
        }
    }
}
end = DateTime.Now;
Console.WriteLine(end.ToString("HH:mm:ss.fff"));

start = DateTime.Now;
Console.WriteLine(start.ToString("HH:mm:ss.fff"));
for (int x = 0; x < 1000; x++)
{
    //dupes
    arr.Add(1.ToString());

    //Test for the dupe instead of catching the exception
    Hashtable ht = new Hashtable();
    foreach (string str in new ArrayList(arr))
    {
        if (ht[str] == null)
            ht.Add(str, str);
        else
            arr.Remove(str);
    }
}
end = DateTime.Now;
Console.WriteLine(end.ToString("HH:mm:ss.fff"));

Console.ReadLine();


"Alvin Bruney [MVP]" <vapor at steaming post office> wrote in message
news:Oo***************@TK2MSFTNGP09.phx.gbl...
Exceptions must not be used to control program flow. I intend to show that
this statement is flawed.

In some instances, exceptions may be used to control program flow in ways
that can lead to improved code readability and performance.

Nov 16 '05 #3

<"Alvin Bruney [MVP]" <vapor at steaming post office>> wrote:
Exceptions must not be used to control program flow. I intend to show that
this statement is flawed.

In some instances, exceptions may be used to control program flow in ways
that can lead to improved code readability and performance.

Consider an application that must eliminate duplicates in a list.

//SETUP CODE: initialize a list and provide a duplicate
using System.Collections;
ArrayList arr = new ArrayList(10);
for(int i = 0; i < 10;i++)
arr.Add(i.ToString());

//dupes
arr.Add(1.ToString());
//Exception to control program flow starts here
Hashtable ht = new Hashtable();
foreach(string str in new ArrayList(arr))
{
try
{
ht.Add(str, str);
}
catch(ArgumentException)
{
arr.Remove(str);
}
}
How is that better than:

foreach (string str in new ArrayList(arr))
{
// Have we seen this before?
if (ht.ContainsKey(str))
{
arr.Remove(str);
}
else
{
ht[str] = str;
}
}

<snip>
Conclusion: It is not true that exceptions must not be used to control
program flow.


I agree with your conclusion, but I don't think your example is a
particularly good one. The places where I go against this rule (and
always feel bad doing it, to be honest) is where I basically need to
jump a couple of levels up the stack. For instance, testing lots of
different things:

try
{
TestFoo(1, 2, 3);
TestFoo(2, 3, 4);
TestBar();
TestFoo(5, 6, 7);
return true;
}
catch
{
return false;
}

where each of the Test methods throws an exception if the test fails
feels a lot less fragile (and clearer) than:

if (!TestFoo(1, 2, 3))
{
return false;
}
if (!TestFoo(2, 3, 4))
{
return false;
}
if (!TestBar())
{
return false;
}
if (!TestFoo(5, 6, 7))
{
return false;
}
return true;

(Admittedly in this case it could be reasonably reduced to a compound
"if", but in real life code the test lines are often more complicated,
leading to a huge and unreadable test when combined.)

This is only appropriate at all when the test methods are *solely*
there for the above purpose though - and that should be very clearly
documented.
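
To make the pattern concrete, each Test method would be a simple guard that
throws on failure. The name and the condition below are hypothetical, purely
for illustration:

```csharp
// Hypothetical test method for the pattern above: it throws on
// failure rather than returning false, so a single catch several
// frames up can handle any failed test.
static void TestFoo(int a, int b, int c)
{
    if (a + b > c) // stand-in for the real validation
        throw new InvalidOperationException(
            "TestFoo failed for (" + a + ", " + b + ", " + c + ")");
}
```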

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 16 '05 #4

DalePres <don-t-spa-m-me@lea-ve-me-a-lone--.com> wrote:
Well, everything I have seen from MS says that the performance hit in
raising an error is greater than the performance hit for testing for the
error condition so I thought that testing your example would show the same
thing. It did not always hold true:

In the code I list below, I looped Alvin's code 1000 times using the
try/catch and 1000 times testing for the dupe before adding to the
HashTable. In a couple dozen tests, the try/catch always took very close to
2 seconds to complete the loop while testing for the dupe took very close to
10 milliseconds... In that case, the performance penalty of the exception
was very big.
Were you running under the debugger by any chance? When running under
the debugger, the first exception thrown takes a couple of seconds, and
the rest are reasonably quick. This one-time performance penalty
doesn't occur under release.

On my laptop I can loop the code 100,000 times and still end up
finishing in about 3 seconds with exceptions. This is admittedly nearly
ten times slower than without exceptions, but the difference is much
closer. And don't forget, very few places in code end up being
bottlenecks.
Granted, these tests are not exactly scientific but the differences were so
significant and repeated relatively consistently over dozens of times in a
short time period, I believe they're pretty representative of the real
world, as the real world exists on my PC anyway.


If I'm right and you were running under the debugger, they aren't
representative of real world applications, which don't run under the
debugger.
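
As an aside, DateTime.Now typically has a resolution of only 10-15 ms, so
for measurements this small a higher-resolution timer is worth using. A
sketch, assuming .NET 2.0's System.Diagnostics.Stopwatch is available:

```csharp
using System;
using System.Diagnostics;

const int iterations = 100000;

Stopwatch sw = Stopwatch.StartNew();
for (int x = 0; x < iterations; x++)
{
    try { throw new ArgumentException(); }
    catch (ArgumentException) { /* swallow: we are timing throw/catch cost */ }
}
sw.Stop();

Console.WriteLine("{0} ms total, {1:F3} us per exception",
    sw.ElapsedMilliseconds,
    sw.Elapsed.TotalMilliseconds * 1000.0 / iterations);
```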

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 16 '05 #5

I think the performance numbers still make testing for the condition faster
than raising an exception, even when you're talking about going back several
levels in code.

The primary consideration I use when I do consider using exceptions to
control program flow is cost to the client. If the programming cost of
developing a robust exception test is greater than the cost of an often
insignificant degradation in user-interface performance, then use an
exception to control expected program flow.

Of course, I don't have to say this to all you MVPs, but I will say it for
any new C# programmers reading the thread: this whole question is about
handling known or expected exceptions. You still want to use
try/catch/finally blocks liberally in your code where there is a chance
of an unexpected exception. Letting an unhandled exception get to your
users is always a bigger hit than any of the other options.

IMHO
DalePres
MCAD, MCDBA, MCSE
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<snip>

Nov 16 '05 #6

I did the tests below starting the app within the IDE, but using Ctrl-F5 to
start without debugging. You are right, of course, that debugging causes a
significant hit on the first exception, but I don't know whether that
applied to the way I was running.

Just to be sure though, I did run the exe from outside of the environment
and got similar results to yours, 70 ms with the exception, 10ms testing for
the duplicate.

So maybe instead of a 1% crossover point in the performance equation, it is
a 10% crossover point.

DalePres
Nov 16 '05 #7

DalePres <don-t-spa-m-me@lea-ve-me-a-lone--.com> wrote:
I think that the performance issues still make exception testing faster than
exception raising even when you're talking about going back several levels
in code.
Using a test will usually be faster than raising an exception, but
what's important is how much faster in the real world - how much faster
per operation, and how frequently the operation occurs. Very little
code is actually performance critical, and readability should be the
most important test.
The primary consideration I use if I do consider using exceptions to control
program flow is cost to the client. If the programming cost of developing a
robust exception test is greater than the cost of a often-times
insignificant overall user interface performance degradation, then use an
exception to control expected program flow.

Of course, I don't have to say this to all you MVP's but I will say it for
any new C# programmers reading the thread: This whole question is about
handling known or expected exceptions. You still want to use
try/catch/finally loops liberally in your code where there exists a chance
of an unexpected exception. Letting an unhandled exception get to your
users is always a bigger performance hit than all the other options.


That doesn't mean using lots of try/catch/finally blocks though - just
having a top-level one is often good enough.
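
A sketch of what that single top-level handler might look like (RunApplication
and the logging details here are hypothetical placeholders):

```csharp
using System;

class Program
{
    static void Main()
    {
        try
        {
            RunApplication(); // hypothetical application entry point
        }
        catch (Exception ex)
        {
            // One top-level handler: log and fail gracefully, instead of
            // scattering try/catch blocks through every method.
            Console.Error.WriteLine("Unexpected error: " + ex);
            Environment.Exit(1);
        }
    }

    static void RunApplication()
    {
        // ... real application work goes here ...
    }
}
```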

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 16 '05 #8

Initially, I thought your case was more expensive because you always do a
lookup even when it is not needed. Hashtables are optimized for retrievals,
but there is still the overhead of the actual lookup.
However, the actual timed results do show that you have faster code.
grrrrrrrrrrrrrrrrrrrr!!!
Release mode results

yours
150.216
130.1872
150.216

mine
170.2448
160.2334
150.216

test code:

ArrayList arr = new ArrayList(10);
for (int i = 0; i < 100000; i++)
    arr.Add(i.ToString());
arr.Add(1.ToString());
arr.Add(2.ToString());
arr.Add(3.ToString());

DateTime start = DateTime.Now;

Hashtable ht = new Hashtable();
foreach (string str in new ArrayList(arr))
{
    try
    {
        ht.Add(str, str);
    }
    catch (ArgumentException)
    {
        arr.Remove(str);
    }

    // Have we seen this before?
    //if (ht.ContainsKey(str))
    //{
    //    arr.Remove(str);
    //}
    //else
    //{
    //    ht[str] = str;
    //}
}

DateTime stop = DateTime.Now;
TimeSpan diff = stop.Subtract(start);
Console.WriteLine(diff.TotalMilliseconds);
--
Regards,
Alvin Bruney [Microsoft MVP ASP.NET]
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<snip>

Nov 16 '05 #9

Conclusion: It is not true that exceptions must not be used to control
program flow.


Related to the current framework, your tests hold true.

However, in my opinion that does not mean that Microsoft, or other
developers of a framework, should keep the same approach that Microsoft
takes now, or that Microsoft has to do the same in the future.

To me, that means relying on this is the same as relying on a bug.

Although I have the impression that Microsoft developers use this technique
as well; for instance, with a non-existing relation in databinding, if that
is a valid situation (databinding to a non-existing child). However, they
can change that behavior directly when the Microsoft Framework changes, so
then we would have to think about it in another way.

Just my thought,

Cor
Nov 16 '05 #10
