Bytes IT Community

performance vs. readability/maintainability

In another online group in which I participate, we were discussing a
particular piece of code that had a pretty high risk for breaking in the
future (because it depended on something not changing that was outside the
developer's control) but was slightly more performant. One participant
posted:

"I tend to take the performance track also, adding readability if the impact
isn't too great. There is also an odd reality that takes place even in the
software field however. Not sure I can explain it too well but simply said,
if you write code to handle the effects of changing the ordinal position of
a field, some other error will surface anyway."

which I found to be ridiculous to the point of being dangerous.

I know this is somewhat one of those opinion issues, but I just wondered
where everyone fell on the continuum.
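For concreteness, here is a minimal C# sketch of the kind of fragility being discussed (the table and column names are hypothetical; an in-memory DataTable stands in for a real query result):

```csharp
using System;
using System.Data;

class OrdinalVsName
{
    // Fragile: silently reads the wrong column if the field order ever changes.
    public static string ReadByOrdinal(IDataRecord row) => row.GetString(1);

    // Robust: survives reordering; the name lookup costs only a hash probe per row.
    public static string ReadByName(IDataRecord row) => (string)row["Name"];

    static void Main()
    {
        // Hypothetical schema standing in for a real database query.
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(1, "widget");

        using (IDataReader reader = table.CreateDataReader())
        {
            reader.Read();
            Console.WriteLine(ReadByOrdinal(reader)); // works only while "Name" stays at ordinal 1
            Console.WriteLine(ReadByName(reader));    // keeps working if columns are reordered
        }
    }
}
```

If the per-row name lookup ever did show up in a profile, calling `reader.GetOrdinal("Name")` once before the loop would give ordinal speed without the fragility.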
Nov 15 '05 #1
10 Replies


Personally, I prefer to write code that's readable and maintainable first
(modulo obvious perf wins like StringBuilder and DataReader in some
scenarios). After the code works, use a profiler to look for performance
bottlenecks. Having said that, I've often built prototypes for the purpose
of profiling perf. It's my experience that most people who obsess over
perf without profiling optimize in the wrong places and end up with
butt-ugly, non-maintainable code.
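As a sketch of the kind of "obvious perf win" being referred to, assuming the usual build-a-string-in-a-loop case:

```csharp
using System;
using System.Text;

class ConcatDemo
{
    // O(n^2): each += allocates a new string and copies everything built so far.
    public static string WithConcat(int n)
    {
        string s = "";
        for (int i = 0; i < n; i++) s += "x";
        return s;
    }

    // O(n): StringBuilder appends into a growable buffer, with one copy at the end.
    public static string WithBuilder(int n)
    {
        var sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.Append('x');
        return sb.ToString();
    }

    static void Main()
    {
        // Same result either way; the difference only matters when n is large
        // or the loop runs often -- which is why a profiler should decide.
        Console.WriteLine(WithConcat(5) == WithBuilder(5)); // True
    }
}
```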

--
Mickey Williams
Author, "Microsoft Visual C# .NET Core Reference", MS Press
www.servergeek.com
"Daniel Billingsley" <db**********@NO.durcon.SPAAMM.com> wrote in message
news:e5**************@TK2MSFTNGP11.phx.gbl...
<snip>

Nov 15 '05 #2

For what it's worth, I would almost always side with reliability. I hate
fixing code, especially when it's due to a bad decision on my part. I
suppose that if I were in a situation where some code absolutely had to run
faster (show-stopper), and the modification was the only way to get over the
line, and the reliability problem was not random but a maintenance issue
(only a bug if the table structure changes), and the reliability problem
would not cause more damage than running too slow...

Then maybe I would bring it up in a meeting, just so that my coworkers could
talk me out of it.

Regards

"Daniel Billingsley" <db**********@NO.durcon.SPAAMM.com> wrote in message
news:e5**************@TK2MSFTNGP11.phx.gbl...
<snip>

Nov 15 '05 #3

Daniel Billingsley <db**********@NO.durcon.SPAAMM.com> wrote:

<snip>
I know this is somewhat one of those opinion issues, but I just wondered
where everyone fell on the continuum.


As I suspect many readers know already, I would code for readability
first, performance later, almost always.

Having said that, I will of course use a StringBuilder to create a
string over the course of a loop etc.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 15 '05 #4

I agree with you. Sacrificing readability and reliability for a negligible
gain in performance _is_ ludicrous.

I am only slightly more flexible on the readability part. I will confess
that there have been a few times in my career where I had to sacrifice some
readability in order to eke out some extra performance.

In all cases, it was a situation where a perf deficit was magnified because
the code was either in a loop or was called extremely often, and where the
perf difference was vital to the acceptability of the application (i.e.,
where the perf difference was meaningful).

Beyond those exceptional instances, it always seems to pay off to go with
maintainability and reliability. You might get some quick notoriety if the
application runs a little faster, but in the long run, customers always seem
to appreciate the app that never crashes and never convolutes or loses their
data over the one that runs really fast when it works at all.

"Daniel Billingsley" <db**********@NO.durcon.SPAAMM.com> wrote in message
news:ew**************@TK2MSFTNGP11.phx.gbl...
<snip>

Nov 15 '05 #5

I'll follow the trend in the other replies: readability and maintainability
have the highest priority.

Like Mickey, I will only sacrifice readability for performance if I have
identified the performance bottleneck with a profiler and if there is no
other way to improve it.

I don't like the idea of introducing low-level optimization hacks "a
priori". I cannot find one example where this approach really worked and
brought benefits, but lots of examples where it did not work at all. On the
other hand, spending time on choosing the right data structures (so that
everything that you will access over and over will be properly indexed in
your object graphs) and the right algorithms really pays off.

Also, I am very careful in the way I use the language constructs, trying to
make the code as easy to read as possible, and as robust as possible, and
usually, I can find elegant solutions that don't conflict with the
performance goal. They may not give the absolute best performance
(otherwise, I would be writing in C, or even worse, assembly), but they
usually give a very good balance between clarity and performance.
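To illustrate the "properly indexed" point with a small sketch (the item names here are hypothetical):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class IndexedLookup
{
    // Linear scan: O(n) per lookup; fine for small n, a bottleneck in a hot loop.
    public static string FindLinear(List<(int Id, string Name)> items, int id) =>
        items.First(x => x.Id == id).Name;

    // Indexed: O(1) per lookup after a one-time O(n) dictionary build.
    public static string FindIndexed(Dictionary<int, string> index, int id) =>
        index[id];

    static void Main()
    {
        var items = new List<(int Id, string Name)> { (1, "bolt"), (2, "washer") };
        var index = items.ToDictionary(x => x.Id, x => x.Name);

        Console.WriteLine(FindLinear(items, 2));  // washer
        Console.WriteLine(FindIndexed(index, 2)); // washer
    }
}
```

Neither version is harder to read than the other; the indexed one is simply the right structure for repeated lookups, which is exactly the kind of up-front design choice that pays off without any low-level hackery.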

Bruno.

"Daniel Billingsley" <db**********@NO.durcon.SPAAMM.com> wrote in message
news:e5**************@TK2MSFTNGP11.phx.gbl...
<snip>

Nov 15 '05 #6

Yes. And I would certainly not hire the guy who wrote the original post and
believes in writing code that depends on the ordinal position of fields. No
way!

Bruno.

"J.Marsch" <je****@ctcdeveloper.com> wrote in message
news:et**************@TK2MSFTNGP09.phx.gbl...
<snip>

Nov 15 '05 #7

Daniel Billingsley <db**********@NO.durcon.SPAAMM.com> wrote:
Yeah the quote I posted actually makes two points

1) readability vs. performance
2) reliability vs. performance

with the author saying he chooses performance in both cases.

Good grief.

My basic position tends to be that in most cases the performance differences
we're talking about are trivial (the tests used to demonstrate them often
use 10,000,000 iterations while our code is doing exactly one per hour, for
example). I can't see any wisdom whatsoever in obsessing over such trivial
performance gains and sacrificing what may well be hours of future work on
the code.

Agreed.

As far as reliability, I think the position of writing code that is
pretty likely to break some day because it performs slightly better is just
ludicrous. In this particular case, it was in the context of database
access, which I think will far overshadow any milliseconds saved here or
there by writing theoretically perfect code.


Yup. This is the problem I have with the oft-quoted performance article
(http://tinyurl.com/hxo2) which includes the following:

<quote>
Don't do it. Instead, stand up and pledge along with me:

"I promise I will not ship slow code. Speed is a feature I care
about. Every day I will pay attention to the performance of my code. I
will regularly and methodically measure its speed and size. I will
learn, build, or buy the tools I need to do this. It's my
responsibility."

(Really.) So did you promise? Good for you.

So how do you write the fastest, tightest code day in and day out? It
is a matter of consciously choosing the frugal way in preference to the
extravagant, bloated way, again and again, and a matter of thinking
through the consequences. Any given page of code captures dozens of
such small decisions.
</quote>

I don't *want* to write the fastest, tightest code. I want to write
reliable, maintainable code, which performs *well enough*.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 15 '05 #8

No kidding! That's just bad form.

"Bruno Jouhier [MVP]" <bj******@club-internet.fr> wrote in message
news:uk**************@tk2msftngp13.phx.gbl...
Yes. And I would certainly not hire the guy who wrote the original post and believes in writing code that depends on the ordinal position of fields. No way!

Bruno.

<snip>

Nov 15 '05 #9

I wholeheartedly agree with your comments about using the right algorithm
and the right data structure. I find that, a lot of the time, if some bit of
code isn't performing up to expectations, it's because the algorithm or
data structures are not well matched to the task. In such a situation, you
can often come out with a solution that is _more_ readable, more
maintainable, and more performant!

I side with Einstein. He said that when you find a true answer to a mystery
of the universe, that solution will be simple and elegant.

"Bruno Jouhier [MVP]" <bj******@club-internet.fr> wrote in message
news:eu**************@TK2MSFTNGP11.phx.gbl...
<snip>

Nov 15 '05 #10

Yeah, the quote I posted actually makes two points:

1) readability vs. performance
2) reliability vs. performance

with the author saying he chooses performance in both cases.

My basic position tends to be that in most cases the performance differences
we're talking about are trivial (the tests used to demonstrate them often
use 10,000,000 iterations while our code is doing exactly one per hour, for
example). I can't see any wisdom whatsoever in obsessing over such trivial
performance gains and sacrificing what may well be hours of future work on
the code.

As far as reliability, I think the position of writing code that is
pretty likely to break some day because it performs slightly better is just
ludicrous. In this particular case, it was in the context of database
access, which I think will far overshadow any milliseconds saved here or
there by writing theoretically perfect code.

"J.Marsch" <je****@ctcdeveloper.com> wrote in message
news:ek*************@tk2msftngp13.phx.gbl...
<snip>

Nov 15 '05 #11

This discussion thread is closed
