Bytes IT Community

Timeless Classics of Software Engineering

P: n/a
I'd like to hear thoughts on what books, in your opinion, are true
classics in the field of software engineering. I read a lot on the
topic - at least a book a month for many years. There are many good
authors, however, the only book on making software that is truly
timeless, in my opinion, is "Mythical Man Month" by Brooks. It never
ceases to amaze me that something written over 20 years ago would be
so relevant.

It seems like Brooks achieved this by focusing on the essence
of software engineering, which consists of:

A) building models of reality.
B) the people who tend to like building models of reality, what they
are like, and what makes them work together effectively.

Many books focus excessively on a particular language, a specific
domain, on project management, Gantt charts etc and miss the forest
for the trees.

Note that I'm specifically looking for books on making software, on
Software Engineering as a craft, as opposed to classic books on
computer science (e.g. Knuth), which are a completely different category
in my mind.

Are there any other books like MMM that you can think of, where every
page is packed with insight and it seems not a single word is in
vain?
I'd be grateful for your suggestions. There must be at least a couple
out there.

Thanks!

- Steve
Jul 22 '05
102 Replies


In article <b2*************************@posting.google.com> ,
jc*****@taeus.com (Jerry Coffin) wrote:
...Likewise, the Dragon
Book could be used as a study in SE with compilers as the sample code.


That book is far more popular than it deserves to be. The authors have
a positive talent for making obscure exposition seem meaningful, without
covering the necessary material, while making what they cover more
complicated than it really is.
--
TonyN.  to******@shore.net
Jul 22 '05 #51

Tony Nelson <to******@shore.net> writes:
In article <b2*************************@posting.google.com> ,
jc*****@taeus.com (Jerry Coffin) wrote:
...Likewise, the Dragon
Book could be used as a study in SE with compilers as the sample code.


That book is far more popular than it deserves to be. The authors have
a positive talent for making obscure exposition seem meaningful, without
covering the necessary material, while making what they cover more
complicated than it really is.


The first chapter of the Dragon Book is not that bad as an introduction.
But if you're looking for a true classic on compiler writing, check Wirth's
Compilerbau: only 120 pages, and it covers everything you need
to write a quite usable compiler.
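(Not a passage from Compilerbau itself, but a minimal sketch of the technique it teaches — one recursive procedure per grammar production, in a single pass — here in Python for a tiny expression grammar:)

```python
# Recursive-descent parsing in the Wirth style: each grammar rule
# becomes one procedure that consumes tokens and returns a value.
# Grammar:  expr = term {("+"|"-") term}
#           term = factor {("*"|"/") factor}
#           factor = number | "(" expr ")"
import re

def tokenize(src):
    # Split the input into integer literals and single-character operators.
    return re.findall(r"\d+|[-+*/()]", src)

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, tok=None):
        cur = self.peek()
        if tok is not None and cur != tok:
            raise SyntaxError(f"expected {tok!r}, got {cur!r}")
        self.pos += 1
        return cur

    def expr(self):
        value = self.term()
        while self.peek() in ("+", "-"):
            op = self.eat()
            rhs = self.term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def term(self):
        value = self.factor()
        while self.peek() in ("*", "/"):
            op = self.eat()
            rhs = self.factor()
            value = value * rhs if op == "*" else value // rhs
        return value

    def factor(self):
        if self.peek() == "(":
            self.eat("(")
            value = self.expr()
            self.eat(")")
            return value
        return int(self.eat())

def evaluate(src):
    return Parser(tokenize(src)).expr()

print(evaluate("2 + 3 * (4 - 1)"))  # -> 11
```

The same shape scales up: add a symbol table and emit code instead of evaluating, and you have the skeleton of the one-pass compilers the book describes.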

-Andi

Jul 22 '05 #52

Clinging to sanity, jc*****@taeus.com (Jerry Coffin) mumbled into her beard:
A few might qualify as crossovers as well: just for example, Lions'
book or almost any of Tanenbaum's books could be used for studying
SE, with operating systems as the example code. Likewise, the Dragon
Book could be used as a study in SE with compilers as the sample
code.


There's a bit of a problem with that; when SEs are expected to build
systems that involve database usage, and they _don't_ have any
guidance as to why databases should be used one way or another, it's
pretty likely that they will re-invent the same _broken_ wheels that
others have repeatedly invented over the years...

Substitute
s/database/compiler/g
s/database/linker/g
s/database/markup language/g
as needed...
--
output = ("cbbrowne" "@" "cbbrowne.com")
http://www3.sympatico.ca/cbbrowne/linuxxian.html
Rules of the Evil Overlord #24. "I will maintain a realistic
assessment of my strengths and weaknesses. Even though this takes some
of the fun out of the job, at least I will never utter the line "No,
this cannot be! I AM INVINCIBLE!!!" (After that, death is usually
instantaneous.)" <http://www.eviloverlord.com/>
Jul 22 '05 #53

jc*****@taeus.com (Jerry Coffin) wrote:
"Mikito Harakiri" <mi*********@iahu.com> wrote in message news:<dQ**************@news.oracle.com>...

[ ... ]
No database textbooks listed so far. Is it because
1. There are no classic database books.
Or rather
2. Software engineers usually don't know anything about databases.
I would say:

3. Because the OP asked about SE, not databases.


Oh, sure, spoil it by staying on-topic.
I, for one, can think of at least a couple I'd consider classics about
database design, just as I can think of some I'd consider classics
about compilers, operating systems, networking, etc.

A few might qualify as crossovers as well: just for example, Lions'
book or almost any of Tanenbaum's books could be used for studying
SE, with operating systems as the example code. Likewise, the Dragon
Book could be used as a study in SE with compilers as the sample code.

Likewise, almost anything by C.J. Date or E.F. Codd could qualify as a
more or less timeless classic, but none of them is more than
tangentially related to SE.
Agreed.
OTOH, I'd say Robert Heinlein or F. Paul Wilson might have messages
just as relevant for software engineers as Date or Codd...


How about Robert A. (Anton) Wilson? <EG>

Fnord.

Sincerely,

Gene Wirchenko

Computerese Irregular Verb Conjugation:
I have preferences.
You have biases.
He/She has prejudices.
Jul 22 '05 #54

Tony Nelson <to******@shore.net> wrote in message news:<to****************************@news.primus.ca>...
In article <b2*************************@posting.google.com> ,
jc*****@taeus.com (Jerry Coffin) wrote:
...Likewise, the Dragon
Book could be used as a study in SE with compilers as the sample code.


That book is far more popular than it deserves to be. The authors have
a positive talent for making obscure exposition seem meaningful, without
covering the necessary material, while making what they cover more
complicated than it really is.


Well, the Dragon Book isn't bad, but I think a compiler book that
covers more recent developments is needed.

I think compilers have changed a lot since the '70s. For instance, the book
doesn't cover how to design a compiler for a parallel programming
language (not that we know how to, but it should be emphasized that
program representation must contain information suited to our goals).
There are also a lot of parser generators nowadays that go beyond what
is discussed in the book. If anybody wants to write a C++ parser, the
book will be only marginally useful... I'm also thinking of the
wonderful combinatorial categorial parsers and the like; these can be
used for artificial languages. For instance, I've found the Parsec
parser library for Haskell to be a wonderful programming exercise.
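(For readers unfamiliar with the combinator style Parsec provides: a hand-rolled sketch in plain Python — the names `char`, `many`, and `seq_of` are illustrative here, not Parsec's actual Haskell API:)

```python
# Parser combinators: a parser is a function from input text to either
# None (failure) or (result, remaining_input). Small parsers compose
# into bigger ones.

def char(c):
    """Parser that matches one expected character."""
    def parse(s):
        if s and s[0] == c:
            return c, s[1:]
        return None  # failure
    return parse

def many(p):
    """Apply parser p zero or more times, collecting the results."""
    def parse(s):
        results = []
        while True:
            r = p(s)
            if r is None:
                return results, s
            value, s = r
            results.append(value)
    return parse

def seq_of(*parsers):
    """Run parsers in sequence; fail if any one of them fails."""
    def parse(s):
        results = []
        for p in parsers:
            r = p(s)
            if r is None:
                return None
            value, s = r
            results.append(value)
        return results, s
    return parse

# Composition is the whole charm of the style:
ab = seq_of(char("a"), char("b"))
print(ab("abc"))                # -> (['a', 'b'], 'c')
print(many(char("x"))("xxxy"))  # -> (['x', 'x', 'x'], 'y')
```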

Best Regards,

--
Eray Ozkural
Jul 22 '05 #55

Christopher Browne wrote:
Clinging to sanity, jc*****@taeus.com (Jerry Coffin) mumbled into her beard:
A few might qualify as crossovers as well: just for example, Lions'
book or almost any of Tanenbaum's books could be used for studying
SE, with operating systems as the example code. Likewise, the Dragon
Book could be used as a study in SE with compilers as the sample
code.

There's a bit of a problem with that; when SEs are expected to build
systems that involve database usage, and they _don't_ have any
guidance as to why databases should be used one way or another, it's
pretty likely that they will re-invent the same _broken_ wheels that
others have repeatedly invented over the years...

Substitute
s/database/compiler/g
s/database/linker/g
s/database/markup language/g
as needed...


Amen to that, Chris.

It seems there's a normal human attitude that
"the other guy's job is easier," so we tend to
underestimate the effort involved in those
areas where we don't have expertise. Or, more
precisely, we underestimate the level of experience needed
to be reasonably adroit with a certain
"thing" (database/compiler/linker, to use your
examples).

--
"It is impossible to make anything foolproof
because fools are so ingenious"
- A. Bloch
Jul 22 '05 #56

Nick Landsberg <SP*************@SPAMworldnetTRAP.att.net> wrote in message news:<u_*********************@bgtnsc04-news.ops.worldnet.att.net>...
It seems there's a normal human attitude that
"the other guy's job is easier," so we tend to
underestimate the effort involved in those
areas where we don't have expertise.
The problem with the Information Technology camp is that most
practitioners have no expertise in Information Technology as a whole,
but only in isolated parcels.
Or, more
precisely, we underestimate the level of experience needed
to be reasonably adroit with a certain
"thing" (database/compiler/linker, to use your
examples).


Most software developers and engineers underestimate the level of
knowledge about databases needed to be a reasonably competent IT
professional. (But I am trying to cease to be among them :)
Regards
Alfredo
Jul 22 '05 #57

Nick Landsberg <SP*************@SPAMworldnetTRAP.att.net> wrote:

[snip]
It seems there's a normal human attitude that
"the other guy's job is easier," so we tend to
underestimate the effort involved in those
areas where we don't have expertise. Or, more
precisely, we underestimate the level of experience needed
to be reasonably adroit with a certain
"thing" (database/compiler/linker, to use your
examples).


I think this is because one can easily understand the high-level,
abstract description (or executive summary) of what has to be done.
If one makes the mistake of thinking that that is all there is to the
work, one gets the above effect. The same mistake is not made in an
area that one is proficient in, because one knows about the
difficulties.

Sincerely,

Gene Wirchenko

Computerese Irregular Verb Conjugation:
I have preferences.
You have biases.
He/She has prejudices.
Jul 22 '05 #58

Gene Wirchenko wrote:
Nick Landsberg <SP*************@SPAMworldnetTRAP.att.net> wrote:

[snip]

It seems there's a normal human attitude that
"the other guy's job is easier," so we tend to
underestimate the effort involved in those
areas where we don't have expertise. Or, more
precisely, we underestimate the level of experience needed
to be reasonably adroit with a certain
"thing" (database/compiler/linker, to use your
examples).

I think this is because one can easily understand the high-level,
abstract description (or executive summary) of what has to be done.
If one makes the mistake of thinking that that is all there is to the
work, one gets the above effect. The same mistake is not made in an
area that one is proficient in, because one knows about the
difficulties.

Sincerely,

Gene Wirchenko


Good point, Gene.

The corollary to your observation would be that since
management only ever reads executive summaries, they
presume all tasks are relatively trivial and can be
outsourced.

(But that's a subject for a different thread and which has
been rehashed many times over in various newsgroups.)

NPL

--
"It is impossible to make anything foolproof
because fools are so ingenious"
- A. Bloch
Jul 22 '05 #59

Nick Landsberg <SP*************@SPAMworldnetTRAP.att.net> wrote in message news:<4w********************@bgtnsc04-news.ops.worldnet.att.net>...
Gene Wirchenko wrote:
Nick Landsberg <SP*************@SPAMworldnetTRAP.att.net> wrote:

[snip]

It seems there's a normal human attitude that
"the other guy's job is easier," so we tend to
underestimate the effort involved in those
areas where we don't have expertise. Or, more
precisely, we underestimate the level of experience needed
to be reasonably adroit with a certain
"thing" (database/compiler/linker, to use your
examples).

I think this is because one can easily understand the high-level,
abstract description (or executive summary) of what has to be done.
If one makes the mistake of thinking that that is all there is to the
work, one gets the above effect. The same mistake is not made in an
area that one is proficient in, because one knows about the
difficulties.

Sincerely,

Gene Wirchenko


Good point, Gene.

The corollary to your observation would be that since
management only ever reads executive summaries, they
presume all tasks are relatively trivial and can be
outsourced.


On the other hand, professionals tend to overestimate the unique
nature of their experience and underestimate the abilities of other
professionals, esp. when they don't know them personally. In effect,
management is right about outsourcing much more often than we
professionals are willing to admit.
(But that's a subject for a different thread and which has
been rehashed many times over in various newsgroups.)

NPL

Jul 22 '05 #60

al************@yahoo.com (Michael S) wrote:
Nick Landsberg <SP*************@SPAMworldnetTRAP.att.net> wrote in message news:<4w********************@bgtnsc04-news.ops.worldnet.att.net>...
Gene Wirchenko wrote:
> Nick Landsberg <SP*************@SPAMworldnetTRAP.att.net> wrote:
>
> [snip]
>
>>It seems there's a normal human attitude that
>>"the other guy's job is easier," so we tend to
>>underestimate the effort involved in those
>>areas where we don't have expertise. Or, more
>>precisely, we underestimate the level of experience needed
>>to be reasonably adroit with a certain
>>"thing" (database/compiler/linker, to use your
>>examples).
>
> I think this is because one can easily understand the high-level,
> abstract description (or executive summary) of what has to be done.
> If one makes the mistake of thinking that that is all there is to the
> work, one gets the above effect. The same mistake is not made in an
> area that one is proficient in, because one knows about the
> difficulties.
Good point, Gene.

The corollary to your observation would be that since
management only ever reads executive summaries, they
presume all tasks are relatively trivial and can be
outsourced.
On the other hand, professionals tend to overestimate the unique
nature of their experience and underestimate the abilities of other
professionals, esp. when they don't know them personally. In effect,
Yes.
management is right about outsourcing much more often than we
professionals are willing to admit.


That does not follow. If management does not know the
ins-and-outs of an area, their decision to outsource it (or do
anything else of significance in the area) is only a guess. If they
are right, it is only by a fluke. This is not the way I want to see
management done (or anything at all, if there is a choice).

Obviously, there has to be a balance point at which one does know
enough about an area in order to make intelligent decisions about it.

Any manager making such decisions had better be a good systems
analyst. I am using the term in a wide sense, not confining it to
computers. (I have found my systems analysis skills to be very
portable. Given knowledge about an area, I can start doing analysis.)
(But that's a subject for a different thread and which has
been rehashed many times over in various newsgroups.)


Sincerely,

Gene Wirchenko

Computerese Irregular Verb Conjugation:
I have preferences.
You have biases.
He/She has prejudices.
Jul 22 '05 #61

"Nick Landsberg" <SP*************@SPAMworldnetTRAP.att.net> wrote in message
news:4w********************@bgtnsc04-news.ops.worldnet.att.net...

The corollary to your observation would be that since
management only ever reads executive summaries, they
presume all tasks are relatively trivial and can be
outsourced.


It's just human nature. What you can't see, you're not aware
of; what you're not aware of, you can't manage.

I was a programmer for 20 years. I noticed that often, for
myself and co-workers, an estimate for a job would be
produced, and management would question it. Why does
it take so long? It doesn't seem like it should take so long.

Then I became a manager.

After a fairly short time, I noticed that I would ask people for
estimates, and they would turn them in, and I would look at
them and say, why does it take so long? It doesn't seem like
it should take so long. And it was absolutely how I felt, and
you certainly can't say it's because I didn't know enough about
programming, because I'd been doing it for 20 years.

It's the missing details that befuddle.

The particularly ironic thing about this is that most programmers'
estimates are not in fact overly long, but wildly optimistic!

Now when I see an estimate and I can't understand why it's
so long, I ask for more details. They're usually forthcoming,
and then the problem doesn't seem so easy anymore.
Marshall
Jul 22 '05 #62

"Sergio Navega" <sn*****@intelliwise.com> wrote in message news:<41**********@news.athenanews.com>...
"Marshall Spight" <ms*****@dnai.com> escreveu na mensagem
news:epPNc.177055$IQ4.107932@attbi_s02...
"Steve Johnson" <st**************@yahoo.com> wrote in message

news:94**************************@posting.google.com...
I'd like to hear thoughts on what books, in your opinion, are true
classics in the field of software engineering.


I can't vouch for it myself, but I hear a lot of people mention
"Code Complete" by Steve McConnell.


I also vote for "Code Complete". It is a remarkable (although excessively
lengthy) work. If you don't want to face its 850+ pages, there's a smaller
alternative:

Maguire, Steve (1993) Writing Solid Code. Microsoft Press.


Both are **very** good.
If you can, I highly recommend both.

Best,
John

John Torjo
Freelancer
-- jo**@torjo.com

Contributing editor, C/C++ Users Journal
-- "Win32 GUI Generics" -- generics & GUI do mix, after all
-- http://www.torjo.com/win32gui/

Professional Logging Solution for FREE
-- http://www.torjo.com/code/logging.zip (logging - C++)
-- http://www.torjo.com/logview/ (viewing/filtering - Win32)
-- http://www.torjo.com/logbreak/ (debugging - Win32)
(source code available)
Jul 22 '05 #63


st**************@yahoo.com (Steve Johnson) writes:
I'd like to hear thoughts on what books, in your opinion, are true
I tried to summarize the suggestions, but it turns out that while you
specifically asked for:
Note that I'm specifically looking for books on making software, on
Software Engineering as a craft as opposed for classic books on
computer science (e.g. Knuth) which is a completely different category
in my mind.


...there are a lot of more technical books suggested, including Knuth.
Here is my summary; if somebody will take on the job of separating
the software engineering ones from the technical ones, be my guest.

(I added the popularity count when a book was suggested by more than
one person, or supported in following posts)

---8<---[snip]---8<---

Marshall Spight:

I can't vouch for it myself, but I hear a lot of people mention
"Code Complete" by Steve McConnell.

+6

Sergio Navega (alternative):
Maguire, Steve (1993) Writing Solid Code. Microsoft Press.

+2

Socks (puppet_sock):

The "Effective C++" [by Scott Meyers] books are excellent also, and available
as a package for cheap on CD.

Another book I quite enjoyed was: _Death March_ by Yourdon.

xpyttl:

Steve's "Debugging the Development Process" ain't too shabby, either, and
it's a lot shorter.

DeMarco has quite a number of good books on the topic, but his "The
Deadline" is by far the most entertaining, most fun, and most on-target book
I've read on the subject of what makes a project tick.

Rob Warnock:

My favorite Tom DeMarco book is "Controlling Software
Projects: Management, Measurement, and Estimation".
The first chapter starts out with the classic reminder:
"You can't control what you can't measure."

stephen fuld:

I would suggest Programming Pearls by Jon Bentley. It is one of those books
that is actually fun to read as it is so packed with insights that you
frequently find yourself having that Aha! experience.

+1

JXStern:

The only book that *concise* I can think of is Kernigan and Ritchie,
"The C Programming Language", but maybe it's too techie for your
category.

+2

Kernighan and Plauger, "Elements of Programming Style", never quite did
it for me, but others might name it.

+1

Booch's old "Object Oriented Design" had some status for a while.

+1

I like Gerald Weinberg's stuff, esp the "Quality Software Management"
series, but it's not as tight as Brooks.

Shailesh Humbad:

The textbook at my Univ of Michigan software engineering class was,
"Software Engineering: A Practitioner's Approach", by Roger
S. Pressman

Finally, there's a classic book in urban design called "A Pattern
Language : Towns, Buildings, Construction" by Christopher
W. Alexander.

Donald F. McLean:

Design Patterns - Gamma et al.

+1

Refactoring - Fowler
How to Write a Useable User Manual - Weiss

Ron Ruble:

"Leadership And Self-Deception", by the Arbinger Institute.

Dave Townsend:

Take a look at Glenford Myers, The Art of Software Testing....

Gavin Scott:

"Software Pioneers", a book that presents 16 of the classic papers by
pioneers in the software field. The book grew out of a conference

Alan Gauld:

- MMM by Brooks - undoubtedly deserves top place
- Peopleware by Lister - would be close too.
- Structure & Interpretation of Computer Programs by Sussman et al

+1

More debatably:

- Knuth's 3 volumes on algorithms - but more people talk
about them than have read them I suspect!

+1

- UML distilled by Fowler might make it into the
classics category if UML really does become the
standard notation.

Chris Schumacher

In my personal experience Kernighan & Plauger's "Software Tools" is a
good book

H. E. Taylor:

Add _The Pragmatic Programmer_ by Hunt & Thomas to your list.

-1?

Andrew Reilly

I like Bertrand Meyer's "Object Oriented Software Construction" 2nd
rev. Certainly a lot of detail to think about.

At Uni, long ago, I was taught from "Software Engineering" by
I. Sommerville.

More programming-specific and more beginner-level than you're after,
but very beautiful is "Data Structures, with Abstract Data Types and
Pascal", by Stubbs and Webre.

David Lightstone:

Programmers and Managers - The Routinization of Computer Programming in the
United States by Philip Kraft (circa 1977).

Testing in Software Development by Martyn Ould and Charles Unwin

Managing Software Quality and Business Risk by Martyn Ould

Scott Moore:

Compiler construction for digital computers, David Gries.

Pascal Users Manual and Report, Jensen and Wirth. The forever unfulfilled
dream that programs could be clean and easy to understand.

Basic Basic, Coan. Don't laugh, most early homebrew computer users
read this book. It taught a generation of microcomputer programmers to
program.

Principles of Compiler Design, Aho and Ullman. Aka the "dragon book",

Programming languages: history and fundamentals, Sammet. First (and last)
real look at where programming languages came from, and where they are going.

Unix Programmers Manual, Vol 1 and 2, Bell labs.

Writing Interactive Compilers and Interpreters, P. J. Brown.

Roy Omond:

"The Psychology of Computer Programming" by Gerald Weinberg
"The Psychology of Everyday Things" by Don Norman

Andi Kleen:

I always liked "Debugging C" from Robert Ward.

The first chapter of the Dragon Book is not that bad as an introduction.
But if you look for a true classic on compiler writing check Wirth's
Compilerbau

Gray/Reuter - Transaction processing: concepts and techniques

Marcelo Pinto:

Refactoring: Improving the Design of Existing Code by Fowler
Agile Software Development, Principles, Patterns, and Practices by Martin
Software Craftsmanship: The New Imperative by McBreen

Victor Putz:

I also HIGHLY recommend McConnell's "Software Project Survival Guide",

Larry Crimson:

Managing the Software Process, by Watts Humphrey.

Andy Glew:

* The "Gang of 4" book on patterns.
* I'd also put Martin Fowler's book on Refactoring into this class.

I recommend Lakos' "Large-Scale C++ Programming" to everyone now.

CELKO:

"Classics in Software Engineering" and "Writings of the Revolution" by
Edward Yourdon, both now Out of Print

Jerry Coffin:

_Programming Proverbs_ by Henry Ledgard.
-k
--
If I haven't seen further, it is by standing in the footprints of giants
Jul 22 '05 #64

I'm surprised how short the list contributed by previous posters is,
and agree with most of those listed that I've read. 'Code Complete' is
very, very good. His 'Rapid Development' is also good. (Since it's been
listed several times and I keep hearing about it, I suppose I'll have
to read 'Refactoring' now.)

I'll add 'Principles of Software Engineering Management' by Tom Gilb
and 'Notes On The Synthesis Of Form' by Christopher Alexander (who
wrote the Pattern book mentioned above)... and maybe Myers 'The Art of
Software Testing'... and 'The Capability Maturity Model' by the SEI
people.. and 'The Handbook of Walkthroughs, Inspections and Technical
Reviews' by Freedman and Weinberg...
Jul 22 '05 #65

jt****@yahoo.com (John Torjo) writes:
"Sergio Navega" <sn*****@intelliwise.com> wrote in message news:<41**********@news.athenanews.com>...
"Marshall Spight" <ms*****@dnai.com> escreveu na mensagem
news:epPNc.177055$IQ4.107932@attbi_s02...
> "Steve Johnson" <st**************@yahoo.com> wrote in message

news:94**************************@posting.google.com...
> > I'd like to hear thoughts on what books, in your opinion, are true
> > classics in the field of software engineering.
>
> I can't vouch for it myself, but I hear a lot of people mention
> "Code Complete" by Steve McConnell.
>
>


I also vote for "Code Complete". It is a remarkable (although excessively
lengthy) work. If you don't want to face its 850+ pages, there's a smaller
alternative:

Maguire, Steve (1993) Writing Solid Code. Microsoft Press.


Both are **very** good.
If you can, I highly recommend both.


While we are at recommending Steve McConnell, I found "Rapid
Development" very interesting. While it's about software, I think the
points he makes are equally valid for ASIC and FPGA development.

When projects become big enough, the methods and tools for managing
them become equal.
--Kai

Jul 22 '05 #66

In article <05iPc.64964$8_6.28994@attbi_s04>,
Marshall Spight <ms*****@dnai.com> wrote:
I was a programmer for 20 years. I noticed that often, for
myself and co-workers, an estimate for a job would be
produced, and management would question it. Why does
it take so long? It doesn't seem like it should take so long.

Then I became a manager.


Hm. My managers always ask, "what's your multiplier again?", and I
answer "2X", and they multiply my estimate by two, and carry on.

I've never met anyone with a multiplier below 1.

Followups reduced to a group that I read.

-- greg

Jul 22 '05 #67


"John Torjo" <jt****@yahoo.com> wrote in message
news:c6**************************@posting.google.com...

I also vote for "Code Complete". It is a remarkable (although excessively lengthy) work. If you don't want to face its 850+ pages, there's a smaller alternative:


One of the earliest good books on software engineering was
Grady Booch's Software Engineering in Ada. The first
couple of chapters are particularly good.

I recently re-read the original 1969 report from the NATO
conference. Everyone with a concern about software
engineering should read that at some time or other. The
sad thing is that, after re-reading the report, I
realized that almost all the problems identified then are
the problems that still confront us today.

Richard Riehle
Jul 22 '05 #68

Richard Riehle <ad******@earthlink.net> wrote:
One of the earliest good books on software engineering was
Grady Booch's Software Engineering in Ada. The first
couple of chapters are particularly good.

I recently re-read the original 1969 report from the NATO
conference. Everyone with a concern about software
engineering should read that at some time or other. The
sad thing is that, after re-reading the report, I
realized that almost all the problems identified then are
the problems that still confront us today.


I was pleased to finally find this online:
http://homepages.cs.ncl.ac.uk/brian.randell/NATO/

Also, Royce's 1970 "waterfall" paper:
http://facweb.cs.depaul.edu/jhuang/is553/Royce.pdf

Many have mentioned Brooks' MMM, but his "No Silver Bullet"
isn't in the original MMM (it is included in the
'95 edition). "No Silver Bullet" is also online at:
http://www.virtualschool.edu/mon/Sof...verBullet.html

John
Jul 22 '05 #69

"Richard Riehle" <ad******@earthlink.net> writes:

[snip]
I recently re-read the original 1969 report from the NATO
conference. Everyone with a concern about software
engineering should read that at some time or other.


For those looking for it, try:
http://homepages.cs.ncl.ac.uk/brian....ATO/index.html

[chop]

cheers, Rich.

--
rich walker | Shadow Robot Company | rw@shadow.org.uk
technical director 251 Liverpool Road |
need a Hand? London N1 1LX | +UK 20 7700 2487
www.shadow.org.uk/products/newhand.shtml
Jul 22 '05 #70

Default User <fi********@boeing.com.invalid> wrote in message news:<41***************@boeing.com.invalid>...
JXStern wrote:
The only book that *concise* I can think of is Kernigan and Ritchie,
"The C Programming Language", but maybe it's too techie for your
category.

Kernigan and Plauger, "Elements of Programming Style", never quite did
it for me, but others might name it.

Just to note, it's spelled "Kernighan". That can be important when
searching for books.


Such as here: http://dogbert.abebooks.com/servlet/...gramming+Style


Brian Rodenborn

Jul 22 '05 #71


"A.G.McDowell" <mc*******@nospam.co.uk> wrote in message
news:Vz**************@mcdowella.demon.co.uk...

Classic papers on the rationale for splitting software into modules -
and therefore what splits make sense and what don't. Also a miscellany
of articles about specification, concurrency, real time systems, and so
on.

With regard to papers, one of the most important, but one that seems to
have been overlooked by many in our profession is the paper

Ross, D. T., Goodenough, J. B., and Irvine, C. A.
"Software Engineering: Process, Principles, and Goals,"
Computer, Vol. 8, No. 5 (May, 1975), pp. 17--27.

This paper has been re-printed in a number of different places,
and is important for anyone concerned with software engineering.

Since its publication, others have expanded on some of the goals
and principles, but it remains a seminal paper because it was one
of the first attempts to describe software engineering in terms of
these important ideas.

Richard Riehle
Jul 22 '05 #72


"Andy Glew" <gl**************@sbcglobal.net> wrote in message
news:YJ****************@newssvr27.news.prodigy.com...

I doubt that the XP (eXtreme Programming) books will
become classics, even though I enjoy them.

I wonder whether this is likely to be the case.

In spite of the rebellion against what has been called "Taylorism"
by some in the XP community, and notwithstanding McBreen's
keen insight into what has passed for software engineering in the
past, XP might simply be another aspect of software engineering.

Engineering is not a rigid set of rules that restricts the engineer from
seeking the most effective path to a successful outcome. Quite the
contrary. An engineer is looking for the most economical choices
leading to the most dependable outcome.

Too much of what has passed for software engineering has been
characterized by pejoratives such as "document-driven design,"
"process over people," "the grand design approach," "ritual
and ceremony," etc. As we approach any project, as engineers,
we want to find the optimal path to success. For some projects,
this might require a fair amount of "ceremony" and up-front design.
For others, a model more analogous to the Unified Process could
be appropriate. And, for still others, Agile Development is the
better option. Sometimes, we might choose some combination
of those choices.

XP does make a contribution to software engineering thought, and
an important contribution at that. Many of the ideas in XP have been
around for a long time and used, however informally, in the creation
of successful software products. I can recall a rule in a programming
shop where I worked in the mid-1970's that was a lot like "pair
programming."

A good engineer will not be so dogmatic as to reject an approach
to success that actually works. There seem to be such successes
in the record for XP. Therefore, in the XP literature, there just might
be such classics waiting to be discovered and/or identified. Even so,
classics tend to be identified as such only after a respectable period
of time has passed, so it would be premature to call them classics
at this point.

Richard Riehle
Jul 22 '05 #73

P: n/a
Richard Riehle wrote:
XP does make a contribution to software engineering thought, and
an important contribution at that. Many of the ideas in XP have been
around for a long time and used, however informally, in the creation
of successful software products. I can recall a rule in a
programming shop where I worked in the mid-1970's that was a lot like
"pair programming."


IMHO, XP is a lot about things that people assume implicitly and which
in practice don't happen when you keep them implicit. You have to force
people to do them.

Pair Programming and Move People Around: People within a project need to
talk to each other. I've seen many engineers who think that this is a
waste of time, and that you can work better when you are alone in a
quiet corner, and nobody makes outsider suggestions to your design. If
you look into someone else's work, and you find a severe problem,
people often take it as an insult (depends on how long you looked to see
the problem - when it takes a few seconds, and you make an ugly face,
it's the worst case ;-). I've even seen other engineers jump to the
defense of the "offended" person, declaring that "my analysis was wrong".
Unfortunately, the problem was real. This sort of "MY code" "YOUR
code", i.e. ego, is in the way.

Pair programming formalizes the process to have more than one person
look at what's done. There are alternative approaches like open
sourcing the software, and have the users look into the source.
However, users don't really look at sources (and if they do, they are
not good at it), except if there's something that doesn't work. This is
also too late (e.g. the last time I had to look into GCC, I saw the
horrors of legacy code; fortunately, the GCC team feels the same pain,
and inserted a new, saner intermediate representation).

User stories: Customers fail to set proper requirements. They usually
(strongly!) suggest an implementation that supposedly solves the
problem. Since the customer is not an experienced architect (that's why
he's a customer, not the chief architect), this implementation must
have severe drawbacks. The proper way is to divide the requirements
into input/output relations, where independent requirements are
separated. XP formalizes that with "user stories" which have to fit on
a card. Note that the "user stories" both provide requirements and test
cases.

Unit tests: I think this is also related to refactoring. Traditional
programming languages make it difficult to write tests along with the
code, so people tend to just hack the code in and test the complete
application. That's wrong. As you can build a house only on bricks that
sustain the pressure, you can build software (or hardware) only on
functions (circuits) that are robust, and deliver what they should.
Things that haven't been tested won't work.
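
The "bricks" idea can be made concrete with tests that live right next to
the function and run every time the file is executed; a minimal Python
sketch (the function and its job are hypothetical):

```python
def parse_price(text):
    """Convert a string like '$1,234.50' to a float: one small, testable brick."""
    return float(text.replace("$", "").replace(",", ""))

# The tests live beside the brick and run whenever this module is executed,
# so nothing gets built on top of an untested part.
assert parse_price("42") == 42.0
assert parse_price("$1,234.50") == 1234.50
assert parse_price("$0.99") == 0.99
```

Only once such a brick passes its tests does it become safe to build the
next layer on top of it.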

Refactoring comes into the game by forcing the programmer to actually
build the system out of bricks, and not out of concrete walls (the
equivalent of monster-functions which are cast in stone).

Iterations: Due to the fact that the customer has no clear idea what he
really wants, even the user stories are misleading. An iterative search
approach is formalized to find what the user actually wants. It is
important to realize this (the customer is wrong) on both sides,
because typically, you can't sell a project where you already plan the
necessary iterations ahead. You have to charge the customer afterwards,
it's difficult, and you'll usually take the blame.

To summarize XP: XP is about handling incompetence. Since at least 95%
of all engineers think they have more than sufficient competence, and
it's only the other 95% of engineers surrounding them, which are
incompetent, something has to be done. If you feel better that way:
once you use XP techniques, you can say that you do everything to
increase your competence.

The waterfall model also is about handling incompetence. It tries to
avoid incompetence by assigning tasks to competent people (e.g. "choose
an architect to do the design, not a coder"), and tries to achieve
competence by specializing. The main problem here is that this model
assumes that somehow you are able to find competent people for each
step of the waterfall model, and that somehow the specialists don't
fail due to their shortsightedness.

Note that you still need competent people. The productivity of excellent
people in software development is so high that the average
coffee-drinking Wallies really do negative work. This is bad news
for people who want to solve unemployment, because the simpler jobs
just go away (to China and to robots, and finally to robots in China).
The good news is that when we really get there, the remaining people
who have work are true hackers, and do it for the fun of it.

BTW: The real hacker way to deal with customer requirements is to throw
them into the bit bucket, and analyze the problem yourself (only care
about it if it is interesting, and redefine it until the solution is
trivial). Unfortunately, this often leads to software where user and
developer have to be the same sort of people, and which doesn't solve
the problems of average users (problems hackers don't have at all).

--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/
Jul 22 '05 #74

P: n/a
Marshall:

"Marshall Spight" <ms*****@dnai.com> wrote in message
news:05iPc.64964$8_6.28994@attbi_s04...
"Nick Landsberg" <SP*************@SPAMworldnetTRAP.att.net> wrote in message news:4w********************@bgtnsc04-news.ops.worldnet.att.net...

The corollary to your observation would be that since
management only ever reads executive summaries, they
presume all tasks are relatively trivial and can be
outsourced.
It's just human nature. What you can't see, you're not aware
of; what you're not aware of, you can't manage.


Or so it is sometimes thought. One can account for and manage unknowns.
It's just part of the management equation, as long as one can deal with the
unexpected when it arises.
I was a programmer for 20 years. I noticed that often, for
myself and co-workers, an estimate for a job would be
produced, and management would question it. Why does
it take so long? It doesn't seem like it should take so long.
I'll bet one of the more frequent suggestions was to change the scope and
purpose of the project; especially when the project management and business
people came closer and closer to a mutual understanding of the actual
business requirements.
Then I became a manager.

After a fairly short time, I noticed that I would ask people for
estimates, and they would turn them in, and I would look at
them and say, why does it take so long? It doesn't seem like
it should take so long. And it was absolutely how I felt, and
you certainly can't say it's because I didn't know enough about
programming, because I'd been doing it for 20 years.
Astute manager. :-)
It's the missing details that befuddle.

The particularly ironic thing about this is that most programmers'
estimates are not in fact overly long, but wildly optimistic!
Steve McConnell agrees.
Now when I see an estimate and I can't understand why it's
so long, I ask for more details. They're usually forthcoming,
and then the problem doesn't seem so easy anymore.


As a US citizen I'm much more inclined to realize that business conditions
can, and do, change often. As such, I need to account for unknown market
conditions, which make planning much more difficult than it might otherwise
be. That's why most successful companies want gov't to secure their
position in their markets and why I, as a struggling company, want them not
to. :-)

Bill
Jul 22 '05 #75

P: n/a
> Andy Glew wrote:
I doubt that the XP (eXtreme Programming) books will
become classics, even though I enjoy them.


With the number of "Agile" books approaching 50, the odds are high that one
or two of them might stick to the wall...

--
Phlip
http://industrialxp.org/community/bi...UserInterfaces


Jul 22 '05 #76

P: n/a

"Phlip" <ph*******@yahoo.com> wrote in message
news:AY***************@newssvr19.news.prodigy.com...
Andy Glew wrote:
I doubt that the XP (eXtreme Programming) books will
become classics, even though I enjoy them.

With the number of "Agile" books approaching 50, the odds are high that
one or two of them might stick to the wall...
Which versions of the books - the ones that have been written, or the ones
that will be re-written (many times)?
(They are being developed iteratively, aren't they?)

--
Phlip
http://industrialxp.org/community/bi...UserInterfaces

Jul 22 '05 #77

P: n/a
On 1 Aug 2004 09:10:50 -0700, al************@yahoo.com (Michael S)
wrote:
On the other hand, professionals tend to overestimate the unique
nature of their experience and underestimate the abilities of other
professionals, esp. when they don't know them personally. In effect,
management is right about outsourcing much more often than we,
professionals, willing to admit.


Trying to rationalize and soothe your lousy little conscience over the
rotten/degenerate decisions you've participated in, eh. Too Late.

"An appeaser is one who feeds a crocodile--hoping it will eat him
last." - Winston Churchill

Jul 22 '05 #78

P: n/a
> From: Bernd Paysan <be**********@gmx.de>
Unit tests: I think this is also related to refactoring. Traditional
programming languages make it difficult to write tests along with the
code, so people tend to just hack the code in and test the complete
application. That's wrong.
I agree. That's one of the winning things about interactive languages
such as LISP. When I write LISP code, I unit-test one line at a time as
I write it. When I have finished all the lines of a function, I
unit-test that function for each of the different kinds of situation it
must deal with. I include all my unit-tests as comments just after the
end of the function definition, so later if I need to change that
function I can re-do all those old unit tests and any new unit tests to
make sure the function still works after the modifications.
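
The workflow described here (tests kept beside the function, re-runnable
after every change) has a close analogue in Python's doctest module; a
sketch, with a hypothetical POSITION-like function:

```python
import doctest

def position(item, seq):
    """Return the index of item in seq, or None if absent.

    The tests live with the function and can be re-run after any change:

    >>> position(3, [1, 2, 3, 4])
    2
    >>> position(9, [1, 2, 3]) is None
    True
    """
    for i, x in enumerate(seq):
        if x == item:
            return i
    return None

if __name__ == "__main__":
    # Re-run every embedded test; raises on the first failure.
    doctest.testmod(raise_on_error=True)
</xml>```

The embedded examples serve both as documentation and as the regression
suite the poster keeps in comments.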
The real hacker way to deal with customer requirements is to throw
them into the bit bucket ...


An alternative is what might be called a "consumer-oriented hacker":
The hacker inquires about the customer's real needs until he knows what
tool the customer most urgently needs. Not a fullblown do-everything
program, just one tool that does one kind of task and actually works.
The hacker can typically make such a tool within a few hours or a
couple days. Then while the consumer is beta-testing it, the hacker
works on the next-most-urgent tool. So day after day the customer has
yet another tool to handle yet another urgent problem.
Are there any consumers interested in hiring me in this way?
Jul 22 '05 #79

P: n/a
rem642b wrote:
I agree. That's one of the winning things about interactive languages
such as LISP. When I write LISP code, I unit-test one line at a time as
I write it. When I have finished all the lines of a function, I
unit-test that function for each of the different kinds of situation it
must deal with. I include all my unit-tests as comments just after the
end of the function definition, so later if I need to change that
function I can re-do all those old unit tests and any new unit tests to
make sure the function still works after the modifications.


Manual or automated unit tests?
The real hacker way to deal with customer requirements is to throw
them into the bit bucket ...


An alternative is what might be called a "consumer-oriented hacker":
The hacker inquires about the customer's real needs until he knows what
tool the customer most urgently needs. Not a fullblown do-everything
program, just one tool that does one kind of task and actually works.
The hacker can typically make such a tool within a few hours or a
couple days. Then while the consumer is beta-testing it, the hacker
works on the next-most-urgent tool. So day after day the customer has
yet another tool to handle yet another urgent problem.
Are there any consumers interested in hiring me in this way?


If you refactor your code together between each feature, and if your
bug-rate is absurdly low, and if each feature makes you faster, then what's
the problem?

--
Phlip
http://industrialxp.org/community/bi...UserInterfaces

Jul 22 '05 #80

P: n/a
Phlip wrote:
Manual or automated unit tests?


In interactive languages, there's no real difference. When you start
programming a function, you define it on the command prompt, as you do with
the manual unit tests. If it works (and the tests are supplied with the
proper commands), you can cut&paste them from the command history into your
source code, and then you have automated unit tests.

--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/
Jul 22 '05 #81

P: n/a
"Phlip" <ph*******@yahoo.com> wrote in message
news:NU****************@newssvr19.news.prodigy.com...
rem642b wrote:

...
I agree. That's one of the winning things about interactive languages
such as LISP. When I write LISP code, I unit-test one line at a time as
I write it. When I have finished all the lines of a function, I
unit-test that function for each of the different kinds of situation it
must deal with. I include all my unit-tests as comments just after the
end of the function definition, so later if I need to change that

....
I leave my unit tests in, uncommented, so that they execute after each
function is compiled; that way I can be sure that they are applied each
time.

This cannot always be done though, as some tests can take a long time to
run if you are testing a large number of input combinations.

Rene.
Jul 22 '05 #82

P: n/a
Bernd Paysan <be**********@gmx.de> wrote:
snipped most of a fabulous post about XP and engineering.

I agree with most all of what you wrote, except for the very last bit:
BTW: The real hacker way to deal with customer requirements is to throw
them into the bit bucket, and analyze the problem yourself (only care
about it if it is interesting, and redefine it until the solution is
trivial). Unfortunately, this often leads to software where user and
developer have to be the same sort of people, and which doesn't solve
the problems of average users (problems hackers don't have at all).


It is not necessarily so, that hackers can't think like regular users
and solve regular users problems. It may currently be the case, in
general, but it is not a necessary state of affairs. The technical and
engineering community needs to take a masonic attitude toward this
problem and start 'making better hackers.'

Many, if not most, hackers started out as regular users and should be
able to recall what it felt like to deal with the recalcitrant
machine: I can certainly recall the feeling of helplessness I
experienced for the first few years I used unix, and I try to channel
that feeling into my programs and documentation.

The disciplines of user-interface design and HCI (human-computer
interaction), at their best, are attempts to focus the attention of
technical people on the problems and perceptions of non-technical
users. If you, as a software developer, can manage to think like a
non-technical user, even if only occasionally, you will have gone a
long way toward making better programs. Of course, the requirement for
user reviews and focus groups is an admission that such focus is
difficult to maintain, and that formal processes may be required as
occasional reminders.

Still, as a hacker, it should not be so difficult to put yourself in
other peoples' shoes. In the commercial environment you may have to
ignore some things in the interest of meeting a deadline or a budget,
but the private hacker has no such limitations. It requires only a
change of attitude, not of constitution. I simply don't believe that
hackers are innately incapable of thinking like non-technical users,
only that they are unaccustomed to it.

To bring this back on topic, there are a few books that I like for
discussing user interface design:

* 'Tog on Interface' and 'Tog on Software Design'
by Bruce Tognazzini

* 'The Elements of Programming Style'
by Kernighan and Plauger
chapter 5 on input and output

* 'The Practice of Programming'
by Kernighan and Pike
chapter 4 on interfaces (last part on user interfaces)

* 'Programming as if People Mattered'
by Nathaniel Borenstein

* 'The Humane Interface'
by Jef Raskin

- Jeff Dutky
Jul 22 '05 #83

P: n/a
Jeffrey Dutky wrote:

Many, if not most, hackers started out as regular users and should be
able to recall what it felt like to deal with the recalcitrant
machine: I can certainly recall the feeling of helplessness I
experienced for the first few years I used unix, and I try to channel
that feeling into my programs and documentation.


Well, yes.

Maybe it's different in Linux land, but in Windows I struggle daily with
examples of bad UI, rude software and inadequately documented systems. I
can think of no-one better qualified to defend the interests of
end-users.

Of course, the most horrendous errors are when the reality is buried
under
a pile of marketing. If you lie to your end-users, you can't really
expect
them to understand your system, can you?
Jul 22 '05 #84

P: n/a
> Ross, D. T., Goodenough,

Goodenough - good enough software? <tongue in cheek>

Jul 22 '05 #85

P: n/a
_
They can't or won't because they develop into egotistical know-it-alls
that will devalue anyone that is lower than themselves. The machine is
just a machine, the language is a language to solve problems in.

--chris
reply: gr**********@gamebox.net
On 11 Aug 2004 15:47:51 -0700, Jeffrey Dutky <du***@bellatlantic.net>
wrote:
Many, if not most, hackers started out as regular users and should be
able to recall what it felt like to deal with the recalcitrant
machine: I can certainly recall the feeling of helplessness I
experienced for the first few years I used unix, and I try to channel
that feeling into my programs and documentation.

Jul 22 '05 #86

P: n/a
"_" <no*********@aol.com> wrote in message
news:opscz2ymuqs2eu50@datamave-icwjaz...
On 11 Aug 2004 15:47:51 -0700, Jeffrey Dutky <du***@bellatlantic.net>
wrote:
Many, if not most, hackers started out as regular users and should be
able to recall what it felt like to deal with the recalcitrant
machine: I can certainly recall the feeling of helplessness I
experienced for the first few years I used unix, and I try to channel
that feeling into my programs and documentation.


They can't or won't because they develop into egotistical know-it-alls
that will devalue anyone that is lower than themselves. The machine is
just a machine, the language is a language to solve problems in.


Many of us started out, not as regular users, but as programmers in school
or on our own. So our first programs were written for ourselves. We figured
it out the hard way, because there was no one else who knew any more than we
did. That's where a lot of the "figure it out yourself" mentality comes
from - it's how we learned, and really the only way TO learn anything.

Read about it, try it out, figure out the mistakes, fix it. Repeat.

--
Mabden
Jul 22 '05 #87

P: n/a
> From: "Phlip" <ph*******@yahoo.com>
Manual or automated unit tests?


Currently when I'm just writing software for my own use, nobody else
ever looks at my code or runs it, it's all manual tests. I write a new
function one line at a time, manually checking it's correct before
proceeding to write the next line of code. At the top of the function
are SETQs for the parameters to have canned values for testing purposes.
When I finish writing and testing every line of code, I wrap it into a
function definition, and comment out the SETQs at the start, so it uses
the parameters as given instead of my canned test values. I then
copy&paste the function declaration and those test SETQs, several
different sets of tests in some cases, and edit the copy to yield a
test function call, which I then immediately try. Then I comment out
that test-call and leave it sitting permanently as such a comment
immediately after the function.

But back when I was doing A.I. research at Stanford, working on an
English language command-language for a simulated robot, I made a much
more formal test rig, whereby I collected actual input to the parser
and output from the parser, and when I ran a test it told me each place
where the output wasn't the same as before, so I could check whether
those discrepancies were bugfixes or oops. I would expect something
similar when coding for a company in the future. Also by making a test
rig for each function, I could demonstrate day-by-day progress to my
supervisor, hypothetically anyway if said supervisor were at all
interested.
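
That rig - recorded inputs replayed through the parser, with each output
compared against the previously approved output - is what is now usually
called a "golden file" test; a minimal sketch (the parse function and the
data are hypothetical stand-ins):

```python
import json
from pathlib import Path

def parse(command):
    """Hypothetical stand-in for the parser under test."""
    return {"tokens": command.split()}

def regression_check(inputs, golden):
    """Replay recorded inputs; list each place where the output differs
    from the approved ('golden') output, so the author can judge whether
    a discrepancy is a bugfix or an oops."""
    return [(cmd, golden.get(cmd), parse(cmd))
            for cmd in inputs if golden.get(cmd) != parse(cmd)]

def approve(inputs, path):
    """Record the current outputs as the new golden baseline."""
    Path(path).write_text(json.dumps({cmd: parse(cmd) for cmd in inputs}))
```

One runs approve() once to record known-good output, then
regression_check() after every change to see only the discrepancies.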
Are there any consumers interested in hiring me in this way?

If you refactor your code together between each feature, and if your
bug-rate is absurdly low, and if each feature makes you faster, then
what's the problem?


I don't know anybody who has surplus money and who has any desire for
me to write any software for them. No company in this whole SF bay area
is hiring programmers. This recession has been running for a long time
with no serious sign of let-up yet. I have only a few more months to
find a decent source of income, or become homeless.

Oh, back on topic of testing: If more than one programmer works on the
same function, then of course automated testing is essential!
Jul 22 '05 #88

P: n/a
rem642b wrote:
From: "Phlip"
Manual or automated unit tests?
Currently when I'm just writing software for my own use, nobody else
ever looks at my code or runs it, it's all manual tests. I write a new
function one line at a time, manually checking it's correct before
proceeding to write the next line of code.


If you automated the check for that line of code, you could leverage a trail
of tests to go faster.
At the top of the function
are SETQs for the parameters to have canned values for testing purpose.
When I finish writing and testing every line of code, I wrap it into a
function definition, and comment out the SETQs at the start, so it uses
the parameters as given instead of my canned test values.
My cod. You just told me you do half of test-first. But then you comment the
tests out and don't preserve them.

No matter how fast and bug-free your code, and how clean your design, you'd
be faster, free-er and cleaner by preserving and leveraging those tests!
I don't know anybody who has surplus money and who has any desire for
me to write any software for them. No company in this whole SF bay area
is hiring programmers.


I heard a rumor that everyone with significant "XP" on their resume, in the
Bay Area, was hitched. But I wouldn't know...

But I don't mean "Windows XP" ;-)

--
Phlip
http://industrialxp.org/community/bi...UserInterfaces
Jul 22 '05 #89

P: n/a

"Phlip" <ph*******@yahoo.com> wrote in message
news:WS*****************@newssvr31.news.prodigy.com...

I heard a rumor that everyone with significant "XP" on their resume, in the Bay Area, was hitched. But I wouldn't know...

In the SF Bay area, as far north as Napa, as far south as Salinas,
as far East as Merced, one keeps running into Pizza chefs, store
clerks, security guards, insurance salespersons, real estate
salespeople, handymen, etc., who claim that they used to be computer
programmers. Throughout Silicon Valley, one meets former
engineers whose skill set was so narrowly focused that they
were laid off in the last round. It is sometimes amazing to me,
and humbling, that so many highly educated, well-trained,
technologists are now engaged in low-paying service industry
jobs, even as some large companies continue to import
specialized engineers from abroad.

Richard Riehle
Jul 22 '05 #90

P: n/a
Richard Riehle wrote:
In the SF Bay area, as far north as Napa, as far south as Salinas,
as far East as Merced, one keeps running into Pizza chefs, store
clerks, security guards, insurance salespersons, real estate
salespeople, handymen, etc., who claim that they used to be computer
programmers.


Maybe they wrote lots of bugs...

(I know I know - many dot-coms were pumped and dumped, based on investors'
abilities to blame programmers, regardless of their proficiency.)

--
Phlip
http://industrialxp.org/community/bi...UserInterfaces
Jul 22 '05 #91

P: n/a
> From: Bernd Paysan <be**********@gmx.de>
When you start programming a function, you define it on the command
prompt, as you do with the manual unit tests.
There are at least three UI configurations where this is the opposite
of what's actually done:
- When using a CL IDE, such as Macintosh Allegro CommonLisp that I used
on my Mac Plus before it died: The user doesn't type code into the
command window, but rather composes code in the edit window, then uses
command-E to execute whatever s-expression is adjacent to the cursor.
- When using EMACS-LISP, something similar I presume.
- What I'm doing currently: McSink text editor on Macintosh, VT100
emulator connecting me to Unix, CMUCL running on Unix. I compose code
in a McSink window, copy and paste to VT100 window whereupon it is
transmitted to Unix and fed on stdin to CMUCL.
If it works (and the tests are supplied with the proper commands),
you can cut&paste them from the command history into your source
code, and then you have automated unit tests.


I don't know about EMACS-LISP, but in both MACL and McSink/VT100/CMUCL,
there's a scrolling transcript of the current Listener or dialup
session respectively. I already have the "command" (the s-expression
fed into READ-EVAL) locally in my edit window. If I want to retain a
copy of the output (from PRINT), that's the only part I need to copy
from the Listener/dialup window and paste into my edit window. But for
manual tests, I just eyeball the result when testing, it's obvious
whether it worked or not. Sometimes if the result is just a number,
such as an index into a string, which I can't tell if correct or not
just by looking at it, then I'll copy the result into a comment
alongside the "command" expression (after first doing an additional
check to make sure it was correct, for example to test the result of a
SEARCH or POSITION, I do a SUBSEQ call to see the part of the string
starting or ending with what it had allegedly found so I know it found
the correct thing, especially for example in my recent work where I'm
parsing 30k HTML files which are Yahoo! Mail output and the index where
something was found is typically 15-20k into the long string).

So anyway, it's pretty easy to collect whatever you need, input and/or
output from a test, as you go along, except for very long strings and
large structures where you don't want to include verbatim the whole
thing but instead want to save it to a file and have your test rig read
the file to get input for the function under test. Very flexible what
to actually do from moment to moment as needed...

If I were getting paid, and I'm not the only programmer working on the
code, I'd want to set up something more formal: Each test-data input
file would be formally registered and kept as-is with nobody allowed to
change it without permission. Then a test suite for a batch of code
could confidently perform some sort of read of that file to get the
first item of test data, pass that data through the first function and
compare output with what was supposed to be the output, then pass that
output and possibly more canned test data to the next function, etc.
testing all the various functions one-by-one in sequence from raw input
through processing stages to final output.

Often any single one of those major data-processing-pipeline functions
is composed of calls to several auxiliary functions. It'd be easy to
use that sequence of calls to directly produce a test rig, based on the
master data flow, for each of those auxiliary functions. So the
finished test suite would, after loading the canned-test-data file,
first test all calls to auxiliary functions in sequence within the
dataflow of the single first major pipeline function, then test that
one main dataflow function as a gestalt, then likewise test inside then
all of second, etc. down the line. Of course if one of the auxiliary
functions is itself composed of pieces of code that needs to be tested,
the same breakdown could be done another level deeper as needed (test
parts in sequence before testing whole as gestalt). Of course for
functions that take small easily expressed parameters, instead of huge
strings, they could be tested with literal constant data instead of
data from dataflow from canned-test-data file. For testing boundary
conditions, error conditions, etc., this would be useful. What about
functions that take huge inputs but where it'd be nice to test boundary
cases? Well then we just have to contrive a way to generate
boundary-case input from the given valid canned-test-data file, or
create a new canned-test-data file just for these exceptional cases.

I wish somebody would hire me to do programming work for them, so I
could put my ideas into commercial practice...
Jul 22 '05 #92

P: n/a
In article <Xg*****************@newssvr16.news.prodigy.com>,
"Phlip" <ph*******@yahoo.com> wrote:
Richard Riehle wrote:
In the SF Bay area, as far north as Napa, as far south as Salinas,
as far East as Merced, one keeps running into Pizza chefs, store
clerks, security guards, insurance salespersons, real estate sale
people, handymen, etc., who claim that they used to be computer
programmers.


Maybe they wrote lots of bugs...

(I know I know - many dot-coms were pumped and dumped, based on investors'
abilities to blame programmers, regardless of their proficiency.)


To be fair here, are these people who assumed that stringing HTML 3.x
together counted as programming? God knows 6 yrs ago Silicon Valley was
full of those.

Maynard
Jul 22 '05 #93

P: n/a
Maynard Handley <na****@name99.org> wrote in message news:<name99-D9D8FF.00472821082004@localhost>...
In article <Xg*****************@newssvr16.news.prodigy.com> ,
"Phlip" <ph*******@yahoo.com> wrote:
Richard Riehle wrote:
In the SF Bay area, as far north as Napa, as far south as Salinas,
as far East as Merced, one keeps running into Pizza chefs, store
clerks, security guards, insurance salespersons, real estate sale
people, handymen, etc., who claim that they used to be computer
programmers.


Maybe they wrote lots of bugs...

(I know I know - many dot-coms were pumped and dumped, based on investors'
abilities to blame programmers, regardless of their proficiency.)


To be fair here, are these people who assumed that stringing HTML 3.x
together counted as programming? God knows 6 yrs ago Silicon Valley was
full of those.


Well now it's XML and Visual Basic/C#, what's the difference?

Cheers,

--
Eray
Jul 22 '05 #94

P: n/a
Two contributions I haven't seen in this thread :

The Practice of Programming (Kernighan and Pike)

and (maybe, one day)

Joel on Software, Joel Spolsky

Chris
--
Chris Morgan
"Post posting of policy changes by the boss will result in
real rule revisions that are irreversible"

- anonymous correspondent
Jul 22 '05 #95

P: n/a
Thanks for a stimulating topic.

I heartily agree that Mythical Man Month is essential reading for
anyone who wants to understand large scale software projects.

The other essential on my bookcase is Lakos' "Large Scale C++ Software
Design". It's applicable to any language and has enough rationale that's
grounded in real development practices and the problems of large scale
projects that I think it's relevant to the original topic.

A few years ago, I happened to reread Brooks and wrote up a collection
of his insights that resonated with me. I've attached it below in hopes of
whetting the appetite of anyone who hasn't already read it and as a reminder
for those who haven't reread it recently. I encourage everyone to
(re)read the full book.

Eric

==============

Notes from re-reading "The Mythical Man-Month" by Fredrick P. Brooks, Jr.

I went looking for a quotation about the value of "system design" and
ended up reading most of the book because it has many insights into the
challenges of producing large-scale software projects.

Some highlights:

In the preface Brooks says that while OS/360 had some "excellencies in design
and execution", it had some noticeable flaws that stem from the design process.

- "Any OS/360 user is quickly aware of how much better it should be."
- OS/360 was late, took more memory than planned, cost several times the
estimate, and did not perform very well until several releases after the
first.

His central argument is:

- "Briefly, I believe that large programming projects suffer management
problems different in kind from small ones, due to division of labor. I
believe the critical need to be the preservation of the conceptual
integrity of the product itself."

Why are industrial teams apparently less productive than garage duos?
- Must look at what is being produced.
- program: complete in itself, written by author for own use
- programming product: more generalized, for use by others
- programming system: collection of interacting programs with
interfaces and system interactions.
- programming system product: both product and system

Interesting exposition of the "joys of the craft" of programming:
- joy of making things
- pleasure of making things that are useful to others
- fascination of making complex puzzle-like objects
- joy of always learning due to non-repeating nature of the task
- delight in working in such a tractable medium

Also "woes of the craft"
- one must perform perfectly
- others set objectives, provide resources, furnish information
- dependence on others' poor programs
- finding nitty bugs is just work
- linear convergence of debugging
- appearance of obsolescence by time you ship when you compare what you
ship to what others imagine

An analysis of sources of programmer optimism:
- creative activity comprises the idea, the implementation, the
interaction with the user
- tractability of medium leads us to believe it should implement easily
- other media place constraints on what can be imagined and limitations of
media mask mistakes in the ideas

Man-month: fallacy of lack of communication or serialization

Naive preference for "small sharp team of first-class people" ignores the
problem of how to build a *large* software system.

Harlan Mills proposed "surgical team" approach. [Not applicable everywhere.]

Conceptual integrity:
- Analogy to architectural unity of Reims cathedral vs. others that were
"improved" inconsistently.

- "I will contend that conceptual integrity is the most important
consideration in system design. It is better to have a system omit
certain anomalous features and improvements, but to reflect one set of
design ideas, than to have one that contains many good but independent
and uncoordinated ideas."

- The purpose of a programming system is to make a computer easy to
use. [We may modify purpose to be to make it easy to do the things that
our customers need done.]
- Ratio of function to conceptual complexity is the ultimate test of
system design.
- For a given level of function, that system is best in which one can
specify things with the most simplicity and straightforwardness.

Careful division of labor between architecture and implementation allows
conceptual integrity in large projects.
- Architecture: complete and detailed specification of the user interface
(for OS/360 the programming manual).
- We may want to consider what is the right specification for <project>
- Architect is "user's agent". Brings "professional and technical
knowledge to bear in the unalloyed interest of the user."
- Architecture tells what happens, implementation tells how.

Argues that designing implementations is equally creative work as
architecture. Cost-performance ratio depends most heavily on implementer;
ease of use most heavily on architect.

External provision of architecture enhances creativity of implementers. They
focus on what they uniquely do. Unconstrained, most thought and debate goes
into architectural decisions with not enough effort on implementation.

Experience shows that integral systems go together faster and take less time
to test.

Need coordination and feedback between architect and builder to bound
architectural enthusiasm.

Second system effect:
- overextend and add too many bells and whistles
- may spend too much optimizing something being superseded by events

Communication & decision making
- Strong belief in written specifications
- architects meetings
- emphasis on creativity in discussions
- detailed change proposals come up for decisions
- chief architect presides & has decision making power
- broad "supreme court" sessions handle backlog of issues, gripes, etc.

Organization:
- Talks of "producer" & "technical director or architect"
- Either can report to the other depending on circumstances & people

"Plan to throw one away; you will, anyhow."

"The most pernicious and subtle bugs are system bugs arising from mismatched
assumptions made by the authors of various components. ... Conceptual
integrity of the product not only makes it easier to use, it also makes it
easier to build and less subject to bugs."

Jul 22 '05 #96

er******************@hp.com (Eric Hamilton) writes:
Thanks for a stimulating topic.

I heartily agree that Mythical Man Month is essential reading for
anyone who wants to understand large scale software projects.

The other essential on my bookcase is Lakos' "Large Scale C++
Software Design". It's applicable to any language and has enough
rationale that's grounded in real development practices and the
problems of large scale projects that I think it's relevant to the
original topic.

A few years ago, I happened to reread Brooks and wrote up a
collection of his insights that resonated with me. I've attached it
below in hopes of whetting the appetite of anyone who hasn't already
read it and as a reminder for those who haven't reread it recently.
I encourage everyone to (re)read the full book.


one of boyd's observations about general US large corporations starting
at least in the 70s was rigid, non-agile, non-adaptable operations.
he traced it back to training a lot of young people received in ww2 in
how to operate large efforts (who were starting to come into positions
of authority) ... and he contrasted it to guderian and the blitzkrieg.

guderian had a large body of highly skilled and experienced people
.... to whom he outlined general strategic objectives, leaving the tactical
decisions to the person on the spot .... he supposedly proclaimed
verbal orders only ... on the theory that the auditors going around
after the fact would not find a paper trail to blame anybody when
battle execution had glitches. the theory was that the trade-off of
letting experienced people on the spot feel free to make decisions
w/o repercussions more than offset any possibility that they
might make mistakes.

boyd contrasted this with the much less experienced american army with
few really experienced people which was structured for heavy top-down
direction (to take advantage of skill scarcity) ... the rigid top-down
direction with little local autonomy would rely on logistics and
managing huge resource advantage (in some cases 10:1).

part of the issue is that rigid, top-down operations are used to manage
large pools of unskilled resources. on the other hand, rigid top-down
operations can negate any advantage of a skilled resource pool (since
the skilled people will typically be prevented from exercising their own judgement).

so in the guderian scenario .... you are able to lay out strategic
objectives and then allow a great deal of autonomy in achieving
tactical objectives (given a sufficient skill pool and clear strategic
direction).

random boyd refs:
http://www.garlic.com/~lynn/subboyd.html#boyd
http://www.garlic.com/~lynn/subboyd.html#boyd2

--
Anne & Lynn Wheeler | http://www.garlic.com/~lynn/
Jul 22 '05 #97

er******************@hp.com (Eric Hamilton) writes:
Harlan Mills proposed "surgical team" approach. [Not applicable everywhere.]

Conceptual integrity:
- Analogy to architectural unity of Reims cathedral vs. others that were
"improved" inconsistently.

- "I will contend that conceptual integrity is the most important
consideration in system design. It is better to have a system omit
certain anomalous features and improvements, but to reflect one set of
design ideas, than to have one that contains many good but independent
and uncoordinated ideas."

- The purpose of a programming system is to make a computer easy to
use. [We may modify purpose to be to make it easy to do the things that
our customers need done.]
- Ratio of function to conceptual complexity is the ultimate test of
system design.
- For a given level of function, that system is best in which one can
specify things with the most simplicity and straightforwardness.


i was at a talk that harlan gave at the 1970 se symposium ... that
year it was held in DC (which was easy for harlan since he was local
in fsd) ... close to the river on the virginia side (marriott? near a
bridge ... I have recollections of playing hooky one day and walking
across the bridge to the smithsonian).

it was all about the super programmer and librarian .... i think the super
programmer was a reaction to the large low-skilled hordes ... and the
librarian was to take some of the administrative load off the super
programmer.

i remember years later somebody explaining that managers tended to
spend 90% of their time with the 10% least productive people ... and
that 90% of the work was frequently done by the 10% most productive
people; it was unlikely that anything that a manager did was going to
significantly improve the 10% least productive members .... however if
they spent 90% of their time helping remove obstacles for the 10%
most productive ... and even if that only improved things by 10%
.... that would be the most beneficial thing that they could do. This
was sort of the librarian analogy from harlan ... that managers
weren't there to tell the highly skilled people what to do ... managers
were to facilitate and remove obstacles for their most productive
people.

this is somewhat more consistent with one of boyd's talks on the
organic design for command and control.

--
Anne & Lynn Wheeler | http://www.garlic.com/~lynn/
Jul 22 '05 #98

Anne & Lynn Wheeler <ly**@garlic.com> writes:
i was at a talk that harlan gave at the 1970 se symposium ... that
year it was held in DC (which was easy for harlan since he was local
in fsd) ... close to the river on the virginia side (marriott? near
a bridge ... I have recollections of playing hooky one day and
walking across the bridge to the smithsonian).


this marriott has bugged my memory across some period of posts
http://www.garlic.com/~lynn/2000b.html#20 How many Megaflops and when?
http://www.garlic.com/~lynn/2000b.html#24 How many Megaflops and when?
http://www.garlic.com/~lynn/2000b.html#25 How many Megaflops and when?
http://www.garlic.com/~lynn/2000c.html#64 Does the word "mainframe" still have a meaning?
http://www.garlic.com/~lynn/2001h.html#48 Whom Do Programmers Admire Now???
http://www.garlic.com/~lynn/2002i.html#49 CDC6600 - just how powerful a machine was it?
http://www.garlic.com/~lynn/2002q.html#51 windows office xp
http://www.garlic.com/~lynn/2003g.html#2 Share in DC: was somethin' else
http://www.garlic.com/~lynn/2003k.html#40 Share lunch/dinner?
http://www.garlic.com/~lynn/2004k.html#25 Timeless Classics of Software Engineering

so doing some searching ... this is a picture of approx. what i remember
http://www.hostmarriott.com/ourcompa...?page=timeline

this lists a ieee conference at twin bridge marriott, washington dc in '69
http://www.ecs.umass.edu/temp/GRSS_History/Sect6_1.html

this lists first marriott motor hotel, twin bridges, washington dc
http://www.hrm.uh.edu/?PageID=185

and this has a reference to the site of the former Twin Bridges Marriott
having been razed several years ago
http://www.washingtonpost.com/wp-srv...ve/crystal.htm

--
Anne & Lynn Wheeler | http://www.garlic.com/~lynn/
Jul 22 '05 #99

Anne & Lynn Wheeler <ly**@garlic.com> wrote in message news:<u1***********@mail.comcast.net>...
The other essential on my book case is Lakos' "Large Scale C++
Software Design". It's applicable to any language and has enough
rationale that's grounded in real development practices and the
problems of large scale projects that I think it's relevant to the
original topic.


I am surprised you say it's applicable to any language. The advice
about cyclic dependencies certainly is but I find much of the advice
is specific to C++, such as #includes, use of fwd class decls etc. But
that's ok coz it is supposed to be for C++ developers.

IMO it is a classic but the evolution of C++ has caused it to become
somewhat dated. That is why I give it a qualified recommendation. The
main ideas are certainly important and AFAIK are not covered by any
other text. And there is some pioneering work, such as metrics and
notation for physical dependencies. However, at the time of writing
(circa 1997), many commercial C++ compilers on Unix were quite limited
in what they supported and this is what makes me say the book is
dated. The compiler limitations made certain language features
off-limits in the interests of portability. Features such as heavy
template use (e.g. meta-programming) and namespaces were of very
limited availability. So the way these features are handled would
probably not be in line with current usage. For more information on
this, see the comments made in passing on the ACCU web site
(http://www.accu.org/htdig/search.htm).

Regards,

Andrew Marlow
Jul 22 '05 #100
