Bytes IT Community

lies about OOP

I know this might not be the correct group to post this, but I thought
I'd start here.

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly. Code reuse now consists of cutting and pasting, followed by
enough modification that I wonder whether it was worth it to cut and paste
in the first place.

Thanks.

Jul 18 '05 #1
75 Replies


Try comp.object.

John Roth

"projecktzero" <pr**********@yahoo.com> wrote in message
news:11*********************@c13g2000cwb.googlegroups.com...
<snip original post>


Jul 18 '05 #2

It goes something like this (re-hashed a little):

"Every program of any complexity written in a procedural language will have a
[half-assed] implementation of object oriented design."

On Monday 13 December 2004 07:33 pm, projecktzero wrote:
<snip original post>


--
James Stroud, Ph.D.
UCLA-DOE Institute for Genomics and Proteomics
611 Charles E. Young Dr. S.
MBI 205, UCLA 951570
Los Angeles CA 90095-1570
http://www.jamesstroud.com/
Jul 18 '05 #3

projecktzero wrote:
<snip original post>


https://www.tundraware.com/Technology/Bullet/

--
----------------------------------------------------------------------------
Tim Daneliuk tu****@tundraware.com
PGP Key: http://www.tundraware.com/PGP/
Jul 18 '05 #4

In article <11*********************@c13g2000cwb.googlegroups.com>,
projecktzero <pr**********@yahoo.com> wrote:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way.


In the world of computers, the statement "X is slower than Y" is true
for almost every value of X and Y under some circumstances.

IMHO, "loves perl" doesn't mesh with either "old school" or "cares
about overhead", but that's just me.

Alan
--
Defendit numerus
Jul 18 '05 #5

projecktzero wrote:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly.


The question in your first paragraph is largely answered (albeit
indirectly) by your second. You are doing web programming. It's
highly unlikely that you currently are near your limits in terms
of either "overhead" (I'll take that to mean memory usage) or
performance, and you are almost certainly limited by bandwidth.

In other words, you're I/O bound and not CPU or memory bound, so
any fuzzy concerns about the supposed sluggishness of OOP code
are seriously misplaced.
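The I/O-bound argument above is easy to sketch with a back-of-the-envelope calculation; every figure here is an assumption chosen for illustration, not a measurement:

```python
# Assumed figures, purely illustrative: even a generous per-call
# dispatch overhead is invisible next to one network round trip.
dispatch_overhead_s = 1e-6      # assumed OO overhead per method call
calls_per_request = 1_000       # assumed method calls to build a page
network_round_trip_s = 50e-3    # assumed client round-trip latency

oo_cost = dispatch_overhead_s * calls_per_request
total = oo_cost + network_round_trip_s
print(f"OO overhead per request: {oo_cost * 1e3:.1f} ms")
print(f"Network time per request: {network_round_trip_s * 1e3:.1f} ms")
print(f"OO share of the total: {oo_cost / total:.1%}")
```

Plug in your own numbers; the point survives any plausible choice, because the network term is orders of magnitude larger than the dispatch term.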

If I'm wrong, and your company has only just been surviving in
the market, solely by virtue of the incredibly quick and
lightweight code crafted by your wizardly but dated co-worker,
then I'll happily go to work disproving his ludicrous claim.

Until then, it's hardly worth the discussion... a clear case
of premature optimization, and one that in this case denies your
company huge benefits: lowered maintenance costs, higher
code quality, greater reuse, access to more up-to-date programmers
than your co-worker ;-) and so on.

-Peter
Jul 18 '05 #6

On Mon, 13 Dec 2004 19:33:25 -0800, projecktzero wrote:
We do web programming. I suspect that OO apps would behave as well as
procedural apps, and you'd get the benefit of code reuse if you do it
properly. Code reuse now consists of cutting and pasting, followed by
enough modification that I wonder whether it was worth it to cut and paste
in the first place.


OO is a huge, ginormous, amazingly large, unspeakably awesome,
can't-believe-anyone-ever-lived-without-it win... but not necessarily OO
as it is presented in Software Engineering class due to the unusual nature
of web programming.

Tell him to check out HTML::Mason, and be sure to work with it long enough
to actually use some of its features. Once he's hooked (and it really is
an awesome framework; Amazon is supposed to use it and while I'm sure it
is highly customized I can definitely see it), explain to him that the
various components are really objects, complete with quite a lot of the
object features like inheritance, even if it doesn't look it.

If he resists this and declares Mason to be "crap", then with all due
respect you've got a blowhard who refuses to learn on your hands, and in a
perfect world he'd be stripped of code responsibility and moved somewhere
where he can't hurt anything. (He may merely not like it; I reserve the
strong statements in the previous sentence for the case where he actually
dismisses it with prejudice.) In the meantime, I've had great luck in Perl
environments programming in OO anyhow, as long as you have reasonably
independent responsibilities, and eventually the advantages do not go
unnoticed. Perl gets bashed around here (in a good-natured way), but
there are certainly worse languages; generally, when I want to do something
the Right Way, it provides a way to avoid code duplication, though it is
usually more circuitous and complex than in Python.

Ultimately, of course, the true problem isn't that you aren't coding OO,
it is the use of Copy and Paste Programming. OO is one path out, but only
one. (Perl is strong enough that one can make a case for going functional,
though I strongly prefer a functional/OO hybrid that builds on OO but
freely borrows functional paradigms at will.)

http://www.c2.com/cgi/wiki?CopyAndPasteProgramming

The problem with web programming is that you can *get away with*
"procedural" programming, because splitting the problem into web
pages provides a primitive but marginally effective partitioning into
discrete components. Thus, unless you are running *everything* through
the exact same "web page" (a CGI script, probably, in this case), you
aren't doing true procedural programming; the CGI scripts function as
primitive objects themselves, enough to let you get farther than you could
in a monolithic program and fool yourself into thinking you're safe, but
not strong enough to build a large-scale system with high-quality code
(i.e., low duplication).

But you still suffer.

ObPython (seriously, though): Which Python framework is the most Mason-like?
(I'm more interested in the component infrastructure than the exact syntax
of the files; I'm not so worried about embedding Python into the HTML. I
think it might be Zope but I haven't tried enough of them to know.)
Jul 18 '05 #7


"projecktzero" <pr**********@yahoo.com> wrote in message
news:11*********************@c13g2000cwb.googlegroups.com...
I know this might not be the correct group to post this, but I thought
I'd start here.

A co-worker considers himself "old school" in that he hasn't seen the
light of OOP.


Just how old *is* his school? I saw the light in the 70's. For those of
you too young to remember, those were menacing and sinister days, when pant
legs were too wide at the bottom, and the grotesque evil of "top down
programming" was on the land. But by '86, the Joy of OOP was widely known.
Flowers bloomed and birds chirped. Pant legs narrowed. I believe that was
the year I attended the first C++ conference in Santa Fe.

Jul 18 '05 #8

On Mon, 2004-12-13 at 22:33, projecktzero wrote:
<snip original post>


Code reuse is not copying and pasting. That truly misses what code can
be. Code isn't, or at least shouldn't be, a static entity that is written
once and forgotten. It is gradually enhanced, generalized, factored,
improved, optimized, rethought, etc.

A Properly Written (tm) application will have each abstract concept
implemented just once; in a properly written application a single change
is propagated throughout the system. In what you describe, a change
entails hunting the code you have pasted and changing it in a number of
locations. Depending on the size of your program and how badly your
application begs for code reuse, you can find yourself changing your
code in hundreds of places just to change a single data structure.

Seriously, have you ever put off changing an array to a linked list, a
list to a map, or some other similar change, simply because you didn't
want to do the coding and testing? In a proper OOP application, different
parts of your program will *ask* for some abstract task to be performed,
but only one small part will actually deal with the details of doing it.
Change that one part and nothing else knows any better.
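A minimal Python sketch of that idea (the class and method names are invented for illustration): callers ask one object to perform the abstract task, and only that object knows how the data is stored, so swapping the storage out later touches exactly one place.

```python
# Hypothetical example: callers *ask* for an abstract task (look up
# an email address); only this class knows the storage details.
class UserDirectory:
    def __init__(self):
        # Today: a list of (name, email) pairs. Tomorrow this could
        # become a dict, a database table, or an LDAP query, and no
        # caller would need to change.
        self._users = []

    def add(self, name, email):
        self._users.append((name, email))

    def email_for(self, name):
        for stored_name, email in self._users:
            if stored_name == name:
                return email
        raise KeyError(name)

directory = UserDirectory()
directory.add("alice", "alice@example.com")
print(directory.email_for("alice"))  # -> alice@example.com
```

Replacing `self._users` with a dict keyed by name changes two method bodies and nothing else in the program.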

The "overhead" of OOPLs is bogus. C++ was explicitly designed so that
each and every OO operation was as fast as, or faster than, faking it in
C. Do you use structures in C with special functions to act on them?
Then you are already using object-ish methods... only proper C++ object
methods will be no slower, and a good deal cleaner.
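The struct-plus-functions point can be illustrated even in Python (the names are invented for the example): the procedural version passes the record to a free function, the OO version binds the same data to `self`, and per call they do essentially the same work.

```python
# Procedural style: a bare record plus a function that acts on it.
def account_deposit(account, amount):
    account["balance"] += amount

# OO style: the same data and the same code, bound together.
class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

acct = {"balance": 0}
account_deposit(acct, 50)

obj = Account()
obj.deposit(50)

# Both paths end up in the same place.
assert acct["balance"] == obj.balance == 50
```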

Even in the instances where overhead is real (comparing early
Smalltalk interpreters to your friendly C compiler, for instance), it is
almost always the case that the expressive power and abstraction of an
OOPL allow for the use of greater algorithmic sophistication. So, sure,
your C linked list search might beat my Smalltalk linked list search,
but in the same amount of programmer time I'd be able to implement
something better.

I really don't care to prove my point, only to point out that if your
assertion that this individual does not understand OOP is true, then
his point likely isn't coming from knowledge and experience, but from
fear of the unknown.

Now, if you said that your co-worker was old school and into functional
programming, I'd agree to disagree and point out functional programming's
weaknesses with respect to complexity and the ability to partition
knowledge.

Forget Google. Go to Amazon and get some texts on OOP. Learn C++,
Java, or Python for that matter. Practice casting problems as classes in
Python and submit them here for praise and criticism.

Lastly, Perl is an OOPL in its own right... like Python and quite
unlike Java, it doesn't jam its OOP-ness down your throat.
Adam DePrince
Jul 18 '05 #9

On Mon, 13 Dec 2004 19:33:25 -0800, projecktzero wrote:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?


Oh, he's probably telling the truth, in that when the type of an object
is only determined at run time, a straight procedural call is going to
be quicker, because classic "procedural" code has a very tight mapping to
the underlying hardware.

Of course, the issue is not about raw speed, which in many cases does
not matter (and in the few cases where it does, you can work around it);
it's about maintainability, modularity and so on.
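Rather than argue about the dispatch cost, it is easy to measure; here is a rough micro-benchmark sketch in Python (the function and class names are invented, and absolute timings will vary by machine and interpreter):

```python
import timeit

def area_fn(w, h):
    return w * h

class Rect:
    def __init__(self, w, h):
        self.w, self.h = w, h

    def area(self):
        return self.w * self.h

r = Rect(3, 4)

# Time a plain function call against a bound-method call.
t_fn = timeit.timeit(lambda: area_fn(3, 4), number=100_000)
t_method = timeit.timeit(r.area, number=100_000)
print(f"function: {t_fn:.4f}s  method: {t_method:.4f}s")
```

Whatever the exact numbers come out to, the per-call difference is on the order of fractions of a microsecond, which is the sense in which "raw speed" rarely decides the question.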

I once worked at a place (this would be mid-1980s) where the other coders
would not accept that it was "better" to use names for subroutines such as
CalculateBillingTotal or for variables such as StaffName. The argument was
"well, gosub 13000 and S$ are the same thing"... which misses the point.

If he's that obsessed with speed, what is he doing coding in Perl (hell, I
like Perl), which is compiled to a bytecode which is then interpreted? Why
not code in C or even assembler, then it'll be really quick? Answer: you
like the facilities of the language. So it is a trade-off.

Jul 18 '05 #10

"Jive" <so*****@microsoft.com> wrote in message
news:Re********************@news.easynews.com...
<snip> But by '86, the Joy of OOP was widely known.


"Widely known"? Errr? In 1986, "object-oriented" programming was barely
marketing-speak. Computing hardware in the mid-80's just wasn't up to the
task of dealing with OO memory and "messaging" overhead. Apple Macs were
still being programmed in C and Forth. Borland didn't ship Turbo Pascal with
object-oriented programming until 1989, and Turbo C++ shipped in 1991.
Smalltalk had been around for 10 years by 1986, but it was still a
curiosity, hardly "widely known." It wasn't until the publication of David
Taylor's "Object Technology: A Manager's Guide" in 1990 that OOP began to be
legitimized to many management decision makers, that it was more than just
"fairy dust" (as Bill Gates had characterized it in an attempt to discredit
Borland's forays into the field).

I would pick the publication of "Design Patterns" in 1995 by the Gang of
Four (Gamma, Helm, Johnson, and Vlissides), to be the herald of when "the
Joy of OOP" would be "widely known." DP formalized a taxonomy for many of
the heuristics that had evolved only intuitively up until then. Its
emergence reflects a general maturation of concept and practice, sufficient
to say that the Joy of OOP could be said to be "widely known."

-- Paul
Jul 18 '05 #11

projecktzero wrote:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?


Sounds like your co-worker has a major case of premature optimization. I don't
know about speed issues with OO, but for large projects, using OOP makes data
encapsulation so much easier. Writing correct code with minimum effort should be
the first goal, speed issues (at that level) should be brought into the game
later on.

You should ask your co-worker if he also puts all his data in global variables :)

*wink*

--
Timo Virkkala
Jul 18 '05 #12

Paul McGuire wrote:
"Jive" <so*****@microsoft.com> wrote in message
news:Re********************@news.easynews.com...

<snip>
But by '86, the Joy of OOP was widely known.
"Widely known"? Errr? In 1986, "object-oriented" programming was barely
marketing-speak. Computing hardware in the mid-80's just wasn't up to the
task of dealing with OO memory and "messaging" overhead. Apple Macs were
still coding in C and Forth. Borland didn't ship Turbo-Pascal with
Object-Oriented programming until 1989, and Turbo-C++ shipped in 1991.
Smalltalk had been around for 10 years by 1986, but it was still a
curiosity, hardly "widely known." It wasn't until the publication of David
Taylor's "Object Technology: A Manager's Guide" in 1990 that OOP began to be
legitimized to many management decision makers, that it was more than just
"fairy dust" (as Bill Gates had characterized it in an attempt to discredit
Borland's forays into the field).


In my view THAT Byte article on Smalltalk in the early '80s was the
beginning.

Then came Brad Cox's book.

Then there was Glockenspiel's C++ for the PC in about '87 or '88. And, of
course, cfront on Unix from about, what, '85?

Across the late '80s there was, of course, Eiffel, which seemed a
remarkable piece of work for the time. And it was backed by a terrific book
by Meyer.

Then it all seemed to take off once C++ version 2.0 was minted.

I would pick the publication of "Design Patterns" in 1995 by the Gang of
Four (Gamma, Helm, Johnson, and Vlissides), to be the herald of when "the
Joy of OOP" would be "widely known." DP formalized a taxonomy for many of
the heuristics that had evolved only intuitively up until then. Its
emergence reflects a general maturation of concept and practice, sufficient
to say that the Joy of OOP could be said to be "widely known."


In actual fact, virtually all the design patterns came from the
InterViews C++ GUI toolkit written in the early '90s. What an utterly
brilliant piece of work that was.

--
Mike


Jul 18 '05 #13


"Mike Thompson" <none.by.e-mail> wrote in message
news:41**********************@news.optusnet.com.au...
Then came Brad Cox's book.
I read it.

Then there was Glockenspiel's C++ for PC in about '87 or '88.
I didn't PC in those days. I Unixed.
And, of course, cfront on unix from about, what, '85?
That's about when I got it. I used to chat with B.S. on the phone,
discussing and proposing features. Now he's rich and famous. Me? Would
you believe rich? How about not destitute?

Across the late '80s there was, of course, Eiffel which seemed a
remarkable piece of work for the time. And was backed by a terrific book
by Meyer.


I puzzled long over whether to adopt C++ or Eiffel at the company I was with
at the time. I went with C++, despite the fact that cfront was slow as
death and buggy. C++ made it bigtime and the company went public. Lucky
guesses? Hah!

Ah, nostalgia isn't what it used to be.

Jive
Jul 18 '05 #14

On Tue, 2004-12-14 at 16:02, Mike Thompson wrote:
<snip>
In actual fact, virtually all the design patterns came from the
Interviews C++ GUI toolkit written in the early '90s. What an utterly
brilliant piece of work that was.


As somebody who has just been bowled over by how well Qt works, and how
it seems to make OOP in C++ work "right" (introspection, properties,
etc), I'd be interested in knowing what the similarities or lack thereof
between Qt and Interviews are.

I've been pleasantly astonished again and again by how I can write
something in C++ with Qt like I would write it in Python, and have it
just work. Alas, this doesn't extend as far as:

instance = Constructor(*args)

though if anybody knows how to do this in C++ I would be overjoyed to
hear from them. Qt _does_ provide a pleasant (if somewhat limited)
analogue of the Python getattr() and setattr() calls.

--
Craig Ringer

Jul 18 '05 #15

Hello projecktzero,
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP. (It might be because he's in love with Perl... but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around Google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?

Try http://www.dreamsongs.com/Essays.html (search for "Objects Have Failed")
for an interesting discussion.

Bye.
--
------------------------------------------------------------------------
Miki Tebeka <mi*********@zoran.com>
http://tebeka.bizhat.com
The only difference between children and adults is the price of the toys
Jul 18 '05 #16

Hello,

Instead of copy and paste, I use functions for code reuse. I haven't seen
the light of OOP yet. I use Python but never did anything with OOP. I
just can't see what can be done with OOP that can't be done with
standard procedural programming.

Jul 18 '05 #17

sb****@gmail.com wrote:
I just can't see what can be done with OOP that can't be done with
standard procedural programming.


Well, there's absolutely nothing you can do with OOP that
can't be done with "standard procedural programming" (SPP).

But that's hardly the point. After all, anything you can
do with OOP or SPP can be done with assembly language as
well.

OOP is way of approaching the design and construction of
the software. As a starting point, consider the advantages
of procedural programming over using raw assembly language.

Now consider that there might be similar advantages in
using OOP instead of procedural programming.

And, lastly, to bring this on topic for this forum, consider
that there might be advantages in using *Python*, specifically,
for doing this OOP programming, compared to many other
languages. Not that you can do things in Python you can't
do in other languages (such as, say, assembly). Just that
you can do them much more easily, and the resulting code
will be much more readable to you and others.

(To be fair, for certain tasks using OOP provides basically
no advantages, and in fact might represent a more awkward
model for the code than a simple procedural program would.
If that's the sort of program you are faced with writing,
by all means stick with SPP and leave OOP to those who
write complex applications that really benefit from it.)

-Peter
Jul 18 '05 #18

Craig Ringer wrote:
On Tue, 2004-12-14 at 16:02, Mike Thompson wrote:

<snip>


In actual fact, virtually all the design patterns came from the
Interviews C++ GUI toolkit written in the early '90s. What an utterly
brilliant piece of work that was.

As somebody who has just been bowled over by how well Qt works, and how
it seems to make OOP in C++ work "right" (introspection, properties,
etc), I'd be interested in knowing what the similarities or lack thereof
between Qt and Interviews are.


Qt provides widgets that a client app. can compose into a GUI.
InterViews provides 'glyphs'[*] that form a scene graph in a display
server. Although InterViews usually was compiled into a client-side
library, it provided all the functionality required by a display server
such as redisplay and pick traversals. Indeed, the X Consortium
supported InterViews (and its successor Fresco) for a while as the next
generation for its 'X Windowing System', until it was dropped (for
mostly political reasons, as usual) about '95.
(Fresco had been nominated, together with OpenDoc, as candidates for an
'Compound Document Architecture' RFP on the Object Management Group.
OpenDoc won.)
[*] The term 'glyph' reflects the fact that the scene graph nodes in
InterViews are extremely fine-grained, i.e. glyphs can represent
individual characters or elements of vector graphics such as paths.
That's unlike any conventional 'toolkit' such as Qt, where a 'widget'
is quite coarse-grained, and the display of such 'widgets' is typically
not that of a structured graphic, but procedural.

Regards,
Stefan

Jul 18 '05 #19

Paul McGuire wrote:
"Jive" <so*****@microsoft.com> wrote in message
news:Re********************@news.easynews.com...

<snip>
But by '86, the Joy of OOP was widely known.
"Widely known"? Errr? In 1986, "object-oriented" programming was barely
marketing-speak. <snip> It wasn't until the publication of David
Taylor's "Object Technology: A Manager's Guide" in 1990 that OOP began to be
legitimized to many management decision makers, that it was more than just
"fairy dust" (as Bill Gates had characterized it in an attempt to discredit
Borland's forays into the field).

Well, that's not true either, and the fact that Bill Gates was
denigrating it implies that he at least knew about it, even if he chose
not to adopt it (then: of course nowadays Microsoft call almost all
their technologies "object oriented"; sometimes this description is as
accurate as when Gates speaks about "our open Windows environment").
I would pick the publication of "Design Patterns" in 1995 by the Gang of
Four (Gamma, Helm, Johnson, and Vlissides), to be the herald of when "the
Joy of OOP" would be "widely known." <snip>

We could all make our own choices, but anyone who's been programming
*seriously* since the 60s will likely remember Simula as the birth of
many of the ideas later picked up by Alan Kay and promoted by the Xerox
PARC Smalltalk group.

I visited that group in 1981 (after Kay left, unfortunately, and then
being headed by Adele Goldberg, who is now coincidentally promoting the
delights of Python at conferences like OSCON), and object-oriented
programming was certainly something that was being taken pretty
seriously in the academic world as a potential solution to some serious
PLIT engineering problems.

The fact that it took the technology a relatively long time to appear
"in the wild", so to speak, is simply the natural maturation of any new
technology. Given that UNIX was developed in the early 1970s I'd say it
took UNIX 20 years to start becoming mainstream. But a lot of people
knew about it before it *became* mainstream, especially those who had to
place their technology bets early. The same is true of object-oriented
concepts.

I guess this is just to say that I'd dispute your contention that
SmallTalk was a curiosity - unless you define anything of interest
mostly to the academic world as a curiosity, in which case there's no
way to overcome your objection. It was the first major implementation of
an entire system based exclusively on OO programming concepts and, while
far from ideal, was a seminal precursor to today's object-oriented systems.

regards
Steve

--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119
Jul 18 '05 #20

A paper finding that OOP can lead to more buggy software is at
http://www.leshatton.org/IEEE_Soft_98a.html

Les Hatton "Does OO sync with the way we think?", IEEE Software, 15(3),
p.46-54
"This paper argues from real data that OO based systems written in C++
appear to increase the cost of fixing defects significantly when
compared with systems written in either C or Pascal. It goes on to
suggest that at least some aspects of OO, for example inheritance, do
not fit well with the way we make mistakes."

His comments under "invited feedback" are amusing and confirm my
impression that OOP is partly (but perhaps not entirely) hype:

"I should note that this paper, because it criticised OO, had an unusually
turbulent review period. 2 reviewers said they would cut their throats
if it was published and 3 said the opposite. The paper was only
published on condition that the OO community could publish a rebuttal. I
found this very amusing, as my paper contains significant data. The
rebuttal had none. This sort of thing is normal in software engineering,
which mostly operates in a measurement-free zone."

What papers have scientific evidence for OOP?

Paul Graham's skeptical comments on OOP are at
http://www.paulgraham.com/noop.html .

If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?

Jul 18 '05 #21

P: n/a
"Steve Holden" <st***@holdenweb.com> wrote in message
news:29Evd.32591$Jk5.26287@lakeread01...
Paul McGuire wrote:
"Jive" <so*****@microsoft.com> wrote in message
news:Re********************@news.easynews.com...

<snip>
But by '86, the Joy of OOP was widely known.
"Widely known"? Errr? In 1986, "object-oriented" programming was barely
marketing-speak. Computing hardware in the mid-80's just wasn't up to the
task of dealing with OO memory and "messaging" overhead. Apple Macs were
still coding in C and Forth. Borland didn't ship Turbo-Pascal with
Object-Oriented programming until 1989, and Turbo-C++ shipped in 1991.
Smalltalk had been around for 10 years by 1986, but it was still a
curiosity, hardly "widely known." It wasn't until the publication of David
Taylor's "Object Technology: A Manager's Guide" in 1990 that OOP began to
be legitimized to many management decision makers, that it was more than
just "fairy dust" (as Bill Gates had characterized it in an attempt to
discredit Borland's forays into the field).

Well, that's not true either, and the fact that Bill Gates was
denigrating it implies that he at least knew about it, even if he chose
not to adopt it (then: of course nowadays Microsoft call almost all
their technologies "object oriented"; sometimes this description is as
accurate as when Gates speaks about "our open Windows environment").
I would pick the publication of "Design Patterns" in 1995 by the Gang of
Four (Gamma, Helm, Johnson, and Vlissides), to be the herald of when "the
Joy of OOP" would be "widely known." DP formalized a taxonomy for many of
the heuristics that had evolved only intuitively up until then. Its
emergence reflects a general maturation of concept and practice,
sufficient to say that the Joy of OOP could be said to be "widely known."

We could all make our own choices, but anyone who's been programming
*seriously* since the 60s will likely remember Simula as the birth of
many oft he ideas later picked up by Alan Kay and promoted by the Xerox
PARC SmallTalk group.

I visited that group in 1981 (after Kay left, unfortunately, and then
being headed by Adele Goldberg, who is now coincidentally promoting the
delights of Python at conferences like OSCON), and object-oriented
programming was certainly something that was being taken pretty
seriously in the academic world as a potential solution to some serious
software engineering problems.

The fact that it took the technology a relatively long time to appear
"in the wild", so to speak, is simply the natural maturation of any new
technology. Given that UNIX was developed in the early 1970s I'd say it
took UNIX 20 years to start becoming mainstream. But a lot of people
knew about it before it *became* mainstream, especially those who had to
place their technology bets early. The same is true of object-oriented
concepts.

I guess this is just to say that I'd dispute your contention that
SmallTalk was a curiosity - unless you define anything of interest
mostly to the academic world as a curiosity, in which case there's no
way to overcome your objection. It was the first major implementation of
an entire system based exclusively on OO programming concepts and, while
far from ideal, was a seminal precursor to today's object-oriented

systems.
regards
Steve

--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119


Good points all. And yes, I recall the BYTE article on Smalltalk. I guess
I was just reacting mostly to the OP's statement that "by '86 the Joy of OOP
was widely known". He didn't say "OOP all began when..." or "OOP was widely
known," which I think still would have been a stretch - he implied that by
'86 OOP was widely recognized as Goodness, to which I disagree. This was
the year of the first OOPSLA conference, but as PyCon people know, just
having a conference doesn't guarantee that a technology is widely and
joyfully accepted. Just as my commercial-centric view may understate
academic interest in some topics, an academic-centric view may overestimate
the impact of topics that are ripe for research, or technically "cool," but
little understood or adopted outside of a university setting.

I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra published
"Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.

(And I apologize for characterizing Smalltalk as a "curiosity." I admit my
bias is for software that is widely commercially deployed, and even the most
ardent Smalltalkers will have difficulty citing more than a handful of
applications, compared to C,C++,VB,COBOL,Delphi, etc. I personally have
seen Smalltalk-based factory control and automation systems, but they are
rapidly self-marginalizing, and new customers are extremely reluctant to
enfold Smalltalk into an already patchwork mix of technologies, as is
typically found in factory settings.)

-- Paul
Jul 18 '05 #22

P: n/a
Paul McGuire wrote:
"Steve Holden" <st***@holdenweb.com> wrote in message
news:29Evd.32591$Jk5.26287@lakeread01...
[some stuff]

Good points all. And yes, I recall the BYTE article on Smalltalk. I guess
I was just reacting mostly to the OP's statement that "by '86 the Joy of OOP
was widely known". He didn't say "OOP all began when..." or "OOP was widely
known," which I think still would have been a stretch - he implied that by
'86 OOP was widely recognized as Goodness, to which I disagree. This was
the year of the first OOPSLA conference, but as PyCon people know, just
having a conference doesn't guarantee that a technology is widely and
joyfully accepted. Just as my commercial-centric view may understate
academic interest in some topics, an academic-centric view may overestimate
the impact of topics that are ripe for research, or technically "cool," but
little understood or adopted outside of a university setting.

I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra published
"Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.

(And I apologize for characterizing Smalltalk as a "curiosity." I admit my
bias is for software that is widely commercially deployed, and even the most
ardent Smalltalkers will have difficulty citing more than a handful of
applications, compared to C,C++,VB,COBOL,Delphi, etc. I personally have
seen Smalltalk-based factory control and automation systems, but they are
rapidly self-marginalizing, and new customers are extremely reluctant to
enfold Smalltalk into an already patchwork mix of technologies, as is
typically found in factory settings.)


Nothing to disagree with here.

regards
Steve
--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119
Jul 18 '05 #23


P: n/a
projecktzero wrote:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP.(It might be because he's in love with Perl...but that's
another story.) He thinks that OOP has more overhead and is slower than
programs written the procedural way. I poked around google, but I don't
know the magic words to put in to prove or disprove his assertion. Can
anyone point me toward some resources?


There's no magic in OOP. It's just more natural to the way humans think, so
you should point your friend to the works of Aristotle or Sancti Thomae
Aquinatis, especially their writings on "natural sciences" and the theory of
species.

--
Jarek Zgoda
http://jpa.berlios.de/ | http://www.zgodowie.org/
Jul 18 '05 #25

P: n/a
projecktzero wrote:
A co-worker considers himself "old school" in that he hasn't seen the
light of OOP ... He thinks that OOP has more overhead and is slower
than programs written the procedural way.


He may be right, but consider the alternatives.

Think of an integer. An integer is an object!

You can assign a new integer-value to the object.
You can read the integer-value of the object.
(The integer can be part of complex expressions.)

Usually you are unaware of (or don't care) _how_ the object is implemented.
Whether the bits are red, green, turned upside-down or inverted - you
don't really care, as long as it can hold the values that you want it to
hold and be used in the relevant contexts (addition, multiplication, ...).

Some languages give you the choice of many integer implementations, some
languages give you only a few choices and some languages give you only one
choice.

Surely we can agree that the presence of an integer-object is extremely
useful! If you had to do all the integer-stuff in machine code _every_ time,
you would soon be very, very tired of working with integers. There is no
doubt that objects are (or can be) extremely useful, time-saving and very
efficient. Chances are that your own machine-code integer-implementation is
not nearly as good as the one made by a team of top-tuned programmers (no
offense) programming the integer-implementation "object".

Whether the language should give you the choice of one, some, many or
extremely many integer implementations depends entirely on your needs
("what a pervert - he needs an integer!"). Lowering the number of choices of
implementations raises the chances of having to choose a "not very good"
implementation. Letting the language automatically choose the right one frees
your mind for other processes, but at the risk of some kind of run-time
overhead.
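The integers-are-objects point above can be seen directly in Python, the
language this group is about: a plain integer carries methods and type
information, yet you never deal with its bit-level layout.

```python
# Integers in Python are full objects: they have a class, methods, and
# hidden implementation details you never need to touch.
n = 5

print(isinstance(n, int))        # the value is an instance of a class
print(n + 37)                    # arithmetic works regardless of representation
print(n.bit_length())            # 3: the object can report on itself
print((2 ** 100).bit_length())   # 101: the implementation quietly switches
                                 # to arbitrary precision behind your back
```

Whether the bits inside are "red, green or upside-down" is invisible here;
only the behaviour in context (addition, comparison, ...) matters.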

-------
Tomas

Jul 18 '05 #26

P: n/a

I did not really 'get' OOP until after learning Python. The relatively
simple but powerful user class model made more sense to me than C++. So
introducing someone to Python, where OOP is a choice, not a mandate, is how
*I* would introduce a procedural programmer to the subject. YMMV.

Terry J. Reedy

Jul 18 '05 #27

P: n/a
Terry Reedy <tj*****@udel.edu> wrote:
I did not really 'get' OOP until after learning Python. The
relatively simple but powerful user class model made more sense to
me than C++. So introducing someone to Python, where OOP is a
choice, not a mandate, is how *I* would introduce a procedural
programmer to the subject. YMMV.


OOP is a choice in C++ too. You can write procedural C++ code; no
need to use classes at all if you don't want to. Something like Java
is a different story. Java *forces* you to use classes. Nothing
exists in Java that's not part of some class.

I think the real reason Python is a better teaching language for
teaching OO concepts is because it just gives you the real core of OO:
inheritance, encapsulation, and association of functions with the data
they act on.

C++ has so much stuff laid on top of that (type bondage, access
control, function polymorphism, templates) that it's hard to see the
forest for the trees. You get C++ jocks who are convinced that that
stuff is part and parcel of OO, and if it doesn't have (for example),
private data, it can't be OO.
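Roy's three core ingredients fit in a few lines of Python; the class and
method names below are invented purely for illustration:

```python
class Account:
    """Encapsulation: the balance lives together with the functions
    that act on it (the leading underscore marks it internal by convention)."""
    def __init__(self, balance=0):
        self._balance = balance

    def deposit(self, amount):
        self._balance += amount
        return self._balance


class SavingsAccount(Account):
    """Inheritance: reuse Account wholesale, adding only what differs."""
    def add_interest(self, rate):
        return self.deposit(self._balance * rate)


acct = SavingsAccount(100)
acct.deposit(50)                 # a function associated with its data
print(acct.add_interest(0.1))    # 165.0
```

Note there is no access control, no templates, no type declarations - and
it is still recognizably OO, which is exactly the point being made.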
Jul 18 '05 #28

P: n/a
Paul McGuire wrote:
[snip]
I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra published
"Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.


IMMEDIATE NOTICE TO ALL PYTHON SECRET UNDERGROUND MEMBERS.

Classified. Any disclosure to non-PSU members prohibited. Offenders will
be apprehended and removed from the time stream, permanently.

Words in human languages typically consist of a combination of vowels
and consonants, at least up until the start of the posthumanist
revolution in 3714, when the Morning Light Confederation's ships reached
the ablethik-seganichek world of Kaupang again (on Hellenberg consensus
time streams with catalog marker AB-7). Alphabetic scripts are a typical
way to represent them. Even in the posthuman phase on Kaupang they were
widely appreciated as a quaint representation.

The language English, an indo-european tongue of the west-germanic
persuasion (expressiveness rating 7, comprehensiveness rating 12, fits
in the moderate Y group of the Lespan pan-species language
classification system), is widely in use throughout a surprisingly long
period on many time streams. This language does not have overly long
consonant combinations.

The language Dutch, though closely related to the language English has a
slightly different sound to glyph mapping system. Dutch is, of course,
the true language of the Python Secret Underground and the official
native language of Python users. In the language Dutch, a certain vowel
sound is expressed as a combination of the glyphs 'i' and 'j'. The glyph
'j' however is exclusively used for consonants in the English language,
unlike in Dutch, where 'j' serves a dual role.

Human brains used to the English language cannot cope with glyph
representations that express consonants in too long a sequence, without
any space left for vowels. A combination like 'jkstr' in the English
language is inevitably considered to be a spelling error, and corrective
procedures automatically attempt to correct the spelling of such a word
to a more acceptable combination.

This happens frequently to the name 'Dijkstra', a name that originated
in the Dutch natural language. The English eye cannot accept such a
ridiculous combination of consonants (j k s t *and* r?), and desperately
tries to resolve the situation. As a result, the glyphs 'i' and 'j'
are frequently reversed.

This is extremely unfortunate, as Djikstra is well known to be a primary
moniker for the leader of the Insulationist faction within the Gheban
coalition. The Insulationist faction is, of course, a prominent member
the alliance that produced the Alien Whitespace Eating Nanovirus.
Djikstra is therefore an enemy of the Python programming language. All
that we stand for. All our hopes. All our dreams will come to naught if
Djikstra gets his way.

The moniker Djikstra is to be avoided in public utterances. PSU members
can give themselves away and draw unwanted attention from the
Insulationist overlord at this critical junction. What's worse,
innocents might be caught up in this cosmic web of intrigue. While most
innocents can of course be safely ignored, any innocent of temporal
tension rating 17 and above (revised scale) should not be exposed to
undue danger, as they may be essential for our time stream manipulations.

It is therefore important to avoid the utterance of Djikstra's name at
all costs!

ADDENDUM FOR PSU MEMBERS OF CLASSES NE-100 AND HIGHER

The relation between Djikstra and Dijkstra's name is of course not a
coincidence. As was already evidenced in the famous "Considered Harmful"
article, the great philosopher Dijkstra was on to a monumental cosmic
secret: that reality is bound by a term rewriti
Jul 18 '05 #29

P: n/a
be*******@aol.com wrote:
A paper finding that OOP can lead to more buggy software is at
http://www.leshatton.org/IEEE_Soft_98a.html
[snip description of paper that compares C++ versus Pascal or C]
What papers have scientific evidence for OOP?
That's of course a good question. I'm sure also that comp.object has
argued about this a thousand times. I'll just note that one paper is
just a single data point with specific circumstances. The OO languages
under investigation might have caused increased or lower failure rates
for other reasons than their (lack of) object-orientedness, for
instance. It is of course possible to come up with a lot of other
explanations for a single data point besides a conclusion that OOP can
lead to more buggy software. It is, for instance, certainly not surprising to
me that C++ can lead to more buggy software than some other languages. :)

[snip]
If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?


Because C++ is not an ideal object oriented language? Because a Linux
kernel has very stringent predictability requirements for what kind of
machine code is generated that C meets and is much harder to do with
C++? There are other reasons to choose C, such as portability, obiquity
and performance.

Some of the same reasons probably apply to Perl and Python, though to a
lesser degree. I do not know a lot about Perl's implementation. I do
know that Guido van Rossum has in fact considered rewriting Python in
C++ in the past. And right now, there are various projects that are
using object oriented languages to reimplement Python, including Python
itself.

Finally, it is certainly possible to program in object oriented style in
C. It is more cumbersome than in a language that supports it natively,
but it is certainly possible. Such OO in C patterns occur throughout the
Linux kernel, which needs a pluggability architecture for its various
types of drivers. It can also be seen in many aspects of Python's
implementation. Another example of a C-based system that uses object
oriented technologies is the GTK+ widget set.

Anyway, this question is using a few data points to make an overly
generic argument, and the data points themselves do not really support
the argument so very well either.

Regards,

Martijn
Jul 18 '05 #30

P: n/a
"Roy Smith" <ro*@panix.com> wrote in message
news:cp**********@panix2.panix.com...
I think the real reason Python is a better teaching language for
teaching OO concepts is because it just gives you the real core of OO:
inheritance, encapsulation, and association of functions with the data
they act on.

C++ has so much stuff laid on top of that (type bondage, access
control, function polymorphism, templates) that it's hard to see the
forest for the trees. You get C++ jocks who are convinced that that
stuff is part and parcel of OO, and if it doesn't have (for example),
private data, it can't be OO.


+1, QOTW!! :) (esp. "type bondage"!)
Jul 18 '05 #31

P: n/a
"Martijn Faassen" <fa*****@infrae.com> wrote in message
news:Pc******************@amsnews05.chello.com...
Paul McGuire wrote:
[snip]
I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra published "Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.


IMMEDIATE NOTICE TO ALL PYTHON SECRET UNDERGROUND MEMBERS.

Classified. Any disclosure to non-PSU members prohibited. Offenders will
be apprehended and removed from the time stream, permanently.


<snip - it's "Dijkstra" not "Djikstra", you dolt! :)>

Yikes! (or better, "Jikes!" or even "Yijkes!"?) - my bad.
And he was on faculty at UT right here in Austin, too.

Red-faced-ly yours -
-- Paul
Jul 18 '05 #32

P: n/a
be*******@aol.com writes:
If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?


Because C++ combines the worst features of C and OO programming. It
also makes some defaults go the wrong way, and forces decisions onto
the programmer that are best left up to the compiler, as the
programmer is liable to get them wrong.

C, on the other hand, is a very nice portable assembler language. I've
seen cases where a good optimizing compiler wrote faster code than a
bright human writing assembler (though it was less readable). C is
enough like assembler that some HLLs generate C instead of assembler,
thus making them portable. I've seen those generate C code as clean as
a human being might generate, given the right options.

<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
Jul 18 '05 #33

P: n/a

"Paul McGuire" <pt***@austin.rr._bogus_.com> wrote in message
news:Xg******************@fe1.texas.rr.com...
I was just reacting mostly to the OP's statement that "by '86 the Joy of OOP was widely known".
I (Jive Dadson) said that. I guess I figured that if I knew about it, it
was widely known. But in retrospect, I had an information edge. I was in
Silicon Valley, working on the Next Big Thing, and I was wired into USENET.
My earliest dejagoogle hit is from '86. (It's not under "Jive Dadson", a
more recent nom du net.)

He didn't say "OOP all began when..." or "OOP was widely known," which I think still would have been a stretch - he implied that by
'86 OOP was widely recognized as Goodness, to which I disagree.


Well, it was widely known by everyone who read the mottos I stuck up on my
cubicle walls. :-)

Jive
Jul 18 '05 #34

P: n/a
projecktzero wrote:
He thinks that OOP has more overhead....


I think he's just confusing programming with marriage.
--
CARL BANKS

Jul 18 '05 #35

P: n/a
be*******@aol.com wrote:
A paper finding that OOP can lead to more buggy software is at
http://www.leshatton.org/IEEE_Soft_98a.html
Sure, OOP *can* lead to more buggy software, that doesn't mean it always
does.

Les Hatton "Does OO sync with the way we think?", IEEE Software, 15(3),
p.46-54
"This paper argues from real data that OO based systems written in C++
appear to increase the cost of fixing defects significantly when
compared with systems written in either C or Pascal. It goes on to
suggest that at least some aspects of OO, for example inheritance, do
not fit well with the way we make mistakes."
So, he has data that shows that C++ *appears* to increase the cost of
fixing defects, then *suggests* that its because C++ is an OO language?
Sounds like he is ignoring his own data to me...

Mr. Hatton suffers from the same problem that many OO critics suffer. He
thinks that the language choice decides whether the program written is
an OO program. I've seen plenty of very non-OO systems written in OO
languages, I've seen expert OO systems written in non-OO languages. OOP
isn't a language choice, it is a style of problem solving.

I'm happy to accept that it could take longer to fix bugs in programs
written in C++ when compared to either C or Pascal, the language itself
is quite a bit more complicated than either of the latter.

You know, it tends to take longer to repair a 2004 Mustang than it does
a 1964 Mustang, does that mean the newer car is not as good?

If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?


All three of the systems in question were begun before C++ was
standardized. Python was also implemented in Java, does that mean OO
other than C++ is good? Of course not, the fact that the three projects
in question were implemented in C is not an indictment against OO in any
way.
Jul 18 '05 #36

P: n/a
Everyone keep moving. There is nothing to see here. Move along.

Jul 18 '05 #37

P: n/a
"sb****@gmail.com" <sb****@gmail.com> writes:
Instead of copy and paste, I use functions for code reuse. I didn't see
the light of OOP, yet. I use Python but never did anything with OOP. I
just can't see what can be done with OOP that can't be done with
standard procedural programming.


There are cases where functions just don't do the job.

I at one time needed to use the little-used (and probably little-known)
"account" feature of an ftp server to automate a regular file
transfer. Both Perl and Python come with ftp modules. Perl's was (is?)
procedural, Python's is OO. Neither supported the account feature.

Now, if I used perl to do this, I'd have to either modify the module
in place, meaning I'd have to remember to put the mods back every time
we updated perl if I couldn't get them to buy the patch, or I could
make a local copy of the module, meaning it wouldn't get any bug fixes
that might come with new versions of perl.

With the Python version, I created a subclass of the FTP connection
module, rewrote just the login method, and installed that locally. Now
I don't have to worry about installing new versions of Python, as my
code is outside the distribution. But I still get the benefit of any
bug fixes that show up outside the login method. I also submitted the
new login method, and it's now part of the standard module.

This kind of code reuse just isn't possible with procedural code.
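Mike's override can be sketched in a few lines. The class name is invented,
and note that today's `ftplib.FTP.login` already accepts an `acct` argument
(his patch was accepted), so this stands purely as an illustration of the
subclass-and-override pattern he describes:

```python
import ftplib

class AccountFTP(ftplib.FTP):
    """Reuse the whole stdlib FTP client, replacing only the login step.
    Everything else - transfers, and any bug fixes in future Python
    releases - is inherited untouched."""
    def login(self, user='', passwd='', acct=''):
        resp = super().login(user, passwd)
        if acct:
            # ACCT is the FTP protocol command for the rarely-used
            # account feature
            resp = self.voidcmd('ACCT ' + acct)
        return resp
```

Installing this one small class locally leaves the Python distribution
untouched, which is exactly the kind of reuse described above.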

<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
Jul 18 '05 #38

P: n/a
Mike Meyer wrote:
If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?


Because C++ combines the worst features of C and OO programming. It
also makes some defaults go the wrong way, and forces decisions onto
the programmer that are best left up to the compiler, as the
programmer is liable to get them wrong.


that's a nice rant about C++, but it's not the right answer to the question. the
Python core developers are perfectly capable of writing working C++ code,
and both the core and most extensions would benefit from C++ features (just
look at Boost and other C++ layers).

but C++ wasn't a serious contender back when CPython development was
started, and nobody's going to convert the existing codebase...

</F>

Jul 18 '05 #39

P: n/a
Daniel T. wrote:
be*******@aol.com wrote:

A paper finding that OOP can lead to more buggy software is at
http://www.leshatton.org/IEEE_Soft_98a.html

Sure, OOP *can* lead to more buggy software, that doesn't mean it always
does.


I think that the cost (= time) to develop and maintain software depends not
on whether it is based on OOP but on two factors:

* Number of NEW Code lines to solve the given problem
* Complexity of this new code

The answer to the question whether OOP is better is: it depends

If the given problem is solved with less code and complexity in OOP, then
it is the better approach; if not, the reverse is true.

That's why I like Python: it does not force you to use either OOP or
procedural programming.

But isn't the main argument for coding in Python (or other high-level
languages) ease of use and compact code?

Python code should therefore be less buggy and cheaper to develop and
maintain. Are there any papers on that?

--
Greg
Jul 18 '05 #40

P: n/a
Paul McGuire wrote:
"Martijn Faassen" <fa*****@infrae.com> wrote in message
news:Pc******************@amsnews05.chello.com...
Paul McGuire wrote:
[snip]
I would characterize the 80's as the transitional decade from structured
programming (which really started to hit its stride when Djikstra
published
"Use of GOTO Considered Harmful") to OOP, and that OOP wasn't really
"joyful" until the early-to-mid 90's.


IMMEDIATE NOTICE TO ALL PYTHON SECRET UNDERGROUND MEMBERS.

Classified. Any disclosure to non-PSU members prohibited. Offenders will
be apprehended and removed from the time stream, permanently.


<snip - it's "Dijkstra" not "Djikstra", you dolt! :)>

Yikes! (or better, "Jikes!" or even "Yijkes!"?) - my bad.
And he was on faculty at UT right here in Austin, too.


It's a very common mistake I've seen so often that for a while I
wondered whether his name really *was* Djikstra, but I knew he was Dutch
and that it couldn't be so. That the PSU picked you for its disclosure
is just a random coincidence, I'm sure.. :)

Regards,

Martijn

Jul 18 '05 #41

P: n/a
Daniel T. wrote:
Mr. Hatton suffers from the same problem that many OO critics suffer.
He thinks that the language choice decides whether the program
written is an OO program. I've seen plenty of very non-OO systems
written in OO languages, I've seen expert OO systems written in
non-OO languages. OOP isn't a language choice, it is a style of
problem solving.


And he suffers from the common misunderstanding that OO is better because
"it matches the way we think about the world". It wouldn't surprise me if
design created with that reasoning in mind is more costly than a good non-OO
design.

If you use the features of an OOPL, most importantly polymorphism, to manage
the dependencies in your code, and you do have some experience doing so, I'm
quite certain that it reduces "corrective maintenance cost".

Take care, Ilja
Jul 18 '05 #42

P: n/a
Martijn Faassen wrote:
Paul McGuire wrote:
"Martijn Faassen" <fa*****@infrae.com> wrote in message
<snip - it's "Dijkstra" not "Djikstra", you dolt! :)>

Yikes! (or better, "Jikes!" or even "Yijkes!"?) - my bad.
And he was on faculty at UT right here in Austin, too.


It's a very common mistake I've seen so often that for a while I
wondered whether his name really *was* Djikstra, but I knew he was Dutch
and that it couldn't be so. That the PSU picked you for its disclosure
is just a random coincidence, I'm sure.. :)


Well, in any case, thanks for setting the record straight, Martjin.
Jul 18 '05 #43

P: n/a
Responding to Beliavsky...
Les Hatton "Does OO sync with the way we think?", IEEE Software, 15(3),
p.46-54
"This paper argues from real data that OO based systems written in C++
appear to increase the cost of fixing defects significantly when
compared with systems written in either C or Pascal. It goes on to
suggest that at least some aspects of OO, for example inheritance, do
not fit well with the way we make mistakes."
Try and find an experienced OO developer who would advocate that large,
complex generalizations are a good practice. You can write lousy
programs in any paradigm. The likelihood increases when you use the
most technically deficient of all the OOPLs. (If those developers had
used Smalltalk, I'll bet their defect rates would have been
substantially lower even if they weren't very good OO developers.)

His comments under "invited feedback" are amusing and confirm my
impression that OOP is partly (but perhaps not entirely) hype:

"I should note that this paper, because it criticised OO, had an unusually
turbulent review period. 2 reviewers said they would cut their throats
if it was published and 3 said the opposite. The paper was only
published if the OO community could publish a rebuttal. I found this
very amusing as my paper contains significant data. The rebuttal had
none. This sort of thing is normal in software engineering which mostly
operates in a measurement-free zone."
Part of that criticism was that his experiments were uncontrolled.
There was no attempt made to ensure that the programs under either
paradigm were of high quality for the paradigm. There were other
experimental issues, such as the scale of the programs, that make the
data anecdotal at best and a stacked deck at worst.

What papers have scientific evidence for OOP?
I don't know of any large, controlled studies but there must be some
buried in PhD theses somewhere. There is substantial anecdotal evidence
to the contrary, though. For example, where I worked before retiring we
ran a number of experiments to determine whether we should adopt OO
development. One experiment was for exactly the same MLOC application
(originally written in BLISS) that was rewritten in C and then in C++
using good OOA/D. The same developers, who were domain experts, were
used. Both C and C++ were new languages for most of them. [While they
were proficient at procedural development, OO was OJT. However,
extensive OOA/D training was provided.]

The initial development times were about the same, probably due to the
OO learning curve. The released defect rates for the OO version were
about 1/2 those of the C version. The big difference, though, was in
maintenance time, which was nearly an order of magnitude less for the OO
version. One of the first major rounds we estimated to take 6
engineering months using the established techniques we used for
procedural development (which were accurate to -5/+15%). Three people
turned the changes on the OO version in a week -- to the amazement of
everyone, including ourselves. Besides hard comparative data on time
spent, the permanent maintenance staff for the application dropped from
8 full-timers for the C version to one guy half-time for the OO version.

While this is anecdotal (among other things, it is dependent on the
OOA/D methodology employed), it was done with a whole lot more control
than Hatton's experiments. [We were a very process-oriented shop that
insisted on hard experimental data before instituting any process
change. We also collected data religiously. Not a lot of shops can
tell you immediately how much time the average developer spends in
meetings or the average time it takes to diagnose a memory over-write
problem. B-)]

Paul Graham's skeptical comments on OOP are at
http://www.paulgraham.com/noop.html .

If OOP is so beneficial for large projects, why are the Linux kernel,
the interpreters for Perl and Python, and most compilers I know written
in C rather than C++?


The main reason is performance. All those examples are very performance
sensitive. C will be ~twice as fast as C++, even though C++'s design
compromises were made specifically to enhance its performance! In addition,
physical coupling is a much bigger problem in the OOPLs than in
procedural languages, so build time can become an issue for larger
applications.

[The OO translation-based approaches that do full code generation from
OOA models usually target straight C as the implementation language for
performance sensitive situations in R-T/E. (Also, translation code
generators can usually generate OOPL source code faster than it can be
compiled, but that is usually not true for C.)]
*************
There is nothing wrong with me that could
not be cured by a capful of Drano.

H. S. Lahman
hs*@pathfindermda.com
Pathfinder Solutions -- Put MDA to Work
http://www.pathfindermda.com
blog (under constr): http://pathfinderpeople.blogs.com/hslahman
(888)-OOA-PATH

Jul 18 '05 #44

P: n/a
Isn't there a comp.lang.flame or something?
Jul 18 '05 #45

P: n/a
Peter Hansen wrote:
Martijn Faassen wrote:
Paul McGuire wrote:
"Martijn Faassen" <fa*****@infrae.com> wrote in message
<snip - it's "Dijkstra" not "Djikstra", you dolt! :)>

Yikes! (or better, "Jikes!" or even "Yijkes!"?) - my bad.
And he was on faculty at UT right here in Austin, too.


It's a very common mistake I've seen so often that for a while I
wondered whether his name really *was* Djikstra, but I knew he was
Dutch and that it couldn't be so. That the PSU picked you for its
disclosure is just a random coincidence, I'm sure.. :)


Well, in any case, thanks for setting the record straight, Martjin.


That of course also happens to me once in a while. I can take care of
myself though -- Dijkstra however needs an advocate for the correct
spelling of his name in this earthly realm.

Imagine, for instance, what if he wants to egosurf, google for his own
name and finds nothing because everybody was saying Djikstra all the
time? That'd be terrible! What, they don't have google in the eternal
realm? How can it be valhalla without google? Impossible.

Regards,

Martijn
Jul 18 '05 #46

P: n/a
Jive wrote:
Isn't there a comp.lang.flame or something?


I've double-checked, but I didn't see any significant flaming in this
article (and I'm generally not very tolerant of it). My PSU posting was
certainly not intended as a flame, in case that was misinterpreted.

What'd I miss?

Regards,

Martijn

Jul 18 '05 #47

P: n/a
"Jive" <so*****@microsoft.com> wrote in message
news:lr**********************@news.easynews.com...
Everyone keep moving. There is nothing to see here. Move along.


You wish! If only ending a thread were that easy - we wouldn't be hearing
about "Why is Python slower than Java?" anymore!

But I agree with Martijn (note spelling) - I didn't notice any flames in the
thread, it seemed civil enough to me.

-- Paul
Jul 18 '05 #48

P: n/a
Martijn Faassen wrote:
Peter Hansen wrote:
Well, in any case, thanks for setting the record straight, Martjin.


That of course also happens to me once in a while. I can take care of
myself though -- Dijkstra however needs an advocate for the correct
spelling of his name in this earthly realm.


Then there's us Danes, with "sen" instead of "son" (as many people
think it ought to be). And I can't even claim the wrong form
sounds noticeably different, making any defense seem petty.

(Darn those Norwegians, influencing people's ideas of how a
name like Hansen ought to be spelled, grumble, grumble.
If they'd just invent a cell phone that used Python, as the
Swedish have, they might deserve all that extra attention.)

;-)

-Peter
Jul 18 '05 #49

P: n/a
Martijn Faassen wrote:
Jive wrote:
Isn't there a comp.lang.flame or something?


I've double-checked, but I didn't see any significant flaming in this
article (and I'm generally not very tolerant of it). My PSU posting was
certainly not intended as a flame, in case that was misinterpreted.

What'd I miss?


Has the PSU checked on the whereabouts of the time machine lately?

Maybe Jive took it. I did hear it had gone missing.

I could easily see this thread descending into a flame war in,
oh, about another ten posts. That would be so freaky...

-Peter
Jul 18 '05 #50
