Bytes IT Community

Can a low-level programmer learn OOP?

Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

However, those articles were no more objective than the descriptions of
OOP I've read in making their case. I.e., what objective
data/studies/research indicates that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage when implemented via OOP vs. procedural
programming?

The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988. Then didn't program for nearly 10 years during
which time OOP was popularized. Starting in 1999 I got back into
programming, but the high-level-ness of PC programming and the
completely foreign language of OOP repelled me. My work was in analog
and digital electronics hardware design, so naturally I started working
with microcontrollers in .asm and C. Most of my work involves low-level
signal conditioning and real-time control algorithms, so C is about as
high-level as one can go without seriously losing efficiency. The
close-to-the-machine-ness of C is ideal here. This is a realm that I
truly enjoy and am comfortable with.

Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.

Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation. If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.

But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?

Ultimately I don't care what the *name* is for how I program. I just
need to produce results. So that leads back to objectivity. I have a
problem to solve, and I want to find a solution that is as quick as
possible to learn and implement.

Problem:

1. How to most easily learn to write simple PC GUI programs that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.

2. Must be cross-platform: Linux + Windows. This factor can have a big
impact on whether it is necessary to learn a new language, or stick with
C. If my platform was only Linux I could just learn GTK and be done
with it. I wouldn't be here in that case.

Possible solutions:

Form 1: Use C and choose a library that will enable cross-platform GUI
development.

Pro: Don't have to learn new language.
Con: Probably will have difficulty with cross-platform implementation
of serial comms. This will probably need to be done twice. This will
waste time.

Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.

Form 3: Use LabVIEW

Pro: I think that the cross-platform goal can be met.
Con: Expensive. I would prefer to use an Open Source solution. But
that isn't as important as the $$$. I have also generally found the 2D
diagrammatical programming language of "G" as repelling as OOP. I
suspect that it may take as much time to learn LabVIEW as Python. In
that case the time spent on Python might be better spent since I would
be learning something foundational as opposed to basically just learning
how to negotiate someone's proprietary environment and drivers.
Comments appreciated.
--
Good day!

________________________________________
Christopher R. Carlen
Principal Laser&Electronics Technologist
Sandia National Laboratories CA USA
cr***************@BOGUSsandia.gov
NOTE, delete texts: "RemoveThis" and
"BOGUS" from email address to reply.
Jul 13 '07 #1
65 Replies

On Fri, 13 Jul 2007 09:06:44 -0700, Chris Carlen wrote:
Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation. If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.
That's not something tied to OOP. Automatic memory management is also
possible with procedural languages.
But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?
Creating classes and organizing the program in an OOP language isn't
different from creating structs and organizing the program in C.

On the one hand, Python is a very OO language, as everything is an object. On
the other hand, it is possible to write parts of the program in a procedural
or even functional style. Python is not Java; you don't have to force
everything into classes.

From my experience Python makes it easy to "just write the code". Easier
than C because I don't have to deal with so much machine details, don't
have to manage memory, don't need extra indexes for looping over lists and
so on. And the "crashes" are much gentler, telling me what the error is
and where instead of a simple "segfault" or totally messed up results.
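As a minimal sketch of what "just write the code" looks like (hypothetical names, not from this thread) - no loop indices, no manual memory management:

```python
# Hypothetical example: scale a list of sensor readings.
# No indices, no malloc/free -- the list is grown and freed automatically.
readings = [1.25, 3.5, 2.75, 5.0]

def scaled(values, gain):
    """Return a new list with each reading multiplied by gain."""
    return [gain * v for v in values]

print(scaled(readings, 2.0))   # [2.5, 7.0, 5.5, 10.0]
```

And a mistyped name in such code raises a readable NameError traceback pointing at the offending line, rather than a segfault.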

Ciao,
Marc 'BlackJack' Rintsch
Jul 13 '07 #2

Chris Carlen wrote:
[snip]

Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
Why?

I've written extensively in C++, including hard real-time programming
in C++ under QNX for a DARPA Grand Challenge vehicle. I have an Atmel
AVR with a cable plugged into the JTAG port sitting on my desk right now.
Even that little thing can be programmed in C++.

You can sometimes get better performance in C++ than in C, because C++
has "inline". Inline expansion happens before optimization, so you
can have abstractions that cost nothing.

If it has state and functions, it probably should be an object.
The instances of the object can be static in C++; dynamic memory
allocation isn't required in C++, as it is in Python.

Python is a relatively easy language, easier than C++, Java,
or even Perl. It's quite forgiving. The main implementation,
CPython, is about 60x slower than C, though, so if you're trying
to implement, say, a rapidly changing digital oscilloscope display,
the result may be sluggish.

John Nagle
Jul 13 '07 #3

Chris,

I can fully relate to your post. I trained as a programmer in the 80s
when OOP was an academic novelty, and didn't learn OOP until around
2002. However, now I find myself naturally thinking in OOP terms,
although I'm by no means an expert - I'm a sysadmin who writes the
occasional utility. I found learning OOP with Python very easy because
it has such a stripped-down and convenient syntax.

The advantages of OOP aren't in performance or memory, they're in the
fact that OOP simplifies the ways in which we can think about and
solve a problem. OOP packages up the functionality of a program into
logical units (objects) which can be written, debugged and maintained
independently of the rest of the program, almost as if they were
completely separate programs of their own, with their own data and
'user interface' in the form of callable functions (actually methods).
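As a toy illustration (all names hypothetical), such a unit in Python bundles its data with the methods that operate on it, and can be exercised in isolation:

```python
class Counter:
    """A tiny self-contained unit: its own data plus its callable interface."""

    def __init__(self):
        self.count = 0              # data owned by this object

    def increment(self):            # part of the object's "user interface"
        self.count += 1

    def value(self):
        return self.count

# The object can be tested independently of any larger program.
c = Counter()
c.increment()
c.increment()
print(c.value())   # prints 2
```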

Here's a really excellent tutorial on Python that's fun to follow.
Downloading and installing Python, and following this tutorial, will
probably take about as long as it took to write your post in the first
place. At the end of it you'll have a good idea how OOP works, and how
Python works. Learning OOP this way is easy and painless, and what you
learn about the theory and principles of OOP in Python will be
transferable to C++ if you end up going in that direction.

I hope this was helpful.

Simon Hibbs
Jul 13 '07 #4


Sorry, here's the tutorial link:

http://hetland.org/writing/instant-python.html
Simon Hibbs

Jul 13 '07 #5

John Nagle wrote:
Chris Carlen wrote:[edit]
>Hence, being a hardware designer rather than a computer scientist, I
am conditioned to think like a machine. I think this is the main
reason why OOP has always repelled me.

Why?
When pointers were first explained to me, I went "Ok." And rather
quickly ideas lit up in my head about what I could do with them.

When I read what OOP is, that doesn't happen. All I think is "what's
the point of this?" "What can this do for me that I can do already with
the procedural way of thinking?" And if it can't do anything new, then
why rearrange my thinking to a new terminology? It's results that
matter, not the paradigm.
I've written extensively in C++, including hard real-time programming
in C++ under QNX for a DARPA Grand Challenge vehicle.
Did the vehicle win?
I have an Atmel
AVR with a cable plugged into the JTAG port sitting on my desk right now.
Even that little thing can be programmed in C++.
Yes.
You can sometimes get better performance in C++ than in C, because C++
has "inline". Inline expansion happens before optimization, so you
can have abstractions that cost nothing.
That's interesting. But why is this any different than using
preprocessor macros in C?
>
If it has state and functions, it probably should be an object.
The instances of the object can be static in C++; dynamic memory
allocation isn't required in C++, as it is in Python.
Why? Why is OOP any better at explaining a state machine to a computer?
I can write state machines all over the place in C, which tend to be
the core of most of my embedded programs. I can write them with
hardcoded logic if that seems like the easy thing to do and the
probability of extensive changes is extremely low. They are extremely
easy to read and to code. I have written a table-driven state machine
with arbitrary-length input condition lists. The work was all in
designing the data structures. The code to update the state machine was
about 4 lines.

Why would OOP be better? Different is not better. Popular is not
better. What the academics say is not better. Fewer lines of code might
be better, if the priority is ease of programming. Or, less machine
execution time or memory usage might be better, if that is the priority.

Until I can clearly understand why one or the other of those goals might
better be realized for a given problem with OOP vs. procedures, I just
don't get it.

I will keep an open mind however, that until I work with it for some
time there is still the possibility that I will have some light go on
about OOP. So don't worry, I'm not rejecting your input.
Python is a relatively easy language, easier than C++, Java,
or even Perl. It's quite forgiving. The main implementation,
CPython, is about 60x slower than C, though, so if you're trying
to implement, say, a rapidly changing digital oscilloscope display,
the result may be sluggish.
Yes, I certainly wouldn't consider Python for that.

Thanks for your comments.
Jul 13 '07 #6

On 7/13/07, John Nagle <na***@animats.com> wrote:
You can sometimes get better performance in C++ than in C, because C++
has "inline". Inline expansion happens before optimization, so you
can have abstractions that cost nothing.
This is a bit off topic, but inline has been a keyword in C since C99.

--
Evan Klitzke <ev**@yelp.com>
Jul 13 '07 #7

On 2007-07-13, Chris Carlen <cr***************@BOGUSsandia.gov> wrote:
John Nagle wrote:
>You can sometimes get better performance in C++ than in C,
because C++ has "inline". Inline expansion happens before
optimization, so you can have abstractions that cost nothing.

That's interesting. But why is this any different than using
preprocessor macros in C?
This is OT, however: inline functions have a few benefits over
preprocessor macros.

1. They are type-safe.
2. They never evaluate their arguments more than once.
3. They don't require protective parentheses to avoid precedence errors.
4. In C++, they have the additional benefit of being defined in a
namespace, rather than applying globally to a file.

As an experienced C programmer you're probably used to coping
with the problems of preprocessor macros, and may even take
advantage of their untyped nature occasionally. Even C++
programmers still use them, advisedly.
I will keep an open mind however, that until I work with it for
some time there is still the possibility that I will have some
light go on about OOP. So don't worry, I'm not rejecting your
input.
In my opinion OOP is usefully thought of as a type of design
rather than a means of implementation. You can implement an OO
design in a procedural language just fine, but presumably an OO
programming language facilitates the implementation of an OO
design better than a procedural language does.

Going back to the state machine question, and using it as an
example: Assume you design your program as a state machine.
Wouldn't it be easier to implement in a (hypothetical)
state-machine-based programming language than in a procedural
one? I think John was insinuating that a state machine is more
like an object than it is like a procedure.

--
Neil Cerutti
Jul 13 '07 #8

Chris Carlen wrote:
Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

However, those articles were no more objective than the descriptions of
OOP I've read in making their case. I.e., what objective
data/studies/research indicates that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage when implemented via OOP vs. procedural
programming?
None, definitely. W.r.t. developer time and memory, it's mostly a
matter of what fits your brain. If it does, you'll find it easier; else
choose another programming style. W.r.t. CPU time and memory, and using
'low-level' languages (C/C++/Pascal etc.), OO is usually worse than
procedural for simple programs. For more complex ones, I'd say they tend
to converge, since such programs, when written procedurally, usually
rely on many abstraction/indirection layers.
The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988. Then didn't program for nearly 10 years during
which time OOP was popularized. Starting in 1999 I got back into
programming, but the high-level-ness of PC programming and the
completely foreign language of OOP repelled me. My work was in analog
and digital electronics hardware design, so naturally I started working
with microcontrollers in .asm and C. Most of my work involves low-level
signal conditioning and real-time control algorithms, so C is about as
high-level as one can go without seriously losing efficiency.
You may still want to have a look at some more functional languages like
Haskell, OCaml or Erlang. But if you find OO alien, I doubt you'll have
a strong feeling for functional programming.
The
close-to-the-machine-ness of C is ideal here. This is a realm that I
truly enjoy and am comfortable with.

Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concepts. According to him, each object
is a (simulation of a) small machine.
Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation.
While OO without automatic memory management can quickly become a major
PITA, OO and GC are two orthogonal concepts - some languages have
builtin support for OO but nothing specific for memory management
(ObjectPascal, C++, Objective-C), and some non-OO languages do have
builtin memory management (mostly but not only in the functional camp).
If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.
It's not a feature of OO per se. But it's clear that not having to
worry (too much) about memory management greatly enhances productivity.
But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?
Don't you design your programs? AFAICT, correct design is not easier
with procedural programming.

Now to answer your question, I'd say it depends on your experience of
OO, and of course of the kind of OO language you're using. With
declaratively statically typed languages - like C++, Java etc - you are
forced into a lot of upfront design (way too much IMHO). Dynamic
languages like Smalltalk, Python or Ruby are much more lightweight in
this area, and tend to favor a much more exploratory style - sketch a
quick draft on a napkin, start coding, and evolve the design while
you're coding.

And FWIW, Python doesn't *force* you into OO - while you'll be *using*
objects, you can write most of your code in a procedural way, and only
"fall down" into OO for some very advanced stuff.
Ultimately I don't care what the *name* is for how I program. I just
need to produce results.
Indeed !-)
So that leads back to objectivity. I have a
problem to solve, and I want to find a solution that is as quick as
possible to learn and implement.

Problem:

1. How to most easily learn to write simple PC GUI programs
GUIs are one of the best (and most successful) applications of OO - and
as a matter of fact, even GUI toolkits implemented in plain C tend to
take an OO approach (GTK+ being a clear example, but even the old
Pascal/C Mac GUI API has a somewhat "object based" feel).
that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.
So what you want is a high-level, easy-to-learn language with a rich
collection of libraries. The GoodNews(tm) is that Python is one of the
possible answers.
2. Must be cross-platform: Linux + Windows.
Ditto. You can even add most Unices and Mac OS X to the list.
This factor can have a big
impact on whether it is necessary to learn a new language, or stick with
C. If my platform was only Linux I could just learn GTK and be done
with it. I wouldn't be here in that case.

Possible solutions:

Form 1: Use C and choose a library that will enable cross-platform GUI
development.

Pro: Don't have to learn new language.
Con: Probably will have difficulty with cross-platform implementation
of serial comms. This will probably need to be done twice. This will
waste time.
Con: C is a low-level language (not a criticism - it was designed that
way), which greatly impacts productivity.
Con: the only serious C (not C++) cross-platform GUI toolkit I know of
is GTK+, which is less cross-platform than wxWidgets, and *is* OO.
Form 2: Use Python and PySerial and TkInter or wxWidgets.
I'd probably go for wxWidgets.
Pro: Cross-platform goal will likely be achieved fully.
Very likely. There are a couple of things to take care of, but nothing
close to what you'd have to do in C.
Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library.
Yes, obviously. The (other) GoodNews(tm) is that, by most estimates, an
experienced programmer can become productive in Python in a matter of
weeks at worst (some manage it in a few days). That won't mean you'll
have mastered the language or be using it at its best, but don't worry,
you'll get things done, and perhaps in less time than with C.
Must possibly learn a
completely new way of thinking (OOP)
Not necessarily. While Python is OO all the way down - meaning that
everything you'll work with will be an object (functions included) - it
doesn't *force* you into OO (IOW: you don't have to define classes to
write a Python program). You can just as well use a procedural - or even
somewhat functional - approach, and most Python programs I've seen so
far are a mix of the three.
not just a new language syntax.
You forgot one of the most important parts of a language: idioms. And
it's definitely *not* idiomatic in Python to use classes when a
simpler solution (using plain functions and modules) is enough.
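For instance, a small config-parsing helper (a hypothetical example, not from this thread) is idiomatically written as plain module-level functions - no class needed:

```python
def parse_line(line):
    """Split one 'key = value' line into a (key, value) pair."""
    key, _, value = line.partition("=")
    return key.strip(), value.strip()

def parse_config(text):
    """Parse simple 'key = value' lines into a dict, skipping the rest."""
    return dict(parse_line(line) for line in text.splitlines() if "=" in line)

print(parse_config("baud = 9600\nport = /dev/ttyS0"))
# {'baud': '9600', 'port': '/dev/ttyS0'}
```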
This might be difficult.
Not necessarily that much.
Form 3: Use LabVIEW

Pro: I think that the cross-platform goal can be met.
Con: Expensive. I would prefer to use an Open Source solution. But
that isn't as important as the $$$. I have also generally found the 2D
diagrammatical programming language of "G" as repelling as OOP. I
suspect that it may take as much time to learn LabVIEW as Python.
I don't have much knowledge of LabVIEW so I can't comment on this. But I
remember a thread here about G, and I guess you'll find Python much more
familiar - even if you'll need some 'thinking adjustment' to grok it.
In
that case the time spent on Python might be better spent since I would
be learning something foundational as opposed to basically just learning
how to negotiate someone's proprietary environment and drivers.
IMHO, the biggest gain (in learning Python vs LabVIEW) is that you'll
add a very valuable tool to your toolbox - the missing link between C
and shell scripts.
>
Comments appreciated.
HTH
Jul 13 '07 #9

Chris Carlen wrote:
(snip)
>
Why? Why is OOP any better at explaining a state machine to a computer?
I don't know if it's "better", but state machines are the historical
starting point of OO with the Simula language.
I can write state machines all over the place in C,
And even in assembler - so why use C ?-)
which tend to be
the core of most of my embedded programs. I can write them with
hardcoded logic if that seems like the easy thing to do any the
probability of extensive changes is extremely low. They are extremely
easy to read and to code. I have written a table-driven state machine
with arbitrary-length input condition lists. The work was all in
designing the data structures.
Which is another approach to OO. When programming in C, you do use
structs, don't you? And you do write functions operating on instances
of these structs? And possibly turn these structs into ADTs? Well, one
possible definition of "objects" is "ADT + polymorphism".
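That progression can be sketched in a few lines of Python (a hypothetical example): the struct-plus-functions style and the class style are the same idea, regrouped:

```python
# C style: a "struct" (here a dict) plus free functions that take it
# as their first argument.
def make_point(x, y):
    return {"x": x, "y": y}

def point_move(p, dx, dy):
    p["x"] += dx
    p["y"] += dy

# The same ADT as a class: fields become attributes, the free functions
# become methods, and the explicit first argument becomes 'self'.
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def move(self, dx, dy):
        self.x += dx
        self.y += dy

p = make_point(1, 2)
point_move(p, 3, 4)
q = Point(1, 2)
q.move(3, 4)
print(p["x"], p["y"], q.x, q.y)   # 4 6 4 6
```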
Why would OOP be better?
Whoever pretends it's absolutely "better" should be shot down. I do find
OO *easier* than pure procedural programming, but I started programming
with mostly OO (or at least object-based) languages, and only later
learned pure procedural languages (and then bits of functional
programming). It's not a matter of being "better"; it's a matter of what
style fits your brain. If OO doesn't fit your brain, then it certainly
won't be "better" *for you*.
Different is not better. Popular is not
better. What the academics say is not better. Less lines of code might
be better, if the priority is ease of programming.
and maintenance, and robustness (AFAICT, the defect/LOC ratio is
somewhat constant whatever the language, so less code means fewer bugs).
Or, less machine
execution time or memory usage might be better, if that is the priority.
Indeed.
Until I can clearly understand why one or the other of those goals might
better be realized for a given problem with OOP vs. procedures, I just
don't get it.
Seems quite sane.
I will keep an open mind however, that until I work with it for some
time there is still the possibility that I will have some light go on
about OOP. So don't worry, I'm not rejecting your input.
> Python is a relatively easy language, easier than C++, Java,
or even Perl. It's quite forgiving. The main implementation,
CPython, is about 60x slower than C, though,
This is a very simplistic - and as such, debatable - assertion IMHO. On
my Linux box, a cat-like program is hardly faster in C than in Python
(obviously, since such a program is IO-bound, and both implementations
will use the native IO libs), and for quite a few computation-heavy
tasks there are Python bindings to highly optimised C (or C++) libs. So
while it's clear that Python is not about raw execution speed, it's
usually quite adequate for most application tasks. And when it isn't,
well, it's always possible to recode the critical parts in Pyrex or C.
Jul 13 '07 #10

On Jul 13, 1:05 pm, Chris Carlen <crcarleRemoveT...@BOGUSsandia.gov>
wrote:
John Nagle wrote:
Chris Carlen wrote:[edit]
Hence, being a hardware designer rather than a computer scientist, I
am conditioned to think like a machine. I think this is the main
reason why OOP has always repelled me.
Why?

When pointers were first explained to me, I went "Ok." And rather
quickly ideas lit up in my head about what I could do with them.

When I read what OOP is, that doesn't happen. All I think is "what's
the point of this?" "What can this do for me that I can do already with
the procedural way of thinking?" And if it can't do anything new, then
why rearrange my thinking to a new terminology? It's results that
matter, not the paradigm.
What can this do for me that I can do already with the procedural way
of thinking? Absolutely nothing; it's all Turing machines under the
hood.

Why rearrange my thinking to a new terminology? Because new
terminologies matter a lot. There's nothing that you can do with
pointers that can't be done with arrays; I know because I wrote a lot
of FORTRAN 77 code back in the day, and without pointers I had to
write my own memory allocation routines that worked off of a really
big array.

Likewise, there's nothing that you can do in C that can't be done with
C++ (especially since C++ was originally a preprocessor for C);
however C++ will keep track of a lot of low-level detail for you so
you don't have to think about it. Let's say that you have an embedded
single-board computer with a serial and a parallel port. You probably
have two different routines that you use to talk to them, and you have
to always keep track which you are using at any given time.

It's a lot easier to have a single CommPort virtual class that you use
in all of your code, and then have two subclasses, one for serial
ports and one for parallel. You'll be especially happy for this when
someone decides that, as well as logging trace information to a
printer, it would be nice to also log it to a technician's handheld
diagnostic device.
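A minimal Python sketch of that CommPort idea (all names hypothetical; real code would wrap an actual driver rather than return strings):

```python
class CommPort:
    """Abstract interface: callers only ever see this."""
    def write(self, data):
        raise NotImplementedError

class SerialPort(CommPort):
    def write(self, data):
        return "serial: " + data      # stand-in for a real serial driver

class ParallelPort(CommPort):
    def write(self, data):
        return "parallel: " + data    # stand-in for a real parallel driver

def log_trace(port, message):
    """Logging code works unchanged with any CommPort subclass."""
    return port.write(message)

print(log_trace(SerialPort(), "boot ok"))    # serial: boot ok
print(log_trace(ParallelPort(), "boot ok"))  # parallel: boot ok
```

Adding the handheld device later means writing one more subclass; log_trace and everything built on it stay untouched.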

Jul 13 '07 #11

Neil Cerutti wrote:
Going back to the state machine question, and using it as an
example: Assume you design your program as a state machine.
Wouldn't it be easier to implement in a (hypothetical)
state-machine-based programming language than in a procedural
one? I think John was insinuating that a state-machine is more
like an object than it is like a procedure.
I think at this point, I should stop questioning and just learn for a while.

But regarding state machines, I had probably written a few in C the past
before really understanding that it was a state machine. Much later I
grasped state machines from digital logic. Then it became much clearer
how to use them as a tool and to code them intentionally.

Once I have written a state table, I can implement using flip-flops and
gates or in C as either a state variable and a switch statement or
something table driven. The switch code can be written as fast as I can
read through the state table. That's the easiest implementation, but
the least easy to change later unless it's fairly small.

I will be eager to see how to do this in Python.
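As a preview (a hypothetical example, not from the thread): in Python the state table can be a dictionary, and the whole update step - the equivalent of the switch statement - is one line:

```python
# State table for a toy controller: (state, event) -> next state.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "tick"): "running",
    ("running", "stop"): "idle",
}

def step(state, event):
    """Table-driven update; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["start", "tick", "stop"]:
    state = step(state, event)
print(state)   # idle
```

As with the table-driven C version, the work is in designing the table; changing the machine later means editing data, not code.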

I have found the comments in response to my doubts about OOP very
encouraging. I will do some learning, and come back when I have more
Python specific problems...

Thanks for the input!
Jul 13 '07 #12

Simon Hibbs wrote:
Sorry, here's the tutorial link:

http://hetland.org/writing/instant-python.html
Simon Hibbs

Thanks Simon. Actually, that's the tutorial that I've started with.

Your comments are encouraging. I'll keep learning.
Jul 13 '07 #13

Bruno Desthuilliers wrote:
Chris Carlen wrote:
[edit]
> Must possibly learn a completely new way of thinking (OOP)

Not necessarily. While Python is OO all the way down - meaning that
everything you'll work with will be an object (functions included) - it
doesn't *force* you into OO (IOW: you don't have to define classes to
write a Python program). You can just as well use a procedural - or even
somewhat functional - approach, and most Python programs I've seen so
far are a mix of the three.
>not just a new language syntax.

You forgot one of the most important parts of a language: idioms. And
it's definitely *not* idiomatic in Python to use classes when a
simpler solution (using plain functions and modules) is enough.
I see. That's very promising. I guess some articles I read painted a
picture of religiosity among OOP programmers. But that is not the
impression I am getting at all on the street.
IMHO, the biggest gain (in learning Python vs LabVIEW) is that you'll
add a very valuable tool to your toolbox - the missing link between C
and shell scripts.

Thanks for the comments!

Jul 13 '07 #14

Chris Carlen wrote:
Bruno Desthuilliers wrote:
>Chris Carlen wrote:
>[edit]
>> Must possibly learn a completely new way of thinking (OOP)


Not necessarily. While Python is OO all the way down - meaning that
everything you'll work with will be an object (functions included) -,
it doesn't *force* you into OO (IOW : you don't have to define classes
to write a Python program). You can just as well use a procedural - or
even somewhat functional - approach, and most Python programs I've seen
so far are usually a mix of the three.
>>not just a new language syntax.


You forgot one of the most important parts of a language : idioms. And
it's definitely *not* idiomatic in Python to use classes when a
simpler solution (using plain functions and modules) is enough.


I see. That's very promising. I guess some articles I read painted a
picture of religiosity among OOP programmers.
That's alas a common disease - I'd say the best way to be definitely
disgusted with OO is to read comp.lang.object :(
But that is not the
impression I am getting at all on the street.
Heck. As you said, the important thing is to get things done. And I
guess that's why we all (here) love Python. Last time I had to work on a
Pascal program (actually Delphi's ObjectPascal, but the whole thing was
almost caricaturally procedural), I found myself having to write tens of
lines of code for things that would have been no-brainer one-liners in
Python, and define new types (records - Pascal's structs) where Python's
builtin dict type would have done the trick. It's not a matter of
procedural vs OO vs functional, it's a matter of using the appropriate
tool for the job.
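For instance, here is a small sketch of that mix - procedural,
somewhat functional, and OO in one short script (all names are invented
purely for illustration):

```python
# Procedural: a plain function, no class needed.
def celsius_to_f(c):
    return c * 9.0 / 5.0 + 32.0

# Somewhat functional: transform a list without mutating anything.
readings = [20.0, 21.5, 23.0]
converted = [celsius_to_f(c) for c in readings]

# OO, only where bundled state actually helps: a running average.
class Averager:
    def __init__(self):
        self.total = 0.0
        self.count = 0

    def add(self, value):
        self.total += value
        self.count += 1

    def mean(self):
        return self.total / self.count

avg = Averager()
for f in converted:
    avg.add(f)
```

The class earns its keep only because it carries state between calls;
the rest stays plain.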
Jul 13 '07 #15

On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers
<bd*****************@free.quelquepart.fr> wrote:
>Chris Carlen wrote:
>Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.
=== 8< ===
>>
Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.

OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each object
is a (simulation of) a small machine.
Oh you young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
adherents everywhere!

As a few more enlightened have noted in more than one thread here, the
Mother of All OOP was Simula (then known as SIMULA 67). All Alan Kay
did was define "OOPL", but then didn't notice (apparently--though this
may have been a "convenient oversight") that Simula satisfied all the
criteria so was actually the first OOPL--and at least 10 years earlier
than Smalltalk!

So Kay actually invented NONE of the concepts that make a PL an OOPL.
He only stated the concepts concisely and named the result OOP, and
invented yet another implementation of the concepts-- based on a
LISP-like functional syntax instead of an Algol-60 procedural syntax,
and using message-passing for communication amongst objects (and
assumed a GUI-based IDE) (and introduced some new terminology,
especially use of the term "method" to distinguish class and instance
procedures and functions, which Simula hadn't done).

As Randy Gest notes on http://www.smalltalk.org/alankay.html, "The
major ideas in Smalltalk are generally credited to Alan Kay with many
roots in Simula, LISP and SketchPad." Too many seem to assume that
some of these other "features" of Smalltalk are part of the definition
of an OOP, and so are misled into believing the claim that it was the
first OOPL. Or they claim that certain deficiencies in Simula's object
model--as compared to Smalltalk's--somehow disqualifies it as a "true
OOPL", even though it satisfies all the criteria as stated by Kay in
his definition. See http://en.wikipedia.org/wiki/Simula and related
pages, and "The History of Programming Languages I (HOPL I)", for
more details.

Under a claim of Academic Impunity (or was that "Immunity"), here's
another historical tid-bit. In a previous employment we once had a
faculty applicant from CalTech who knew we were using Simula as our
introductory and core language in our CS program, so he visited Xerox
PARC before coming for his interview. His estimate of Alan Kay and
Smalltalk at that time (early 80s) was that "They wanted to implement
Simula but didn't understand it--so they invented Smalltalk and now
don't understand _it_!"

wwwayne

=== 8< ===
Jul 13 '07 #16

In article <f7********@news4.newsguy.com>,
Chris Carlen <cr***************@BOGUSsandia.gov> wrote:
>
From what I've read of OOP, I don't get it.
For that matter, even using OOP a bit with C++ and Perl, I didn't get it
until I learned Python.
>The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988.
Newbie. ;-)

(I started with BASIC in 1976.)
>Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.
My experience is that learning GUI programming is difficult. Moreover,
GUI programming in C involves a lot of boilerplate that can be automated
more easily with Python. So I think this will be a better solution.

Note very very carefully that Python does not require an OOP style of
programming, but it will almost certainly be the case that you just
naturally start using OOP techniques as you learn Python.
--
Aahz (aa**@pythoncraft.com) <* http://www.pythoncraft.com/

I support the RKAB
Jul 13 '07 #17

Chris Carlen wrote:
John Nagle wrote:
>Chris Carlen wrote:[edit]
>>Hence, being a hardware designer rather than a computer scientist, I
am conditioned to think like a machine. I think this is the main
reason why OOP has always repelled me.

Why?

When pointers were first explained to me, I went "Ok." And rather
quickly ideas lit up in my head about what I could do with them.

When I read what OOP is, that doesn't happen. All I think is "what's
the point of this?" "What can this do for me that I can do already with
the procedural way of thinking?" And if it can't do anything new, then
why rearrange my thinking to a new terminology? It's results that
matter, not the paradigm.
I have been programming since 1978. I started off with BASIC, learned
Assembly and Pascal, and much later eventually moved on to Javascript,
Perl, and PHP. All of my work was done procedurally.

Recently, I have been working on a very large project involving a lot of
OO-Javascript. For what we are doing on the project, OO makes sense. I
really didn't get OOP until working on this project - probably because I
never did anything that really needed it.

I have found myself leaning more toward the OO paradigm since doing
this, after 25+ years of procedural programming, and now I find myself
doing more work with OO concepts, and getting things done even faster,
and with less work, than I used to.

But I still have a problem with STRICT OOP - which is why I like Python.
Use OO where it's useful, use procedural when that works best.

I suspect that the reason it isn't clicking for you is twofold: 1) You
don't do anything currently that has an obvious need for OOP, and 2) You
haven't done anything with OOP.

A couple ideas:

1) Maybe you can try building a relatively trivial program that would
more naturally use an OO methodology - perhaps a simple videogame like
Pac-man? The 'monsters' would be objects, with properties such as color,
X-position, Y-position, etc. - make yourself work in OO terms

2) This may seem silly, but download & play with "Scratch"
(http://scratch.mit.edu) - it's basically an introduction to programming
for kids, but it's completely OO, and super easy to use. It might be
useful to help you to see the 'grand view' better.

3) Give in to the dark side :)

Good luck - after so much time invested in one way of thinking, it's not
easy to change.
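For idea (1), the monster objects might start out as small as this (the
class and attribute names are just an illustration, not from any real
game):

```python
class Monster:
    """One ghost: state (color, position) and behavior bundled together."""
    def __init__(self, color, x=0, y=0):
        self.color = color
        self.x = x
        self.y = y

    def move(self, dx, dy):
        self.x += dx
        self.y += dy

# Three instances: independent state, identical behavior.
monsters = [Monster("red"), Monster("pink"), Monster("cyan")]
monsters[0].move(2, 3)   # only the red monster moves
```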
Jul 14 '07 #18

Aahz wrote:
In article <f7********@news4.newsguy.com>,
Chris Carlen <cr***************@BOGUSsandia.gov> wrote:
>>From what I've read of OOP, I don't get it.

For that matter, even using OOP a bit with C++ and Perl, I didn't get it
until I learned Python.
>>The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988.

Newbie. ;-)

(I started with BASIC in 1976.)
Heh heh, I actually first programmed when the RadioShack TRS-80 came
out. I think I saw it first in 1978 when I was 11. I would hang out in
the store for hours writing crude video games.
My experience is that learning GUI programming is difficult. Moreover,
GUI programming in C involves a lot of boilerplate that can be automated
more easily with Python. So I think this will be a better solution.

Note very very carefully that Python does not require an OOP style of
programming, but it will almost certainly be the case that you just
naturally start using OOP techniques as you learn Python.

Thanks for the input!
Jul 14 '07 #19

On 2007-07-13, Wayne Brehaut <wb******@mcsnet.ca> wrote:
So Kay actually invented NONE of the concepts that make a PL an
OOPL. He only stated the concepts concisely and named the
result OOP,
Naming and categorizing something shouldn't be underestimated as
an accomplishment, though. The exercise can have profound
results. For example, consider "marriage." ;)
Under a claim of Academic Impunity (or was that "Immunity"),
here's another historical tid-bit. In a previous employment we
once had a faculty applicant from CalTech who knew we were
using Simula as our introductory and core language in our CS
program, so he visited Xerox PARC before coming for his
interview. His estimate of Alan Kay and Smalltalk at that time
(early 80s) was that "They wanted to implement Simula but
didn't understand it--so they invented Smalltalk and now don't
understand _it_!"
Heh, heh. Thanks for the interesting info.

--
Neil Cerutti
Jul 14 '07 #20

Aahz wrote:
In article <f7********@news4.newsguy.com>,
Chris Carlen <cr***************@BOGUSsandia.gov> wrote:
>>From what I've read of OOP, I don't get it.

For that matter, even using OOP a bit with C++ and Perl, I didn't get it
until I learned Python.
>The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988.

Newbie. ;-)

(I started with BASIC in 1976.)
Newbie ;-)

(I started with Algol 60 in 1967).
>Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.

My experience is that learning GUI programming is difficult. Moreover,
GUI programming in C involves a lot of boilerplate that can be automated
more easily with Python. So I think this will be a better solution.
I used to write in C for the SunView platform (back in the days when the
GUI was integrated into the kernel as the only way to get acceptable
speed on the display). From what I remember, "Hello World" took about 40
lines.

The immense (relatively speaking: this was 1985) size of the libraries
required was one of the primary justifications for implementing shared
libraries.
Note very very carefully that Python does not require an OOP style of
programming, but it will almost certainly be the case that you just
naturally start using OOP techniques as you learn Python.
That's very true. I still use a lot of (perhaps too much) procedural
coding, but driving the object-oriented libraries is a great way for a
noob to get started in OOP.

regards
Steve
--
Steve Holden +1 571 484 6266 +1 800 494 3119
Holden Web LLC/Ltd http://www.holdenweb.com
Skype: holdenweb http://del.icio.us/steve.holden
--------------- Asciimercial ------------------
Get on the web: Blog, lens and tag the Internet
Many services currently offer free registration
----------- Thank You for Reading -------------

Jul 14 '07 #21

On 7/13/07, John Nagle <na***@animats.com> wrote:
You can sometimes get better performance in C++ than in C, because C++
has "inline". Inline expansion happens before optimization, so you
can have abstractions that cost nothing.
C99 has that too.
Python is a relatively easy language, easier than C++, Java,
or even Perl. It's quite forgiving. The main implementation,
CPython, is about 60x slower than C, though, so if you're trying
to implement, say, a rapidly changing digital oscilloscope display,
the result may be sluggish.
But if the data for that oscilloscope comes from an external device
connected via a serial port, execution speed won't matter.
--
mvh Björn
Jul 14 '07 #22

Chris Carlen <cr***************@BOGUSsandia.gov> wrote:
From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.
OOP can be abused (particularly with deep or intricate inheritance
structures). But the base concept is simple and clear: you can bundle
state and behavior into a stateful "black box" (of which you may make as
many instances, with independent state but equal behavior, as you need).
Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
I'm an MS in EE (minoring in computer engineering) by training (over a
quarter century ago:-); I "slid" into programming kind of inexorably
(fate obviously wanted me to:-) but paradigms such as OOP (and
functional programming, but that's another subject) always made sense to
me *in direct analogy to my main discipline*. A JK flip-flop and a D
flip-flop are objects with 1-bit states and different behavior; I may
put in my circuit as many (e.g.) J-K flip-flops as I need, and each will
have separate state, even though each will have identical behavior (how
it responds to signals on the J and K lines). I don't need to think
about how a J-K flip-flop is *made*, inside; I use it as a basic
component in designing richer circuits (well, I did back when I DID
design circuits, but I haven't _totally_ forgotten:-). I do know how to
make one in terms of transistors, should I ever need to (well, maybe I'd
have to look it up, but I _used_ to know:-), but such a need is unlikely
to arise, because it's likely to be there as a basic component in
whatever design library I'm supposed to use for this IC.
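That analogy translates into Python almost line for line. Here is a
sketch of a J-K flip-flop as a class, following the standard truth table
(hold/set/reset/toggle) - the class name and method API are my own
invention:

```python
class JKFlipFlop:
    """A 1-bit 'machine': every instance has the same behavior
    but its own independent state, like a part from a design library."""
    def __init__(self):
        self.q = 0

    def clock(self, j, k):
        """Apply one clock edge with inputs J and K."""
        if j and k:
            self.q ^= 1      # J=1, K=1: toggle
        elif j:
            self.q = 1       # J=1, K=0: set
        elif k:
            self.q = 0       # J=0, K=1: reset
        # J=0, K=0: hold
        return self.q

ff1, ff2 = JKFlipFlop(), JKFlipFlop()
ff1.clock(1, 1)   # toggles ff1 to 1; ff2's state is untouched
```

Users of the class never look inside clock(), just as a circuit designer
drops in a flip-flop without redrawing its transistors.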

Components much richer than J-K flip-flops are obviously more common
nowadays, but remember my real-world experience designing chips is from
the early '80s;-). Nevertheless the concept of a "bundle of state and
behavior" is still there -- and a direct, immediate analogy to OOP.
(Functional programming, OTOH, is analogous to stateless input-output
transformation circuits, an even more basic concept in HW design:-). If
anything, it's the concept of "procedural programming" that has no
direct equivalent in HW design (unless you consider microcode "HW", and,
personally, I don't;-). [[Fortunately as a part of the CE minor I did
learn Fortran, Lisp and Pascal, and a few machine-languages too, so I
wasn't totally blown away when I found myself earning a living by
programming rather than by designing chips, but that's another
story:-)]]
Alex
Jul 14 '07 #23

Chris Carlen wrote:
Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP.
I've also found articles critical of Darwinism--but we can chalk that up
to religious zealotry, can't we?

Any gui more complicated than a few entry fields and some checkbuttons
is going to lend itself to OOP--so if you want to do GUI, learn OOP. The
time you spend learning OOP will be about 1/10th the time required to
debug a modestly complicated gui. This is especially true of guis that
require real-time feedback behavior.

If you just want to enter some values and set some flags and then hit
"go", you could always program the GUI in HTML and have a cgi script
process the result. This has a lot of benefits that are frequently
overlooked but tend to be less fun than using a bona-fide toolkit like
WX or QT.

James
Jul 14 '07 #24

"Aahz" <aahz@pyt...aft.com> wrote:
Newbie. ;-)

(I started with BASIC in 1976.)
*grinz @ Newbie*

I was writing COBOL and NEAT/3 in 1968...

- Hendrik

Jul 14 '07 #25

On Jul 14, 8:49 am, James Stroud <jstr...@mbi.ucla.edu> wrote:
>
Any gui more complicated than a few entry fields and some checkbuttons
is going to lend itself to OOP--so if you want to do GUI, learn OOP.
Yep, there is nothing to be added to that. Except maybe that if you
don't care too much about the look&feel you may consider starting with
Tkinter.
Pros:

1. it is part of the standard library, and you already have it;
2. it is possibly the easiest/simplest GUI out there;
3. it runs pretty much everywhere with minimal fuss.
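To give an idea of the "minimal fuss": a complete Tkinter window takes
only a few lines. The widget classes below are the real Tkinter API, but
the window itself is just a sketch (the module is `Tkinter` on the
Python 2 of this thread's era, `tkinter` today):

```python
import tkinter as tk   # "import Tkinter as tk" on Python 2

def build():
    """Assemble a trivial window and return its root widget."""
    root = tk.Tk()
    root.title("Hello")
    tk.Label(root, text="Hello from Tkinter").pack(padx=20, pady=20)
    tk.Button(root, text="Quit", command=root.destroy).pack(pady=5)
    return root

# build().mainloop() would show the window and start the event loop.
```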

Michele Simionato

Jul 14 '07 #26

[Chris Carlen]
From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these
articles.
If you want to know the truth, and opt neither to trust a friend or
colleague, nor to spend the time to try it yourself, here's a third way:

Compile Qt (a toolkit like wx or Tk) and watch the list of source file
names scroll past. Beautiful! Perhaps there's some better way of
doing GUIs, but watching that list of source files, one realises that
that's an academic question: practically, OOP fits GUIs -- and much of
the other code in Qt -- so well, and so much effort has been put into
these GUI toolkit libraries, that one would be a fool not to use them
right now. A somewhat separate issue: You'd also be a fool not to
apply OOP to the GUI code *you* write *using* one of those OOP GUI
toolkits. Though you won't learn that all at once or without
conscious effort, that's not an obstacle with Python -- you can start
small.

Of course there's some level of experience / project size / project
longevity / number of people involved below which dashing it off using
what you know right now will be quicker, but the break-even point is
not far off in your case, I think.
[chris]
However, those articles were no more objective than the descriptions
of OOP I've read in making a case. Ie., what objective
data/studies/research indicates that a particular problem can be
solved more quickly by the programmer, or that the solution is more
efficient in execution time/memory usage when implemented via OOP
vs. procedural programming?
[bruno]
None. Definitely. wrt/ developer time and memory, it's mostly a
matter of whether it fits your brain. If it does, you'll find it easier, else
[...]

How do we have confidence that that's true without doing experiments?
AFAIK, only a few such experiments have been done (not counting
research that does not meet basic standards of competence or is not
peer-reviewed).

I think some programming techniques are simply better than others for
certain tasks, even when including the variation in people's abilities
(but excluding the cost of people learning those techniques, which can
of course be significant). Of course, measurement is tricky because
of differences between programmers, but it's not impossible.
John
Jul 14 '07 #27

aa**@pythoncraft.com (Aahz) writes:
[...]
Note very very carefully that Python does not require an OOP style of
programming,
agree

but it will almost certainly be the case that you just
naturally start using OOP techniques as you learn Python.
There's some truth to this. But stagnation is also very easy to
achieve, without conscious effort to improve.

Also, reading OOP books (and this list) is still beneficial, both
before and after you've understood each concept: before because it
helps to learn new concepts at a faster rate, and to learn concepts
you'd otherwise miss; after because it helps "clean up" and extend
your understanding and because it teaches you standard names for
things, helping communication.
John
Jul 14 '07 #28

On 7/14/07, Alex Martelli <al***@mac.com> wrote:
>
OOP can be abused (particularly with deep or intricate inheritance
structures). But the base concept is simple and clear: you can bundle
state and behavior into a stateful "black box" (of which you may make as
many instances, with independent state but equal behavior, as you need).
Many years ago (86??) Wegner wrote a paper in OOPSLA called Dimensions
of Object Orientation in which he called the 'base concept' of 'bundle
of state and behavior' 'object-based' programming, and
'object-oriented' object-based + inheritance.

What Alex is saying is (in effect) that object-based is simple and
clear (and useful) whereas the object-orientation is subject to abuse.

This anyway is my experience: C++ programmers are distinctly poorer
programmers than C programmers -- for some strange reason closeness to
the machine has a salutary effect whereas the encouragement of
uselessly over-engineered programs makes worse programmers.

GUI is one of those cases wherein inheritance actually helps people
produce better code but this is something of an exception.

And even here one of the most widely used popularisers of GUIs has
been VB which was (at least initially) not object-oriented. VB shows
that language orientation -- tailoring 'the language' of drag-n-drop
to GUI-building and not just GUI-use -- wins over OOP.
Ruby/Rails is another example of language-oriented programming though
I feel it goes too far in (de)capitalizing, pluralizing,
(de)hyphenizing etc towards 'readability'.
[Sorry if this offends some people -- just my view!]

And this makes me wonder: It seems that Tkinter, wxpython, pygtk etc
are so much more popular among pythonistas than glade, dabo etc.

Why is this?
Jul 14 '07 #29

On Fri, 13 Jul 2007 20:37:04 -0400, Steve Holden <st***@holdenweb.com>
wrote:
>Aahz wrote:
>In article <f7********@news4.newsguy.com>,
Chris Carlen <cr***************@BOGUSsandia.gov> wrote:
>>>From what I've read of OOP, I don't get it.

For that matter, even using OOP a bit with C++ and Perl, I didn't get it
until I learned Python.
>>The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988.

Newbie. ;-)

(I started with BASIC in 1976.)
Newbie ;-)

(I started with Algol 60 in 1967).
Newbie ;-)

(I started with Royal McBee LGP 30 machine language (hex input) in
1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
to (DEC-10) Simula-67.)

Going---going---
>>Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.

My experience is that learning GUI programming is difficult. Moreover,
GUI programming in C involves a lot of boilerplate that can be automated
more easily with Python. So I think this will be a better solution.
I used to write in C for the SunView platform (back in the days when the
GUI was integrated into the kernel as the only way to get acceptable
speed on the display). From what I remember, "Hello World" took about 40
lines.

The immense (relatively speaking: this was 1985) size of the libraries
required was one of the primary justifications for implementing shared
libraries.
>Note very very carefully that Python does not require an OOP style of
programming, but it will almost certainly be the case that you just
naturally start using OOP techniques as you learn Python.

That's very true. I still use a lot of (perhaps too much) procedural
coding, but driving the object-oriented libraries is a great way for a
noob to get started in OOP.

regards
Steve
Jul 14 '07 #30

quoth the Wayne Brehaut:
(I started with Royal McBee LGP 30 machine language (hex input) in
1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
to (DEC-10) Simula-67.)

Going---going---
Mel? Is that you?

http://www.pbm.com/~lindahl/mel.html

-d
--
darren kirby :: Part of the problem since 1976 :: http://badcomputer.org
"...the number of UNIX installations has grown to 10, with more expected..."
- Dennis Ritchie and Ken Thompson, June 1972
Jul 14 '07 #31

On Sat, 14 Jul 2007 19:18:05 +0530, "Rustom Mody"
<ru*********@gmail.com> wrote:
>On 7/14/07, Alex Martelli <al***@mac.com> wrote:
>>
OOP can be abused (particularly with deep or intricate inheritance
structures). But the base concept is simple and clear: you can bundle
state and behavior into a stateful "black box" (of which you may make as
many instances, with independent state but equal behavior, as you need).

Many years ago (86??) Wegner wrote a paper in OOPSLA called Dimensions
of Object Orientation in which he called the 'base concept' of 'bundle
of state and behavior' as 'object based' programming and
'object-oriented' as object-based + inheritance.
Not quite--according to him:

object-based + classes = class-based
class-based + class inheritance = object-oriented

I.e., "object-oriented = objects + classes + inheritance".

This was not the, by then, standard definition: to be OO would require
all four of:

1. modularity (class-based? object-based?)
2. inheritance (sub-classing)
3. encapsulation (information hiding)
4. polymorphism ((sub-) class-specific response to a message, or
processing of a method)
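All four requirements fit in a few lines of (modern) Python, for the
sake of illustration - the shape classes here are invented:

```python
import math

class Shape:                      # 1. modularity: state + behavior in one unit
    def __init__(self, name):
        self._name = name         # 3. encapsulation: _name is private by convention

    def area(self):
        raise NotImplementedError

class Square(Shape):              # 2. inheritance: Square is a subclass of Shape
    def __init__(self, side):
        super().__init__("square")
        self.side = side

    def area(self):               # 4. polymorphism: subclass-specific response
        return self.side * self.side

class Circle(Shape):
    def __init__(self, radius):
        super().__init__("circle")
        self.radius = radius

    def area(self):
        return math.pi * self.radius ** 2

# One message ("area"), two class-specific behaviors:
areas = [shape.area() for shape in (Square(2), Circle(1))]
```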

Unfortunately, most of the "definitions" (usually just hand-waving,
loosey-goosey descriptions) found on the web include none--or only one
or two--of these fundamental requirements by name, and are so loose
that almost any programming paradigm or style would be OO.
>What Alex is saying is (in effect) that object-based is simple and
clear (and useful) whereas the object-orientation is subject to abuse.
But OO is also simple and clear (if clearly defined and explained and
illustrated and implemented), and ANY programming style is subject to
abuse. During the hey-day of Pascal as an introductory programming
language (as often misused as more than that) I found that many
students spent much of their time defining the data types their program
would use.
>This anyway is my experience: C++ programmers are distinctly poorer
programmers than C programmers -- for some strange reason closeness to
the machine has a salutary effect whereas the encouragment of
uselessly over-engineered programs makes worse programmers.
But this is a tautology: "over-engineered" programs are, by definition
or terminology, not a good thing--independent of what PL or style
they're finally implemented in (assuming that by "engineering" you
mean "design" or similar). Many of my Pascal students over-engineered
their solutions to simple problems too.
>GUI is one of those cases wherein inheritance actually helps people
produce better code but this is something of an exception.
This seems to imply that the list of applications you have in mind or
have worked on includes fewer domains that might profit from full OO
instead of just OB. My guess is that there are many application
domains in which analysts and programmers often think in an "OO way",
but implement in just an OB way because of the PL they or their
employer requires or prefers: in some--perhaps many--of these cases
they have to do "manually" what OO would have automated.

There is a problem, though, of (especially university and college)
education and training in OOP "talking about" how glorious OO is, and
requiring students to use OO techniques whether they're most
appropriate or not (the "classes early" pedagogical mindset). And
this problem is compounded by teaching introductory programming using
a language like Java that requires one to use an OO style for even
trivial programs. And by using one of the many very similar
introductory textbooks that talk a lot about OO before actually getting
started on programming, so students don't realize how trivial a
program is required to solve a trivial problem, and hence look for
complexity everywhere--whether it exists or not--and spend a lot of
time supposedly reducing the complexity of an already simple problem
and its method of solution.

But as I noted above, a similar problem occurred with the crop of
students who first learned Pascal: they often spent much of their time
defining the data types their program would use, just as OO
(especially "classes early") graduates tend to waste time
"over-subclassing" and developing libraries of little-used classes.

The answer is always balance, and having an extensive enough toolkit
that one is not forced or encouraged to apply a programming model that
isn't appropriate and doesn't help in any way (including
maintainability). And starting with a language that doesn't brainwash
one into believing that the style it enforces or implies is always the
best--and textbooks that teach proper choice of programming style
instead of rigid adherence to one.

wwwayne
Jul 14 '07 #32

On Sat, 14 Jul 2007 11:49:48 -0600, darren kirby
<bu******@badcomputer.org> wrote:
>quoth the Wayne Brehaut:
>(I started with Royal McBee LGP 30 machine language (hex input) in
1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
to (DEC-10) Simula-67.)

Going---going---

Mel? Is that you?

http://www.pbm.com/~lindahl/mel.html
Ha-ha! Thanks for that!

Although I'm not Mel, the first program I saw running on the LGP-30
was his Blackjack program! In 1958 I took a Numerical Methods course
at the University of Saskatchewan, and we got to program Newton's
forward difference method for the LGP-30. Our "computer centre tour"
was to the attic of the Physics building, where their LGP-30 was
networked to a similar one at the University of Toronto (the first
educational computer network in Canada!), and the operator played a
few hands of Blackjack with the operator there to illustrate how
useful computers could be.

A few years later, as a telecommunications officer in the RCAF, I
helped design (but never got to teach :-( ) a course in LGP-30
architecture and programming using both ML and ACT IV AL, complete
with paper tape input and Charactron Tube
(http://en.wikipedia.org/wiki/Charactron) output--handy, since this
display was also used in the SAGE system.

We weren't encouraged to use card games as examples, so used
navigational and tracking problems involving fairly simple
trigonometry.

wwwayne
>-d
Jul 14 '07 #33

On 7/13/07, Simon Hibbs <si*********@gmail.com> wrote:
place. At the end of it you'll have a good idea how OOP works, and how
Python works. Learning OOP this way is easy and painless, and what you
...

But this tutorial states "I assume you know how object-oriented
programming works"

--
Sebastián Bassi (セバスティアン)
Diplomado en Ciencia y Tecnología.
GPG Fingerprint: 9470 0980 620D ABFC BE63 A4A4 A3DE C97D 8422 D43D
Jul 15 '07 #34

On Jul 13, 3:20 pm, Wayne Brehaut <wbreh...@mcsnet.ca> wrote:
On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers

<bdesth.quelquech...@free.quelquepart.fr> wrote:
Chris Carlen a écrit :
Hi:
From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

=== 8< ===
Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each object
is a (simulation of) a small machine.

Oh you young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
adherents everywhere!

As a few more enlightened have noted in more than one thread here, the
Mother of All OOP was Simula (then known as SIMULA 67). All Alan Kay
did was define "OOPL", but then didn't notice (apparently--though this
may have been a "convenient oversight") that Simula satisfied all the
criteria so was actually the first OOPL--and at least 10 years earlier
than Smalltalk!

So Kay actually invented NONE of the concepts that make a PL an OOPL.
He only stated the concepts concisely and named the result OOP, and
invented yet another implementation of the concepts-- based on a
LISP-like functional syntax instead of an Algol-60 procedural syntax,
and using message-passing for communication amongst objects (and
assumed a GUI-based IDE) (and introduced some new terminology,
especially use of the term "method" to distinguish class and instance
procedures and functions, which Simula hadn't done).

As Randy Gest notes on http://www.smalltalk.org/alankay.html, "The
major ideas in Smalltalk are generally credited to Alan Kay with many
roots in Simula, LISP and SketchPad." Too many seem to assume that
some of these other "features" of Smalltalk are part of the definition
of an OOP, and so are misled into believing the claim that it was the
first OOPL. Or they claim that certain deficiencies in Simula's object
model--as compared to Smalltalk's--somehow disqualifies it as a "true
OOPL", even though it satisfies all the criteria as stated by Kay in
his definition. See http://en.wikipedia.org/wiki/Simula and related
pages, and "The History of Programming Languages I (HOPL I)", for
more details.

Under a claim of Academic Impunity (or was that "Immunity"), here's
another historical tid-bit. In a previous employment we once had a
faculty applicant from CalTech who knew we were using Simula as our
introductory and core language in our CS program, so he visited Xerox
PARC before coming for his interview. His estimate of Alan Kay and
Smalltalk at that time (early 80s) was that "They wanted to implement
Simula but didn't understand it--so they invented Smalltalk and now
don't understand _it_!"

wwwayne

=== 8< ===
A couple of notes on this post.

Alan Kay has always publicly credited Simula as the direct inspiration
for Smalltalk, and if you know the man and his work, this implication
of taking credit for the first OOP language is not true, it is a
credit assigned to him by others, and one which he usually rights when
confronted with it.

You may be confused with the fact that "object oriented
programming" was a term which I believe was first used by Alan and his
group at PARC, so perhaps the coining of the term is what is being
referenced by others.

Perhaps I'm mistaken, but the tone of your post conveys an animosity
that did not exist between the original Smalltalk and Simula
inventors; Nygaard and Kay were good friends, and admired each others'
work very much.
Bonnie MacBird
Jul 15 '07 #35

On Sun, 15 Jul 2007 07:47:20 -0000, bo****@macbird.com wrote:
>On Jul 13, 3:20 pm, Wayne Brehaut <wbreh...@mcsnet.ca> wrote:
>On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers

<bdesth.quelquech...@free.quelquepart.fr> wrote:
>Chris Carlen a écrit :
Hi:
> From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

=== 8< ===
=== 8< ===
>Under a claim of Academic Impunity (or was that "Immunity"), here's
another historical tid-bit. In a previous employment we once had a
faculty applicant from CalTech who knew we were using Simula as our
introductory and core language in our CS program, so he visited Xerox
PARC before coming for his interview. His estimate of Alan Kay and
Smalltalk at that time (early 80s) was that "They wanted to implement
Simula but didn't understand it--so they invented Smalltalk and now
don't understand _it_!"

wwwayne

=== 8< ===

A couple of notes on this post.

Alan Kay has always publicly credited Simula as the direct inspiration
for Smalltalk, and if you know the man and his work, this implication
of taking credit for the first OOP language is not true, it is a
credit assigned to him by others, and one which he usually rights when
confronted with it.
I know this, and was perhaps a little too flippant in my all-inclusive
statement "self-serving propaganda from Xerox PARC, Alan Kay, and
Smalltalk adherents everywhere!", for which I apologize. But it was
made with humorous intent, as I had hoped the opening "Oh you
young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda..." would imply.

A more accurate and unhumorous statement of my opinion is that it is
Smalltalk adherents who know virtually nothing of the history of
OOP--and even some who do--who did and still do make such claims,
both personally and in the published literature of OOP.

And my statement about a prospective faculty member's opinion was just
that: a historical anecdote, and the expression of an early 80s
opinion by a professional CS professor and researcher in formal
semantics (which may have been part of his distrust of the Smalltalk
team's "understanding" of Smalltalk). The opinion he expressed was
his and not my own, and I was just recording (what I thought might
be) an amusing anecdote in a context in which I thought it
appropriate: discussion of what OOP is, and after Bruno made the
claim: "OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept." I don't think my
recording it here should be construed as my opinion of either
Smalltalk or its creators (at that time or now).

As often happens in many arenas, the creator of an idea can lose
control to the flock, and many publications can get accepted if
referees themselves don't know the facts or take care to check them
before recommending publication--which probably explains why so many
publications (especially in conference proceedings) on OOP in the 80s
and 90s completely omitted any mention of Simula: so much so that I
once intended writing a paper on "Ignorance of Simula Considered
Harmful."

On the other hand, anything you may have inferred about my distaste
for those who don't bother to learn anything of the history of a
subject, then make false or misleading claims, and don't bother to
correct themselves when questioned, is true.
>You may be confused with the fact that "object oriented
programming"was a term which I believe was first used by Alan and his
group at PARC, so perhaps the coining of the term is what is being
referenced by others.
No, I have been at more than one CS (or related area) conference where
a Smalltalk aficionado has stated unequivocally that Kay invented OOP
and that Smalltalk was the first OOPL. The last I recall for sure was
WebNet 2000, where a (quite young) presenter on Squeak made that
statement, and was not at all certain what Simula was when I asked
whether it might actually have been the first more than 10 years
before Smalltalk 80. So his claim, and that of many others,
explicitly or implicitly, is that not only the term, but most (or all)
of the concept, and (often) the first implementation of OOP was by Kay
and his Xerox PARC team in Smalltalk 80.
>Perhaps I'm mistaken, but the tone of your post conveys an animosity
that did not exist between the original Smalltalk and Simula
inventors; Nygaard and Kay were good friends, and admired each others'
work very much.
Yes, you are very much mistaken (as I note above), and appear not to
have understood the intended humorous tone of my posting.

wwwayne
>
Bonnie MacBird
Jul 15 '07 #36

On Jul 13, 5:06 pm, Chris Carlen <crcarleRemoveT...@BOGUSsandia.gov>
wrote:
Hi:
Christopher
>
Problem:

1. How to most easily learn to write simple PC GUI programs that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.
Have you also tried looking for a cross-platform GUI program that has a
scripting interface that you might adapt? If one is found, the extra
scripting needed may be reduced.

- Paddy.

Jul 15 '07 #37

In article <om******************@newssvr11.news.prodigy.net>,
James Stroud <js*****@mbi.ucla.edu> wrote:
>
If you just want to enter some values and set some flags and then hit
"go", you could always program the GUI in HTML and have a cgi script
process the result. This has a lot of benefits that are frequently
overlooked but tend to be less fun than using a bona-fide toolkit like
WX or QT.
This is excellent advice worth emphasizing -- but then, I make my living
working on a web app. ;-)
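(A sketch of the form-handling side, for anyone curious: the field names
`port` and `baud` are invented for the example, and the `urllib.parse`
spelling is the modern one -- in 2007 this would have been the `cgi`
module.)

```python
from urllib.parse import parse_qs

def handle_form(query_string):
    # Parse a submitted HTML form, e.g. "port=/dev/ttyS0&baud=115200",
    # into the config values the program needs. Defaults are arbitrary.
    fields = parse_qs(query_string)
    port = fields.get("port", ["COM1"])[0]
    baud = int(fields.get("baud", ["9600"])[0])
    return {"port": port, "baud": baud}

print(handle_form("port=/dev/ttyS0&baud=115200"))
# {'port': '/dev/ttyS0', 'baud': 115200}
```

The browser supplies the whole GUI for free; the script only has to parse
the query string and act on it.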
--
Aahz (aa**@pythoncraft.com) <* http://www.pythoncraft.com/

I support the RKAB
Jul 15 '07 #38

Wayne Brehaut a écrit :
On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers
<bd*****************@free.quelquepart.fr> wrote:
>Chris Carlen a écrit :
>>Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

=== 8< ===
>>Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each object
is a (simulation of) a small machine.

Oh you young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
adherents everywhere!
Not feeling concerned.

(snip pro-simula/anti-Xerox propaganda).
Jul 16 '07 #39

Wayne Brehaut a écrit :
(snip)
after Bruno made the
claim: "OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept."
Please reread more carefully the above. I do give credit to Smalltalk's
author for the *term* "OOP", and *most* (not *all*) of the concepts (I
strongly disagree with your opinion that message-passing is not a core
concept of OO).

FWIW, I first mentioned Simula too (about the state-machine and
simulation aspect), then snipped this mention because I thought it was
getting a bit too much OT - we're not on comp.object here.

Jul 16 '07 #40

You are lucky. Our project is a cross-platform cluster computer
management system that runs on both Windows and Linux:
http://pluster.gf.cs.hit.edu.cn/
Let me tell you how we solved these problems.
>
1. How to most easily learn to write simple PC GUI programs that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.
We use Tk for the GUI, and we have an interpreter that reads a VB form
file (".frm") in and displays it with Tk. You just need to draw the
forms in VB, save them in .frm format, and load them in your Python
file:
LoadForm("aa.frm")
After that you can use the buttons, menus, and so on from Python.

We use XML-RPC to communicate with the remote nodes. XML-RPC is very
cool because you can invoke a function on the remote side the same way
you invoke a local method. For example, if we have a remote object foo:
foo.bar() # invokes bar() on the remote side
But XML-RPC works over the network; I'm not sure it can work over
serial.
2. Must be cross-platform: Linux + Windows. This factor can have a big
impact on whether it is necessary to learn a new language, or stick with
C. If my platform was only Linux I could just learn GTK and be done
with it. I wouldn't be here in that case.
and, most important, XML-RPC is cross-platform: you can use Linux for
the server and Windows for the client.
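(For the record, a minimal XML-RPC round trip looks something like
this -- written with the modern `xmlrpc` module names; the 2007-era
spellings were `SimpleXMLRPCServer` and `xmlrpclib`. The function name
`bar` and the greeting are made up:)

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Server side: bind an ephemeral port and register a function as "bar".
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
port = server.server_address[1]
server.register_function(lambda name: "hello, " + name, "bar")
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the remote call reads exactly like a local one.
foo = ServerProxy("http://127.0.0.1:%d" % port)
result = foo.bar("node1")
print(result)  # hello, node1

server.shutdown()
```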
Jul 16 '07 #41

On Mon, 16 Jul 2007 10:10:05 +0200, Bruno Desthuilliers
<br********************@wtf.websiteburo.oops.com> wrote:
>Wayne Brehaut a écrit :
(snip)
after Bruno made the
claim: "OO is about machines - at least as conceveid by Alan Key, who
invented the term and most of the concept."

Please reread more carefully the above. I do give credit to Smalltalk's
author for the *term* "OOP", and *most* (not *all*) of the concepts (I
strongly disagree with your opinion that message-passing is not a core
concept of OO).
One problem is that it's often not clear what lists of properties are
his definition of OOP vs. what are the intended properties of
Smalltalk--his intended implementation of OOP. Many of the lists begin
with the basic requirements that "everything is an object" and
"objects communicate by message passing", but the most common
"generally agreed upon" definition abstracts just four requirements
from these (changing) lists--attempting to separate implementation
details from what is essential to the underlying framework. As I note
below, these were:

1. modularity (class-based? object-based?)
2. inheritance (sub-classing)
3. encapsulation (information hiding)
4. polymorphism ((sub-) class-specific response to a message, or
processing of a method)
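(The four properties can be sketched in a dozen lines of Python; the
`Sensor` classes here are invented purely for illustration:)

```python
class Sensor:                    # 1. modularity: state and behaviour bundled together
    def __init__(self, reading):
        self._reading = reading  # 3. encapsulation: the underscore marks internal state
    def describe(self):
        return "raw=%s" % self._reading

class TempSensor(Sensor):        # 2. inheritance: TempSensor is a subclass of Sensor
    def describe(self):          # 4. polymorphism: subclass-specific response
        return "temp=%sC" % self._reading

for s in (Sensor(42), TempSensor(21)):
    print(s.describe())          # raw=42, then temp=21C
```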

Other details in Kay's lists are considered implementation details,
and important advances or alternatives to previous methods, but not
required for a language to _be_ OO. It is reputed, though, that in
2003 Kay said
(http://c2.com/cgi/wiki?AlanKaysDefin...ObjectOriented) "OOP to
me means only messaging, local retention and protection and hiding of
state-process, and extreme LateBinding of all things."

So I understand your accepting one of Kay's lists as being a
definition of OOP instead of "just" a description of Smalltalk, or of
accepting this fairly recent "definition" as being the true one (as
opposed to the previous lists of usually 6 properties). "It's hard to
hit a moving target!"
>FWIW, I first mentionned Simula too (about the state-machine and
simulation aspect), then sniped this mention because I thought it was
getting a bit too much OT - we're not on comp.object here.
Understood--sort of--but there is sufficient accurate information
about Simula available on the web now that it's no longer necessary to
use quotes from Kay about OOP and Smalltalk just because they're more
accessible, as used to be the case. What would be so OT about
referring to Simula in one sentence instead of or in addition to
Smalltalk?

But I digress--my only real objection to your post was your opinion
and claim that Kay "invented the term and most of the concept": I've
never seen anyone claim that anyone else invented the term, but for
the claim that he invented "most of the concept" we need only refer to
Nygaard's claim in "How Object-Oriented Programming Started" at
http://heim.ifi.uio.no/~kristen/FORS..._OO_start.html
that "Simula 67 introduced most of the key concepts of object-oriented
programming: both objects and classes, subclasses (usually referred to
as inheritance) and virtual procedures, combined with safe referencing
and mechanisms for bringing into a program collections of program
structures described under a common class heading (prefixed blocks)."

Combine this with the fact--as stated above by Bonnie MacBird (Alan
Kay's significant other)--that "Alan Kay has always publicly credited
Simula as the direct inspiration for Smalltalk, and... this
implication of taking credit for the first OOP language is not true,
it is a credit assigned to him by others, and one which he usually
rights when confronted with it." If he acknowledges this perhaps
others should too?

As has been noted before, it's often the fact that a cause becomes a
religion: true believers tend to take it over from the originator, and
this religiosity tends to blind them from the facts. Opinions and
rumours become facts, stories are invented, definitions are changed or
twisted, and another religion is born! Even those who don't belong to
the religion come to believe the oft-repeated stories, and then help
spread and perpetuate them. (Continuing in my original humorous vein I
was tempted to use terms like "religious zealots", "false gospel",
"propaganda", etc., but thought better of it in case I was again
misunderstood.)

Again, I disagree only with this one claim. You make significant
contributions to the group and to elucidating Python and OOP to the
great unwashed: in contrast, all I've done so far is complain about
those who don't accept the correct (i.e., my) definition or use of
terms.

wwwayne
Jul 16 '07 #42

On Mon, 16 Jul 2007 09:55:35 +0200, Bruno Desthuilliers
<br********************@wtf.websiteburo.oops.com> wrote:
>Wayne Brehaut a écrit :
>On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers
<bd*****************@free.quelquepart.fr> wrote:
>>Chris Carlen a écrit :
Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

=== 8< ===
>>>Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each object
is a (simulation of) a small machine.

Oh you young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
adherents everywhere!

Not feeling concerned.

(snip pro-simula/anti-Xerox propaganda).
Or, more accurately, pro:

1. Nygaard & Dahl as the inventors of most of the concept of OOP
2. Simula as the first OOP
3. Kay as the originator of the term OOP
4. Kay, Xerox PARC, and Smalltalk as making significant useful
advances in implementation of OOP and "popularizing" it

and anti:

1. attributing credit for any accomplishment to someone who doesn't
himself claim it and even denies it

wwwayne o/o
Jul 16 '07 #43

Hendrik van Rooyen wrote:
"Chris Carlen" <crcarl,,,,dia.gov> wrote:
>>Form 2: Use Python and PySerial and TkInter or wxWidgets.
Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.
This is the way to go. - Trust me on this.
When you describe your history, it is almost an exact parallel to mine.
In my case, I have been doing real low level stuff (mostly 8031 assembler)
since 1982 or so. And then I found python in a GSM module (Telit), and
I was intrigued.
I really appreciate your comments on OO - it parallels a lot of what I feel
as there is a lot of apparent BS that does not seem to "do anything" at first
sight.
However- for the GUI stuff, there is an easily understood relationship between
the objects and what you see on the screen - so it's a great way of getting
into OO - as far as people like you and me will go with it, which is not very
far, as we tend to think in machine instructions...
And for what it's worth - you can programme assembler-like Python, and it also
works.

The best thing to do is just to spend a few days playing with say Tkinter.
I use a reference from the web written by John W Shipman at New Mexico
Tech - it is succinct and clear, and deserves more widespread publicity.

Google for it - I have lost the link, although I still have the pdf file.
[edit]

Thanks for the tip. The next poster provides the link, which I've got
bookmarked now.

The more I play with Python, the more I like it. Perhaps I will
understand OOP quicker than I thought. What I've learned so far about
names binding to objects instead of values stored in memory cells, etc.
has been interesting and fascinating.
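(That binding behaviour is easy to see in a few lines at the
interpreter:)

```python
a = [1, 2, 3]
b = a             # binds a second name to the *same* list object; no copy is made
b.append(4)
print(a)          # [1, 2, 3, 4] -- the change is visible through both names
print(a is b)     # True: one object, two names

b = [9]           # rebinding b leaves the object that a still names untouched
print(a)          # [1, 2, 3, 4]
```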

--
Good day!

________________________________________
Christopher R. Carlen
Principal Laser&Electronics Technologist
Sandia National Laboratories CA USA
cr***************@BOGUSsandia.gov
NOTE, delete texts: "RemoveThis" and
"BOGUS" from email address to reply.
Jul 16 '07 #44

Chris Carlen wrote:
Hendrik van Rooyen wrote:
> "Chris Carlen" <crcarl,,,,dia.gov> wrote:
>>Form 2: Use Python and PySerial and TkInter or wxWidgets.
Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.
This is the way to go. - Trust me on this.
When you describe your history, it is almost an exact parallel to mine.
In my case, I have been doing real low level stuff (mostly 8031 assembler)
since 1982 or so. And then I found python in a GSM module (Telit), and
I was intrigued.
I really appreciate your comments on OO - it parallels a lot of what I feel
as there is a lot of apparent BS that does not seem to "do anything" at first
sight.
However- for the GUI stuff, there is an easily understood relationship between
the objects and what you see on the screen - so it's a great way of getting
into OO - as far as people like you and me will go with it, which is not very
far, as we tend to think in machine instructions...
And for what it's worth - you can programme assembler-like Python, and it also
works.

The best thing to do is just to spend a few days playing with say Tkinter.
I use a reference from the web written by John W Shipman at New Mexico
Tech - it is succinct and clear, and deserves more widespread publicity.

Google for it - I have lost the link, although I still have the pdf file.
[edit]

Thanks for the tip. The next poster provides the link, which I've got
bookmarked now.

The more I play with Python, the more I like it. Perhaps I will
understand OOP quicker than I thought. What I've learned so far about
names binding to objects instead of values stored in memory cells, etc.
has been interesting and fascinating.
I'm happy you are proceeding with so little trouble. Without wishing to
confuse you, however, I should point out that this aspect of Python has
very little to do with its object-orientation. There was a language
called Icon, for example, 20 years ago, that used similar semantics but
wasn't at all object-oriented.

regards
Steve
--
Steve Holden +1 571 484 6266 +1 800 494 3119
Holden Web LLC/Ltd http://www.holdenweb.com
Skype: holdenweb http://del.icio.us/steve.holden
--------------- Asciimercial ------------------
Get on the web: Blog, lens and tag the Internet
Many services currently offer free registration
----------- Thank You for Reading -------------

Jul 16 '07 #45

Wayne Brehaut a écrit :
On Mon, 16 Jul 2007 10:10:05 +0200, Bruno Desthuilliers
<br********************@wtf.websiteburo.oops.com> wrote:

>>Wayne Brehaut a écrit :
(snip)
>>>after Bruno made the
claim: "OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept."

Please reread more carefully the above. I do give credit to Smalltalk's
author for the *term* "OOP", and *most* (not *all*) of the concepts (I
strongly disagree with your opinion that message-passing is not a core
concept of OO).


One problem is that it's often not clear what lists of properties are
his definition of OOP vs. what are the intended properties of
Smalltalk--his intended implementation of OOP. Many of the lists begin
with the basic requirements that "everything is an object" and
"objects communicate by message passing", but the most common
"generally agreed upon" definition abstracts just four requirements
from these (changing) lists--attempting to separate implementation
details from what is essential to the underlying framework. As I note
below, these were:

1. modularity (class-based? object-based?)
2. inheritance (sub-classing)
3. encapsulation (information hiding)
I don't see information hiding and encapsulation as being the very same
thing. But anyway...
4. polymorphism ((sub-) class-specific response to a message, or
processing of a method)
subclassing - and even classes - are not necessary for polymorphism. I
guess you have a good enough knowledge of Python and/or some
prototype-based OOPL to know why !-)
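(The point is easy to demonstrate in Python: polymorphism needs only a
shared method name -- duck typing -- not a shared base class. The
classes below are invented for illustration:)

```python
class Duck:
    def speak(self):
        return "quack"

class Robot:                 # unrelated class: no common base, no subclassing
    def speak(self):
        return "beep"

def announce(thing):
    # Polymorphic: works with any object that answers "speak",
    # regardless of its class or ancestry.
    return thing.speak() + "!"

print(announce(Duck()))      # quack!
print(announce(Robot()))     # beep!
```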
>
Other details in Kay's lists are considered implementation details,
and important advances or alternatives to pevious methods, but not
required for a language to _be_ OO. It is reputed, though, that in
2003 Kay said
(http://c2.com/cgi/wiki?AlanKaysDefin...ObjectOriented) "OOP to
me means only messaging, local retention and protection and hiding of
state-process, and extreme LateBinding of all things."

So I understand your accepting one of Kay's lists as being a
definition of OOP instead of "just" a description of Smalltalk, or of
accepting this fairly recent "definition" as being the true one
Is there any "true one" ?-)
(as
opposed to the previous lists of usually 6 properties). "It's hard to
hit a moving target!"
Indeed.
>
>>FWIW, I first mentionned Simula too (about the state-machine and
simulation aspect), then sniped this mention because I thought it was
getting a bit too much OT - we're not on comp.object here.


Understood--sort of--but there is sufficient accurate information
about Simula available on the web now that it's no longer necessary to
use quotes from Kay about OOP and Smalltalk just because they're more
accessible, as used to be the case. What would be so OT about
referring to Simula in one sentence instead of or in addition to
Smalltalk?
What I mean is that I felt my answer to be already OT enough so I snipped
large parts of it. FWIW, I could have snipped the reference to Alan Kay
and kept the one to Simula, but then it would have required more rewrite
work.
But I digress--my only real objection to your post was your opinion
and claim that Kay "invented the term and most of the concept":
I agree that the term "most" is perhaps a bit too strong. For my
defense, please keep in mind that I'm not a native English speaker, so I
often have a hard time expressing myself with the exact nuance I'd use in
French.

(snip)
>
As has been noted before, it's often the fact that a cause becomes a
religion:
Good Lord, save us from becoming religious !-)

Ok, I admit that I have my own understanding of OO (as anyone else, I
guess), which is quite closer to Smalltalk's model than to any other
OOPL (even Python). It probably has to do with the extremely
generalized and systematic application of two key concepts - objects and
messages - in such a way that it becomes a coherent whole - while most
mainstream OOPLs feel to me more like ad-hoc collection of arbitrary
rules and features. So yes, I'm probably guilty of being a bit too
impassioned here, and you're right to correct me. But have mercy and
take time to read a bit more of the offending post, I'm pretty confident
you won't find me guilty of mis-placed "religiosity".

(snip)
in contrast, all I've done so far is complain about
those who don't accept the correct (i.e., my) definition or use of
terms.
Lol ! I'm afraid this is something we're all guilty of one day or another...
Jul 16 '07 #46

Wayne Brehaut a écrit :
On Fri, 13 Jul 2007 20:37:04 -0400, Steve Holden <st***@holdenweb.com>
wrote:

>>Aahz wrote:
>>>In article <f7********@news4.newsguy.com>,
Chris Carlen <cr***************@BOGUSsandia.gov> wrote:
From what I've read of OOP, I don't get it.

For that matter, even using OOP a bit with C++ and Perl, I didn't get it
until I learned Python.
The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988.

Newbie. ;-)

(I started with BASIC in 1976.)

Newbie ;-)

(I started with Algol 60 in 1967).


Newbie ;-)

(I started with Royal McBee LGP 30 machine language (hex input) in
1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
to (DEC-10) Simula-67.)

My my my... Would you believe that my coworkers consider me an old
sage because I started programming in 1990 with HyperTalk on Mac
Classic !-)

I suddenly feel 20 again ! Woo !-)
Jul 16 '07 #47

Wayne Brehaut a écrit :
On Sat, 14 Jul 2007 19:18:05 +0530, "Rustom Mody"
<ru*********@gmail.com> wrote:

>>On 7/14/07, Alex Martelli <al***@mac.com> wrote:
>>>OOP can be abused (particularly with deep or intricate inheritance
structures). But the base concept is simple and clear: you can bundle
state and behavior into a stateful "black box" (of which you may make as
many instances, with independent state but equal behavior, as you need).

Many years ago (86??) Wegner wrote a paper in OOPSLA called Dimensions
of Object Orientation in which he called the 'base concept' of 'bundle
of state and behavior' as 'object based' programming and
'object-oriented' as object-based + inheritance.


Not quite--according to him:

object-based + classes = class-based
class-based + class inheritance = object-oriented

I.e., "object-oriented = objects + classes + inheritance".
What about prototype-based languages then ?-)
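(In Python, something prototype-ish can at least be faked with the
standard library -- an object built up directly, with state and
behaviour but no user-defined class or inheritance in sight.
`SimpleNamespace` is a modern-Python convenience, and the `moved`
helper is invented for the example:)

```python
import types

# Build an object directly: state without declaring any class of our own.
point = types.SimpleNamespace(x=1, y=2)

def moved(p, dx, dy):
    # "Behaviour" attached by convention rather than by class membership.
    return types.SimpleNamespace(x=p.x + dx, y=p.y + dy)

q = moved(point, 3, 4)
print(q.x, q.y)  # 4 6
```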
Jul 16 '07 #48

"Bruno Desthuilliers" <bdesth.qu....se@free.quelquepart.fr> wrote:
>My my my... Would you believe that my coworkers do consider me like an
old sage because I started programming in 1990 with HyperTalk on Mac
Classic !-)

I suddenly feel 20 again ! Woo !-)
*hands him a straw boater and a cane*

ok youngster - lets see you strut your stuff...

; - )


Jul 17 '07 #49

Steve Holden <st***@holdenweb.com>:
>I'm happy you are proceeding with so little trouble. Without wishing to
confuse you, however, I should point out that this aspect of Python has
very little to do with its object-orientation. There was a language
called Icon, for example, 20 years ago, that used similar semantics but
wasn't at all object-oriented.
Actually, there was a language called SNOBOL, 40 years ago, that used
similar semantics, developed by Griswold et al. Its object model was
remarkably similar to that of Python without classes. And it even had
dictionaries (called "tables") :-).

For an explaination of the concept "variable" in SNOBOL see
<http://www.cacs.louisiana.edu/~mgr/404/burks/language/snobol/catspaw/tutorial/ch1.htm#1.3>

SNOBOL's powerful patterns still shine, compared to Python's clumsy
regular expressions. I've used the language a lot in the past, first on
the mainframe (SPITBOL on System/360), later on the PC (Catspaw's SNOBOL4
& SPITBOL). When I switched to Python, it wasn't because of the
expressiveness of the language, but because of the rich library ("batteries
included") and the IMO elegant syntax, i.e. blocks by indentation.
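(To make the comparison concrete, here is roughly what the Python side
of that trade-off looks like -- `re.VERBOSE` at least lets a regex be
annotated the way a SNOBOL pattern reads; the pattern itself is invented
for illustration:)

```python
import re

# SNOBOL patterns are composed from named parts; the nearest Python
# idiom is the re module's verbose mode with named groups.
pattern = re.compile(r"""
    (?P<key>\w+)      # identifier
    \s* = \s*         # equals sign, optional surrounding spaces
    (?P<value>\d+)    # integer value
""", re.VERBOSE)

m = pattern.match("retries = 42")
print(m.group("key"), m.group("value"))  # retries 42
```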

<http://en.wikipedia.org/wiki/SNOBOL>
<http://en.wikipedia.org/wiki/Ralph_E._Griswold>

Icon came later. Griswold developed Icon as a successor to SNOBOL,
constructing it around the concept of generators and co-expressions. I
didn't like it.
--
Thank you for observing all safety precautions
Jul 17 '07 #50
