I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).
- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain 3)
when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels).
Yet for the first time I get (most) of my questions answered by a language I
did not know 1 year ago.
As I do try to understand concepts when I'm able to, I wish to try and find
out why Python seems different.
Having followed this newsgroup for some time, I now have the gut feeling
(see 3)) that other people have that feeling too.
Quid ?
Regards,
Philippe
Jul 19 '05
Terry Hancock wrote:
> It's the reverse-translation from the French "Informatique".
Or maybe the Italian "Informatica"...
--
Renato
--------------------------------
Do you use Fedora? Come visit us: http://www.fedoraitalia.org

> Oh well, I guess it's a bit late to try to rename the Computer Science
> discipline now. The best I've heard is "Informatics" -- I have a vague
> impression that this is a more European name for the field.
The word "Informatics" was invented by the Soviet computer scientist
Andrey Ershov several decades ago as a name for his view of
Information Theory. In Russian it is Информатика,
Informatika, by analogy with Mathematics, Математика. It has been
widely used ever since, both in Russian-language literature and in many
other languages and countries.
It is a better name than either Information Theory or Computer Science
for the discipline.
Terry Hancock wrote:
> Of course, since children are vastly better at learning than adults,
> perhaps adults are stupid to do this. ;-)
Take learning a language. I'm learning Swedish. I'll
never have a native accent and 6 year olds know more
of the language than I do. But I make much more
complicated sentences than 6 year olds. (Doesn't mean
they are grammatically correct, but I can get my point
across given a lot of time.)
> Quantum mechanics notwithstanding, I'm not sure there is a "bottom"
> "most-reduced" level of understanding. It's certainly not clear that
> it is relevant to programming.
I agree. That's why I make this thread branch. I think
learning is often best taught from extending what you know
and not from some sort of top/bottom approach. I'm also
one who bristles at hierarchies. Maybe that's why I like
Python and duck typing. :)
Some learning works by throwing yourself in the deep end.
Languages are sometimes learned that way. The Suzuki method
extends that to music, though that's meant for kids.
> Python is actually remarkably good at solving things in a nearly
> optimal way.
Have you read Richard Gabriel's "Worse is Better" essay? http://www.dreamsongs.com/WIB.html
Section "2.2.4 Totally Inappropriate Data Structures"
relates how knowing the data structure for Lisp affects
the performance and seems relevant to your point.
Andrew da***@dalkescientific.com
On Tue, 14 Jun 2005 12:49:27 +0200, Peter Maas <pe***@somewhere.com>
wrote:
> Depends if you wanna build or investigate. Learning is investigating.
Yeah, after thinking about this phrase I have to agree.
Sometimes learning is investigating, sometimes it's
building. Since I discovered programming I've spent
most of my time just building, but I started on quite a
concrete floor (assembler).
Sometimes, for example when I recently dedicated some
time to functional languages, it's more investigating.
Functional languages are not something I've used extensively;
I've no real experience with them... and at least at
the application level, for me it's pure magic.
I mean that I can understand how the language itself
is implemented; what is just amazing for me is how
you can build a program for a computer - an incredibly
complex state-based machine - with a language where
state is not your guide but your enemy.
> Don't nail me down on that stupid string, I know it's immutable but
> didn't think about it when answering your post. Take <some mutable
> replacement> instead.
Forgive me, I couldn't resist :-)... it was such a juicy hit.
But this very fact shows that when programming
you cannot just skim over the details. You can only
avoid using the conscious mind to check the details
if you are so confident with them that you can leave
important checks (like the actual practicality of a
solution) to your subconscious mind.
That strings in python are immutable it's surely
just a detail, and it's implementation specific,
but this doesn't means it's not something you can
ignore for a while. If you use python this is a
*fundamental* property.
That deleting the first element of a list in python
is a slow operation is also a detail and very
implementation specific, but ignore it and your
programs will be just horrible.
Often when I think about a problem I find myself saying
"oh... and for that I know I'll find a good solution"
without actually thinking of one. I *know* I can solve
that problem decently because I've been around there.
I do not need to check, because my experience tells me
I can first check where I want to go from there,
because getting in that direction is not going to be
a problem. Without this experience and conscious or
subconscious attention to details, you'll find yourself
saying "oh... and we could go over there" while pointing
at the sun, and then blaming your "technicians" because
there are a few little "details" that you didn't
consider.
Andrea
Andrea Griffini wrote:
> That strings in python are immutable it's surely just a detail, and
> it's implementation specific, but this doesn't means it's not
> something you can ignore for a while. If you use python this is a
> *fundamental* property.
My communication ability is dropping every day at
an incredible rate (it's getting so bad I've been
seriously in doubt about physiological problems).
In the above phrase there is a "not" in excess; I
meant to write
... but this doesn't means it's something you can
ignore for a while. If you use python this ...
May indeed be I'm just *realizing* how low are my
communication skills...
Andrea
Andrea Griffini <ag****@tin.it> wrote:
> That strings in python are immutable it's surely just a detail, and
> it's implementation specific, but this doesn't means it's not
> something you can ignore for a while.
I disagree. It is indeed something you can ignore for a while. The first
program you teach somebody to write is going to be:
print "Hello, world"
To get that to work, you need to figure out how to run the python
interpreter, and how to edit a program source file. You also learn some
basic python syntax like how to form strings, that the print statement
gives you a carriage return for free, and you don't need a ";" at the end.
It would be a mistake to mention now that "Hello, world" is an immutable
object. That's just not important at this point in the learning process.
Eventually, you're going to have to introduce the concept of immutability.
That point may not be much beyond lesson 2 or so, but it doesn't have to be
lesson 1.
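(An editorial sketch, not from the thread, using modern Python 3 syntax:
the "lesson 2" moment Roy describes fits in a few lines. Any attempt to
change a string in place fails, and every "modification" quietly builds
a new object.)

```python
s = "Hello, world"
try:
    s[0] = "J"                  # strings do not support item assignment
except TypeError as exc:
    print("immutable:", exc)

# To "change" a string you build a new one; the original is untouched.
t = "J" + s[1:]
print(t)   # Jello, world
print(s)   # Hello, world
```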
On Wed, 15 Jun 2005, Terry Hancock wrote:
> On Tuesday 14 June 2005 08:12 am, Magnus Lycka wrote:
> > Oh well, I guess it's a bit late to try to rename the Computer
> > Science discipline now.
> Computer programming is a trade skill, not a science. It's like being
> a machinist or a carpenter --- a practical art.
A lot of universities teach 'software engineering'. I draw a distinction
between this and real computer science - computer science is the abstract,
mathematical stuff, where you learn to prove your programs correct,
whereas software engineering is more practical, where you just learn to
write programs that work. Of course, CS does have quite a practical
element, and SE has plenty of theory behind it, but they differ in
emphasis. SE departments tend to grow out of electronics engineering
departments; CS departments tend to bud off from maths departments.
> Unfortunately, our society has a very denigrative view of craftsmen,
> and does not pay them well enough, so computer programmers have been
> motivated to attempt to elevate the profession by using the
> appellative of "science".
> How different would the world be if we (more accurately) called it
> "Computer Arts"?
At one point, a friend and I founded a university to give our recreational
random hackery a bit more credibility (well, we called ourselves a
university, anyway; it was mostly a joke). We called the programming
department 'Executable Poetry'.
tom
--
Punk's not sexual, it's just aggression.
> My communication ability is dropping every day at
Probably no reason to worry. Reading your post I hadn't
even noticed the unnecessary "not", because the message
was as clear as intended even with it.
Should I seriously suspect my own physiological
problems just because I overlooked it? I think not.
I also think that many people here would confirm
that your posts are on a very high communication level,
showing much insight.
> May indeed be I'm just *realizing* how low are my communication
> skills...
That's another story - moving to the next level of insight
is usually painful, because one detects how "stupid"
one was before.
Claudio
Well, as for the communication skills dropping: I highly doubt that; if
anything you are just picking up on things you never noticed before (and
your communication skills far surpass those of the average person who
writes anything in today's society).
A good example for me: I notice that I often type "hte" for "the";
yes, a spell checker picks it up, but when writing code it's sometimes
not available.
Also I think the fact that you think your were diteriating just goes to show
how dedicated you are to detail, and making sure you give the right advice
or ask the right question.
Jeff

"Also I think the fact that you think your were diteriating just goes to
show [...]"
should probably be:
"In my opinion the fact that you consider you were deteriorating just
shows [...]"
but it can be understood as it is anyway, right?
Maybe written exactly as it is, with the sole purpose
of encouraging you, Andrea.
Claudio
On Thu, 16 Jun 2005 10:30:04 -0400, "Jeffrey Maitland"
<ma***@vianet.ca> wrote:
> Also I think the fact that you think your were diteriating just goes
> to show how dedicated you are to detail, and making sure you give the
> right advice or ask the right question.
[totally-OT]
Not really, unfortunately. I found not long ago that I had used the
very same word eight times in two consecutive sentences, plus
similar words and words with similar endings. Re-reading that
phrase a day later, it seemed like something had got stuck in my
brain while I was writing it. Sure, more or less the idea was
there, and IMO clear enough to be understood, but the form and
the choice of words seemed incredibly poor.
I was curious about this strange fact and I checked other text
I had written in that period. What really scared me is that
this word repetition seemed a quite evident problem. This *both*
in Italian (my native language) and in English.
Googling for old posts I however found that years ago my
English was even worse than it is now... but this repetition
was not present (not that evident, that is).
Needless to say I spent some hours googling for information
about this kind of word-repetition problem :D
Anyway, after some time things got better.
I always thought of our intellect as something "superior"
to this world made of fragile bones and stinking flesh.
However I realized that there's probably no real magic in
it... knowing there are pills to make you happy is sort of
shocking from a philosophical point of view :-)
If you see me walking around with an exoskeleton and a
happy face it will mean I tried the chemical approach ;)
(don't try to understand this phrase; either you know what I
mean - and you like Dilbert strips - or it can't make sense).
Andrea
On Thu, 16 Jun 2005 07:36:18 -0400, Roy Smith <ro*@panix.com> wrote:
> Andrea Griffini <ag****@tin.it> wrote:
> > That strings in python are immutable it's surely just a detail, and
> > it's implementation specific, but this doesn't means it's not
> > something you can ignore for a while.
> I disagree. It is indeed something you can ignore for a while. The
> first program you teach somebody to write is going to be:
>
> print "Hello, world"
I mean that the fact that strings are immutable is
one key aspect that cannot be worked around.
Python is this way, and in this very fact it is different
from e.g. C++. The ripple effect that this very little
"detail" can have is not local. There are designs
based on strings that just do not make sense in Python
for this reason. It's not something you can "fix" later...
if you need mutability you simply must not use strings
for that (and this can have a serious impact on the
source code).
Of course there are programs in which whether strings
are immutable or not is irrelevant. But if you don't
know the implications (e.g. how "is" works
for strings in Python) and you still don't run into
problems, it's just pure luck.
The normal reaction I've observed is that when they find
a problem the result is a "python is buggy" idea.
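(An editorial sketch of the `is` trap Andrea alludes to; the identity
results below are CPython implementation details, which is exactly the
point.)

```python
a = "hello"
b = "hello"
# CPython interns many short literal strings, so `a is b` is often True
# here -- but that is an implementation detail, not a language guarantee.
print(a is b)

# An equal string built at runtime is usually a distinct object:
c = "".join(["he", "llo"])
print(c == a)   # True: same value
print(c is a)   # usually False in CPython: different object

# Moral: compare string *values* with ==, never identity with `is`.
```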
> It would be a mistake to mention now that "Hello, world" is an
> immutable object. That's just not important at this point in the
> learning process. Eventually, you're going to have to introduce the
> concept of immutability. That point may not be much beyond lesson 2
> or so, but it doesn't have to be lesson 1.
I must agree *if* you're teaching Python first.
I also *completely* agree if you're doing this just to
whet the appetite.
What I don't agree with is that starting from this level
and going up is a good approach (with loosely placed
bricks you'll just not be able to hold up the construction).
To be able to build you'll need to memorize, without a
rationalization, too many "details" that just do not
make sense if you start from an ideal Python world.
I must also note that I, at fourteen, found terribly
interesting the idea of programming a computer even
if the only things I could do were, for example, turning
pixels (blocks?) on and off on a screen with resolution
40x50. Probably nowadays, unless you show them an antialiased
texture-mapped 3D floating torus with their name and
face on it in live video, they'll prefer exchanging
stupid messages on their mobile phones instead.
Andrea
Andrea Griffini:
> I also must note that I, at fourteen, found terribly interesting the
> idea of programming a computer even if the only things I could do
> were, for example, turning pixels (blocks?) on and off on a screen
> with resolution 40x50. Probably nowadays, unless you show them an
> antialiased texture-mapped 3D floating torus with their name and face
> on it in live video, they'll prefer exchanging stupid messages on
> their mobile phones instead.
Well, on one hand I think that even 20 years ago 99% of people
preferred talking about football to programming; on the other hand,
I think that even now there is a 1% of people extremely interested
in turning a pixel on or off.
I don't think anything significant changed in the percentages.
Michele Simionato
> I always thought of our intellect as something "superior" to this
> world made of fragile bones and stinking flesh. However I realized
> that there's probably no real magic in it... knowing there are pills
> to make you happy is sort of shocking from a philosophical point of
> view :-)
Yes it is, but it doesn't mean that this well-known
insight has an effect on what people think about
themselves, religion, and the Universe.
For an example of what I'm trying to say here, see the
"What Deep Blue showed was that chess is
not a game of true intelligence"
discussion thread in rec.games.chess.computer,
and track what meaning people have assigned to
the concept of Artificial Intelligence since the term
was coined.
As long as a machine can't replicate itself and defend
its own existence, its intelligence will be questioned.
And even if such a machine could fight its enemies, the
final answer to the question whether "true intelligence" is
unique to humans could only be given in a fight; but
even then the evidence of the existence of superior AI
couldn't be given, because only dead people are forced
to agree, and it doesn't matter to them anymore...
Claudio
On 17 Jun 2005 01:25:29 -0700, "Michele Simionato"
<mi***************@gmail.com> wrote:
> I don't think anything significant changed in the percentages.
Then why start from

print "Hello world"

which can't be explained (or, to say it better, can't be
*really* understood) without introducing a huge
amount of magic, and not from a simple 8-bit CPU
instead? What are the pluses of the start-from-high-level
approach? If it's to avoid boredom, I don't agree,
as assembler is anything but boring (when you start),
or at least that was what *I* experienced.
If it's about the time it will take to get a rotating
3D torus with live video on it, I know for sure that most
of the programmers I know who started from high level
will probably *never* reach that point. Surely if
you start, say, from pull-down menus, they'll be able to
do pull-down menus. And IMO there are good chances
they'll stay there a lifetime.
So is Python a good first programming language?
IMO not at all if you want to become a programmer; it
hides too much, and that hidden stuff will bite back
badly. Unless you know what is behind Python it will
be almost impossible for you to remember and avoid
all the traps. But if you need to know what is behind
it, then it's better to learn that stuff first, because
it's more concrete and simpler from a logical point
of view; the constructions are complex but (because)
the bricks are simpler.
But it probably all boils down to what a programmer is.
Is C++ a good first programming language ?
BWHAHAHAHAHAHAHAHA :D
But apparently some guru I greatly respect thinks so
(I'm not kidding, http://www.spellen.org/youcandoit/).
Andrea
I fail to see the relationship between your reply and my original
message.
I was complaining about the illusion that in the old times people
were more interested in programming than now. Instead your reply is
about low-level languages being more suitable for beginners than
high-level languages.
I don't see the connection.
Michele Simionato
Andrea Griffini wrote:
> Is C++ a good first programming language ?
> BWHAHAHAHAHAHAHAHA :D
> But apparently some guru I greatly respect thinks so (I'm not
> kidding, http://www.spellen.org/youcandoit/).
With respect to the author, and an understanding that there is probably
much that didn't go into his self-description (add "about.htm" to the
above URL), it sounds as though he knows primarily, perhaps solely, C
and C++, and has done relatively little serious development since he
seems to have spent most of his time either teaching or writing (words,
not source code).
Does he even *know* any real high level languages such as Python?
And the fact that he's teaching C++ instead of just C seems to go
against your own theories anyway... (though I realize you weren't
necessarily putting him forth as a support for your position).
-Peter
>> there is a 1% of people extremely interested in turning on or off a pixel
For some years I taught "adults" aged from 16 to 86
in a course "Introduction to data processing", where I
tried to teach the basics beginning with switching a light
on and off. Having around twenty participants, I
experienced from time to time one or two who found
it fascinating, so the 1% is in my eyes a good guess.
> 40x50. Probably nowadays unless you show them an antialiased texture
> mapped 3D floating torus with their name and face on it in live video
> they'll prefer exchanging stupid messages with the mobile phone
> instead.
The ability to make a video (I currently observe a
rush towards "equipping" camcorder videos, showing
the involved teenagers fighting with ordinary
sticks, with Star Wars laser-sword effects) is,
given the appropriate software tools, now available
even to the not-gifted 99%. After the videos are done
by the slightly smarter ones of the entire group, it
doesn't whet the appetite for more programming skills -
it creates traffic on ICQ and the Internet from
exchanging the videos and opinions on whether they
are cool or not.
> If it's about the time it will take to get a rotating 3d torus with
> live video on it I know for sure that most of the programmers I know
> that started from high level will probably *never* reach that point.
Many consider such skills not worth achieving,
looking for a solution to eventually arising problems
in better computer hardware, and in new software
tools in case of timing problems.
Generally it appears to me that it is true that many
current teenagers look to authorities, not to their own
experience (this is nothing new), and that they perceive
the world around them through the window of the Internet
browser, not through the window of the room (this is what
makes the difference compared to past times). But the
current world they experience is so different from what
it was twenty years ago that it is today quite possible
to start on a very high level and stay there all one's
life, never being able to go down to the details, without
thereby suffering serious disadvantages as a programmer.
I was very surprised myself to find that it is even
possible to be hired as a programmer with an IQ below
around 80.
I am personally biased towards trying to understand
everything as deeply as possible, and in the past I was
quite certain that one cannot achieve good results
without a deep insight into the underlying details.
I now have to admit that I was just wrong. From my
overall experience I infer that it is not only possible,
but sometimes even has better chances of success,
because one is not loaded down with the ballast of deep
understanding, which can not only be useful but can also
hinder fast progress.
Claudio
Claudio Grondi:
> I am personally biased towards trying to understand everything as
> deeply as possible, and in the past I was quite certain that one
> cannot achieve good results without a deep insight into the
> underlying details. I now have to admit that I was just wrong. From
> my overall experience I infer that it is not only possible, but
> sometimes even has better chances of success, because one is not
> loaded down with the ballast of deep understanding, which can not
> only be useful but can also hinder fast progress.
FWIW, this is also my experience.
Michele Simionato
Peter Hansen wrote:
> > But apparently some guru I greatly respect thinks so (I'm not
> > kidding, http://www.spellen.org/youcandoit/).
> With respect to the author, and an understanding that there is
> probably much that didn't go into his self-description (add
> "about.htm" to the above URL), it sounds as though he knows
> primarily, perhaps solely, C and C++, and has done relatively little
> serious development since he seems to have spent most of his time
> either teaching or writing (words, not source code).
> Does he even *know* any real high level languages such as Python?
So you say he "has done relatively little serious development" and that
he may not even know about Python. I didn't see any evidence from those
pages to draw either conclusion. In fact the 4th paragraph quite
contradicts them both.
On 17 Jun 2005 05:30:25 -0700, "Michele Simionato"
<mi***************@gmail.com> wrote:
> I fail to see the relationship between your reply and my original
> message. I was complaining about the illusion that in the old times
> people were more interested in programming than now. Instead your
> reply is about low-level languages being more suitable for beginners
> than high-level languages. I don't see the connection.
I've been told in the past that one reason why it is
good to start from high-level languages is that you
can do more with less. In other words, I've been told
that showing a nice image and maybe some music is
more interesting than just making a LED blink.
But if this is not the case (because just 1% are
interested in those things no matter what), then
why start from high level at all?
I would say (indeed I would *hope*) that 1% is a low
estimate, but probably I'm wrong, as others with more
experience than me in teaching agree with you.
Having more experience than me in teaching programming
is a very easy shot... I never taught anyone except
myself. About the 1%: I have two brothers, and one of
them got hooked on programming before me... the other
never got interested in computers and now he's just a
basic (no macros) MS Office user.
So in my case it was about 66%, and it all started with
a programmable pocket RPN calculator... but there were
no teachers involved; maybe this is a big difference.
Andrea
On Fri, 17 Jun 2005 08:40:47 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
> And the fact that he's teaching C++ instead of just C seems to go
> against your own theories anyway... (though I realize you weren't
> necessarily putting him forth as a support for your position).
He's strongly advocating starting from high level;
comp.lang.c++.moderated is where I first posted on this issue.
While I think that Python is not a good first language, C++
is probably the *worst* first language I can think of.
C++ has so many traps, asymmetries and ugly parts (many for
backward compatibility) that I would say one should try to
put aside logic when learning it and just read the facts; in
many aspects C++ is the way it is for historical reasons or
inexplicable incidents: IMO there's simply no way someone can
deduce those using logic, no matter how smart s/he is.
C++ IMO must be learned by reading... thinking is pointless
and in a few places even dangerous.
Also, given the C/C++ philosophy of "the programmer always
knows perfectly what he is doing", experimenting is basically
impossible; trial and error doesn't work, because in C++
there is no error; you have undefined-behaviour daemons
instead of runtime-error angels. Add to the picture the
quality of compile-time error messages from the primitive
template technology, and even compile-time errors often look
like riddles; if you forget a "const" you don't get "const
expected"... you get two screens full of insults pointing
you into the middle of a system header.
Thinking of some of its bad parts, it's quite shocking
that C++ is good for anything, but indeed it does work, and
it can be better than C. I think C++ can be a great tool
(if you understand how it works, i.e. if it has no magic
at all for you) or your worst nightmare (if you do not
understand how it works).
I think that using C++ as the first language for someone
learning programming is absurd. Francis thinks otherwise.
Andrea Griffini <ag****@tin.it> wrote:
> Add to the picture the quality of [C++] compile-time error messages
> from the primitive template technology, and even compile-time errors
> often look like riddles;
Yeah, but what they lack in quality, they make up for in quantity.
> if you forget a "const" you don't get "const expected"... you get two
> screens full of insults pointing you into the middle of a system
> header.
Python and C++ complement each other quite nicely. For example, one
of the first things I did when I was learning C++ was to write a
Python program which parsed and re-formatted C++ compiler error
messages so they were easier to read :-)
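(Roy doesn't show his reformatter; what follows is a hypothetical toy
sketch of the core trick, not his actual program: drop the template
argument lists that bury the useful part of a C++ error message.)

```python
def strip_templates(msg):
    """Drop balanced <...> template argument lists from a C++ error line.

    A naive toy: it treats every '<'/'>' pair in the message as template
    brackets (real messages also contain operators like '->' or '>>').
    """
    out, depth = [], 0
    for ch in msg:
        if ch == "<":
            depth += 1
        elif ch == ">" and depth > 0:
            depth -= 1
        elif depth == 0:
            out.append(ch)
    return "".join(out)

err = ("error: no matching function for call to "
       "'std::vector<std::pair<int, std::string> >::push_back(int)'")
print(strip_templates(err))
# error: no matching function for call to 'std::vector::push_back(int)'
```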
On 17 Jun 2005 06:35:58 -0700, "Michele Simionato"
<mi***************@gmail.com> wrote:
> Claudio Grondi:
> > ... From my overall experience I infer that it is not only
> > possible, but sometimes even has better chances of success, because
> > one is not loaded down with the ballast of deep understanding,
> > which can not only be useful but can also hinder fast progress.
> FWIW, this is also my experience.
Why hinder ?
Andrea
Andrea Griffini wrote:
> Why hinder ?
Suppose you have to accomplish a given task using a framework
which is unknown to you. The manual is 1000 pages long.
In order to get the job done, it is enough to study 50 pages
of it. There are people with the ability to figure out very
quickly which are the relevant 50 pages and ignore the other
950. Granted, these people will have a shallow knowledge with
respect to somebody studying the whole manual, but they
will get the job done much faster and in some circumstances
speed is more valuable than deep knowledge.
To be able to content himself with a shallow knowledge
is a useful skill ;)
Michele Simionato
On 17 Jun 2005 21:10:37 -0700, "Michele Simionato"
<mi***************@gmail.com> wrote:
> Andrea Griffini wrote:
> > Why hinder ?
> ... To be able to content himself with a shallow knowledge is a
> useful skill ;)
Ah! ... I agree. Currently for example my knowledge
of Zope is pretty close to 0.00%, but I'm using it
and I'm happy with it. I did what I was asked to do
and took way less time than hand-writing the cgi stuff
required. Every single time I have to touch those scripts
I have to open the Zope book to get the correct method
names. But I'd never dare to call myself a Zope
developer... with it I'm just at the "hello world"
stage even if I accomplished what would require a
lot of CGI expertise.
But I remember once running into a problem; there was
a file of about 80MB uploaded in the Zope database
that I wasn't able to extract. I was simply helpless:
the download always stopped around 40MB without any error
message. I wandered on IRC for a day finding only
other people that were better than me (that's easy)
but not good enough to help me.
In the end someone gave me the right suggestion, I
just installed a local zope on my pc, copied the
database file, extracted the file from the local
instance and, don't ask me why, it worked.
This very kind of problem solving (just try doing
stupid things without understanding until you get
something that looks like it's working) is what I hate
*MOST*. That's one reason I hate Windows
installation/maintenance; it's not an exact science,
it's more like try it and see what happens.
With programming that is something that IMO doesn't
pay in the long run.
I'm sure that someone that really knows Zope would
have been able to get that file out in a minute,
and may be doing exactly what I did.
But knowing why! And this is a big difference.
Indeed when talking about if learning "C" can hinder
or help learning "C++" I remember thinking that to
learn "C++" *superficially* learning "C" first is
surely pointless or can even hinder.
But to learn "C++" deeply (with all its quirks) I
think that learning "C" first helps.
So maybe this better explains my position; if you wanna
become a "real" programmer, one that really has things
under control, then learning a simple assembler first
is the main path (ok, may be even a language like C
can be a reasonable start, but even in such a low-level
language there are already so many things that are
easier to understand if you really started from bytes).
However, to be able to do just useful stuff with a
computer you don't need to start that low; you can
start from python (or, why not, even dreamweaver).
Andrea
On 18 Jun 2005 00:26:04 -0700, "Michele Simionato"
<mi***************@gmail.com> wrote: Your position reminds me of this:
http://www.pbm.com/~lindahl/real.programmers.html
Yeah, but as I said I didn't use a TRS-80, but an
Apple ][. But the years were those ;-)
Andrea
D H wrote: Peter Hansen wrote: With respect to the author, and an understanding that there is probably much that didn't go into his self-description (add "about.htm" to the above URL), it sounds as though he knows primarily, perhaps solely, C and C++, and has done relatively little serious development since he seems to have spent most of his time either teaching or writing (words, not source code).
Does he even *know* any real high level languages such as Python?
So you say he "has done relatively little serious development" and that he may not even know about Python. I didn't see any evidence from those pages to draw either conclusion. In fact the 4th paragraph quite contradicts them both.
Clearly this is a matter of opinion. Now that you've expressed yours,
did you have a point to make besides that you like to contradict my
posts? Maybe you'd like to take the opportunity to mention Boo?
-Peter
Andrea Griffini wrote: Indeed when talking about if learning "C" can hinder or help learning "C++" I remember thinking that to learn "C++" *superficially* learning "C" first is surely pointless or can even hinder. But to learn "C++" deeply (with all its quirks) I think that learning "C" first helps.
I think you are mistakenly bringing order into the picture, when extent
is more likely the case. If you want to master C++, I think that most
would agree you need to understand C. But there are many who would
disagree that the path to C++ must *start* at C. (In fact, many people
argue that a lot of bad C++ is due to people programming C in C++.)
Instead they would argue that you should start by learning C++
"superficially", then learn C, and re-evaluate you C++ practices in
light of the lessons learned from C.
The example I'll pull out is natural languages - I understood the
grammar & construction of my native tongue *much* better after learning
a foreign language. From people I've talked to, this is a common
occurrence. But there would be few people who would advocate that one
should learn a foreign language before learning one's native tongue.
Andrea Griffini <ag****@tin.it> writes: On Tue, 14 Jun 2005 16:40:42 -0500, Mike Meyer <mw*@mired.org> wrote:
Um, you didn't do the translation right. Whoops.
So you know assembler; no other possibility, as it's such a complex language that unless someone already knows it (and on the specific architecture) what I wrote is pure line noise.
You studied it after python, I suppose.
Nope. I don't think I've learned any assemblers since I learned Python.
Of course, I'd been writing code for 20 years before I learned Python. or, even more concrete and like what I learned first
lda $300
clc
adc $301
sta $302
is simpler to understand.
No, it isn't - because you have to worry about more details.
In assembler the details are simply more explicit. Unfortunately with computers you just cannot avoid details, otherwise your programs will suck badly. When I write in a high-level language, or even a very high-level one, the details are understood even if I'm not writing them down. After a while a programmer will even be able to handle them at a subconscious level; e.g. just by looking at O(N^2) code that could easily be rewritten as O(N) or O(1), a little bell will ring in your brain telling you "this is ugly". But you cannot know whether something is O(1), O(N) or O(N^2) unless you know some details. If you don't like details then programming is just not the right field.
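A concrete case of that "little bell" (a sketch, not from the original post): membership tests against a list inside a loop make the whole pass O(N^2), while building a set first brings it down to O(N):

```python
def common_items_slow(a, b):
    # 'x in b' scans the list each time: O(len(a) * len(b)).
    return [x for x in a if x in b]

def common_items_fast(a, b):
    # Set membership is O(1) on average: O(len(a) + len(b)) overall.
    b_set = set(b)
    return [x for x in a if x in b_set]

a = list(range(1000))
b = list(range(500, 1500))
# Same result, very different cost as the inputs grow.
assert common_items_slow(a, b) == common_items_fast(a, b)
```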
I've never argued otherwise.
Think that "a = b + c" computes the sum of two real numbers and your program will fail (expecting, how foolish, that adding 0.1 ten times gives you 1.0) and you'll spend some time wondering why the plane crashed... your code was "correct" after all.
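The point about 0.1 is easy to check, since Python floats are IEEE 754 doubles:

```python
# 0.1 has no finite binary representation, so the rounding
# error accumulates across the ten additions.
total = 0.0
for _ in range(10):
    total += 0.1

print(total == 1.0)   # False
print(total)          # 0.9999999999999999
```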
Especially if b and c aren't floats. I've always used "real" as a
mathematical term, since they had it first. Computers don't deal with
reals. For instance, whitesmith had a z80 assembler that let you write:
a = b + c
and it would generate the proper instructions via direct translation.
To use that I'd have to understand what registers will be affected and how ugly (i.e. inefficient) the code could get. Programming in assembler using such a high-level feature without knowing those little details would be just suicidal.
The assembler lets you specify which registers to use. You either name
them in place of variables, or variables that are labels for the
registers. But saying for example that
del v[0]
just "removes the first element from v", you will end up with programs that do it in a stupid way; actually you can easily get unusable programs, and programmers who go around saying "python is slow" for that reason.
That's an implementation detail. It's true in Python, but isn't necessarily true in other languages.
Yeah. And you must know which is which. Otherwise you'll write programs that just do not give the expected result (because the user killed them earlier).
Actually, it isn't always true in Python. What if v is a dictionary (in
which case the description is wrong), or a class that maps an SQL table's
row id's to objects holding the data for that row? In either case, the
statement will be O(1).
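A sketch of the practical consequence: collections.deque removes from either end in O(1), while "del v[0]" on a list shifts every remaining element, which is O(n):

```python
from collections import deque

items = deque(range(5))   # deque([0, 1, 2, 3, 4])

# popleft() and pop() are both O(1) on a deque; the list
# equivalent 'del v[0]' is O(n) because it shifts the rest.
first = items.popleft()
last = items.pop()

print(first, last, list(items))   # 0 4 [1, 2, 3]
```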
You do need to know which is which.
Yes, good programmers need to know that information - or, as I said before, they need to know that they need to know that information, and where to get it.
I think that a *decent* programmer must understand whether the code being written is roughly O(n) or O(n^2). Without at least that, the possibility of writing useful code, excluding maybe toy projects, is a flat zero. Looking that information up later may be just "too" late, because the wrong data structure has already been used and nothing can be done (except rewriting everything).
I don't think those two statements contradict each other. A decent
programmer will know the O() of the code they write - or where to
find that information. And they'll check it beforehand.
The advantage of using an HLL is that rewriting everything to try
other data structures is cheap (after all, the constants that O()
notation ignores matter as well, so the asymptotically fastest choice
may not be the fastest solution for the problem at hand). That may well be true of the standard C++ library - I don't write it. But it certainly doesn't appear to be true of, for instance, Python internals. I've never seen someone explain why, for instance, string addition is O(n^2) beyond the very abstract "it creates a new string with each addition". No concrete details at all.
The problem is that unless you have really internalized what that means you'll forget about it. Don't ask me why, but it happens. Our mind works that way. You just cannot live with a jillion unrelated details you cannot place in a scheme. It doesn't work. It would take a thousand times the effort compared to using a model able to justify those details.
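The classic illustration of that O(n^2) behaviour is building a string with repeated "+" versus "".join (note that recent CPython versions optimize some in-place concatenations, so the quadratic cost is a worst case):

```python
def build_quadratic(parts):
    # Each '+' may copy everything accumulated so far,
    # giving O(n^2) total work in the worst case.
    s = ""
    for p in parts:
        s = s + p
    return s

def build_linear(parts):
    # ''.join computes the final length once and copies
    # each part exactly once: O(n) total work.
    return "".join(parts)

parts = ["word"] * 1000
assert build_quadratic(parts) == build_linear(parts)
```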
Again, you're generalizing from "your mind" to "everyone's mind". My
experience indicates that's not true for me. For instance, I find that
learning a typical assembler involves learning a jillion unrelated
details - because it's not at all uncommon for the opcode mnemonics to
be seemingly random strings of characters. Or random words.
Architectures with irregular register usages seem to have little rhyme
or reason behind those irregularities (though I did avoid those
architectures, so may have missed the reason(s) behind some of them).
Even on architectures with symmetric register usage, register usage
conventions are pretty much arbitrary.
The problem with designing top down is that when building (for example applications) there is no top.
This is simply false. The top of an application is the application-level object
Except that marketing will continuously shift what your application is supposed to do. And this is good, and essential. This is "building". Sometimes marketing will change the specifications *before* you complete the very first prototype. For complex enough projects this is more the rule than the exception. In the nice "The Pragmatic Programmer" book (IIRC) it is said that there's no known complex project in which the specification was changed fewer than four times before the first release... and the only time it was changed just three times was when the guy arriving with the fourth variation was hit by lightning in the street.
Except that those specification changes rarely change the top-level
object/method/whatever. At least, all the ones I dealt with wound
up changing things that were in the middle of the design. The easiest
ones were the ones that were anticipated, and so the changes were all
in data, and not in code.
Unfortunately sometimes there is the OPPOSITE problem... we infer general rules that do not apply from just too few observations.
Your opposite problem is avoided by not teaching the details until they are needed, and by making sure you teach that those are implementation details, so the student knows not to draw such conclusions from them.
What you will obtain is people who build wrong models. Omitting details, if they can really affect the result, is not a good idea.
Well, the "result" largely depends on why the project is being built.
If you're doing exploratory programming, the "result" is a better
understanding of the objects in the problem domain. The details that
affect that result are radically different from the details that affect
the result if you're building a production application, which is again
different from the details that affect the result if you're teaching
people how to program.
Again, the critical thing is teaching students which details matter, and
which ones don't. The critical things a good programmer knows about those concrete details are which ones are platform-specific and which aren't, and how to go about learning those details when moving to a new platform.
I never observed this problem. You really did ?
As mentioned, you see it all the time in c.l.python. People come from other languages, and try to write Python as if the rules for that other language apply.
That's exactly because they don't know the details of any of the languages they used. Someone knowing the details would be curious to know *how* "del v[0]" is implemented in python. Actually it could easily be changed into an O(1) operation, with just a little slowdown in element access (still O(1) but with a bigger constant). This is a compromise that has not been accepted, and this very fact is important to know if you plan to use python seriously.
Actually, you don't need to know *anything* about the compromise if you
plan on using python seriously. You do need to know that "del v[0]" on
list is O(n). It can be fixed from the start by teaching the student the difference between abstract programming concepts and implementation details.
Sorry, but I really don't agree that big O is a "detail" that can be ignored. Only bubble-and-arrow PowerPoint gurus could think that; I'm not in that crew. Ignore those little details and your program will be just as good as one that doesn't even compile.
I've never argued that you should treat O() behavior as a detail that
can be ignored. I've argued that it's an implementation detail. As
such, you worry about it when you do the implmentation. If you need to
delete from both ends of an ordered set of objects, you can't use a
python list and get reasonable performance. It tackled abstract problems like "sorting". The students I'm talking about never dealt with anything that abstract. Sorting is abstract ?
Yeah. Remember, I'm talking about m.e., chem.e, etc. engineering students
here, not software engineers or any other kind of CS types. Pairing this with the claims that abelian groups should be taught to kids first (why not fiber spaces then?) and that TAOCP is too "abstract" tells me that apparently you're someone who likes to talk just for the sake of talking, or that your religion doesn't allow you to type smileys.
Now you're resorting to straw men and name-calling. That's an indication that you no longer have any real points.
I'll blame my bad english for understanding that you
If you wish. But since you posted your list of misconceptions about
what I said, I'm going to correct them.
said that abelian groups should be taught before relative numbers (somehow I crazily thought the point of discussion was what's the correct order of learning how to program),
Again, I never said that. I said *I* understood them better than
relative numbers, because *you* asked whether or not I did. That
says *nothing* about how I think they should be taught. I'm not so
egotistical as to think that everybody thinks the same way I do.
that TAOCP is too abstract (a book where every single code listing is in assembler!)
I said it was too abstract for a specific group - one that deals
with concrete problems. In FORTRAN, usually.
and that big-o when programming is a detail that can be safely ignored (good luck, IMO you'll need hell a lot of it).
No, I said it was an implementation detail. I've maintained all along
that good programmers need to know those details.
<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
Andrew Dalke <da***@dalkescientific.com> writes: Andrea Griffini wrote: Wow... I always get surprises from physics. For example I thought that no one could drop the falsifiability requirement for a theory in an experimental science...
Some physicists (often mathematical physicists) propose alternate worlds because the math is interesting.
Mathematicians, on the other hand, tried to demonstrate that their
alternate worlds couldn't exist - and found the math in their failures
interesting. Hence we get non-euclidean geometries and other
interesting things - that physicists find useful. (To be fair, some
of the alternate mathematical worlds were first explored by physicists.)
<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
"Claudio Grondi" <cl************@freenet.de> writes: What has it all to do with Python? To be not fully off-topic, I suggest here that it is much easier to discuss programming-related matters (especially in the case of Python :-) or mathematics than any other subjects related to nature, because programming is _so easy_ compared to what is going on in the "real world". I see the reason for that in the fact that programming is based on ideas and rules developed by humans themselves, so it is relatively easy to test and prove whether statements are right or not.
As a mathematician, I have to say "ugh". Not all statements are easy
to test and prove. In fact, in any non-trivial mathematical system,
there will be statements that *cannot* be proven to be either true
or false. Some of those statements are interesting. The legends of
mathematics are problems that aren't easy to test and prove: Fermat's
Last Theorem, the four color map theorem, and so on. Check out <URL: http://mathworld.wolfram.com/UnsolvedProblems.html > for a longer list.
It's not clear that the ideas/rules were "developed" by humans. I'd
say "discovered". In some cases in the past, mathematicians unhappy
about some rule set out to show that it must be true (or false). In
failing to show that, they invented a new branch of mathematics.
I'd say programming is more like that. But I approach programming from
a mathematicians viewpoint.
<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
On Mon, 13 Jun 2005 20:27:46 -0400, rumours say that Roy Smith
<ro*@panix.com> might have written: Andrea Griffini <ag****@tin.it> wrote: Hehehe... a large python string is a nice idea for modelling memory.
Actually, a Python string is only good for modelling ROM. If you want to model read-write memory, you need a Python list.
This is a misquote, since Andrea paraphrased what Peter Maas said. It
was Peter that suggested string usage to model memory (and obviously
forgot momentarily about string immutability in python).
If you included (even better, read :) the rest of Andrea's paragraph, it
would be obvious that you actually agree with Andrea.
--
TZOTZIOY, I speak England very best.
"Dear Paul,
please stop spamming us."
The Corinthians
On Thu, 16 Jun 2005 14:29:49 +0100, rumours say that Tom Anderson
<tw**@urchin.earth.li> might have written: At one point, a friend and i founded a university to give our recreational random hackery a bit more credibility (well, we called ourself a university, anyway; it was mostly a joke). We called the programming department 'Executable Poetry'.
That's a good idea for a t-shirt:
"Python: executable poetry"
(kudos to Steve Holden for ma**************************************@python.org where the term PIPO
(Poetry In, Poetry Out) could be born)
and then, apart from t-shirts, the PSF could sell Python-branded
shampoos named "poetry in lotion" etc.
--
TZOTZIOY, I speak England very best.
"Dear Paul,
please stop spamming us."
The Corinthians
Christos TZOTZIOY Georgiou wrote: and then, apart from t-shirts, the PSF could sell Python-branded shampoos named "poetry in lotion" etc.
Which will once and for all solve the dandruff problem prevalent among the
snake community these days.
Not funny? know then that German has one term for both 'dandruff' and
'scale' (Schuppe).
Still not funny? at least you have learned some German.
Peter
On Tue, 28 Jun 2005 15:46:01 +0300, rumours say that Christos "TZOTZIOY"
Georgiou <tz**@sil-tec.gr> might have written: (kudos to Steve Holden for ma**************************************@python.org where the term PIPO (Poetry In, Poetry Out) could be born)
oops! kudos to Michael Spencer (I never saw Michael's message on my
newsserver, so I archived Steve's).
--
TZOTZIOY, I speak England very best.
"Dear Paul,
please stop spamming us."
The Corinthians
Peter Otten wrote: Christos TZOTZIOY Georgiou wrote:
and then, apart from t-shirts, the PSF could sell Python-branded shampoos named "poetry in lotion" etc.
Which will once and for all solve the dandruffs problem prevalent among the snake community these days.
And once again the Pythonistas will be known as snake-oil salesmen.