Bytes | Developer Community
What's better about Ruby than Python?

What's better about Ruby than Python? I'm sure there's something. What is
it?

This is not a troll. I'm language shopping and I want people's answers. I
don't know beans about Ruby or have any preconceived ideas about it. I have
noticed, however, that every programmer I talk to who's aware of Python is
also talking about Ruby. So it seems that Ruby has the potential to compete
with and displace Python. I'm curious on what basis it might do so.

--
Cheers, www.3DProgrammer.com
Brandon Van Every Seattle, WA

20% of the world is real.
80% is gobbledygook we make up inside our own heads.

Jul 18 '05
Roy Smith wrote:
DEBUG (cout << "I'm a debug statement\n";);

It looks ugly, but at least it indents correctly.


Why not simply

DEBUG(cout << "I'm a debug statement\n");

This should work unless your DEBUG does something strange.
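For contrast with the C++ macro being discussed, the usual Python idiom needs no macro at all: a plain function gated on the built-in `__debug__` flag (which is False when the interpreter runs with -O). A minimal sketch; the function name is illustrative, not from the thread:

```python
def debug_message(*args):
    """Return the formatted debug line, or None when debugging is off.

    __debug__ is a real Python builtin: True normally, False under -O,
    so the whole call can be optimized away in production runs.
    """
    if __debug__:
        return "DEBUG: " + " ".join(str(a) for a in args)
    return None

line = debug_message("I'm a debug statement")
```

Because it is an ordinary function call, editors and indenters handle it like any other code, which is exactly the property the C++ macro lacks.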

Jul 18 '05 #101
Alexander Schmolck <a.********@gmx.net> writes:
Harry George <ha************@boeing.com> writes:
In the Lisp world, you use the hundreds of macros in CL because they
*are* the language. But home-grown (or vendor-supplied) macros are
basically a lock-in mechanism. New syntax, new behavior to learn, and
very little improvement in readability or efficiency of expression
(the common rationales for macros).
How can you claim with a straight face that the sophisticated object, logic
programming, constraint programming, lazy evaluation etc systems people have
developed in scheme and CL over the years have brought "very little
improvement in readability or efficiency"?


When I want logic programming I go to Prolog, and I bind to/from
prolog with python. If I want lazy evaluation, I do it in python (see
e.g., xoltar). My concern is based primarily on experience with the
ICAD language where massive use of macros provide lazy evaluation at
the expense of an utterly different language. We are finding the
same KBE work can often be done cleaner and simpler in Python.

The issue is not "can I do it at all". Lisp is great for that. It is
rather "do I need a wholly new syntax".
The python language is just fine as is.


No it isn't. Like every other language I know python sucks in a variety of
ways (only on the whole, much less so), but I don't claim I know how to fix
this with a macro system. I'm just not sure I buy Alex's argument that an
introduction of something equivalent in expressive power to say CL's macro
system would immediately wreck the language.

The trick with adding expressiveness is doing it in a manner that doesn't
invite abuse. Python is doing pretty well in this department so far; I think
it is easily more expressive than Java, C++ and Perl and still causes less
headache (Perl comes closest, but at the price of causing even greater
headache than C++, if that's possible).


That's the point: Lisp macros invite abuse. They are wonderfully
powerful and expressive. And they therefore support invention of new
worlds which must be learned by others. Python (so far) resists the
"creeping featurism", yet is still valuable in a very wide array of
situations.

To make an analogy with natural languages: English is relatively
successful not just through economic dominance but also through paring
away nuances of grammar. Yes, there are times and places where French
or Sanskrit or Morse code are more potent languages, but for a large
set of communications problems, English works quite well.

(If you are worried I'm a language chauvinist, see:
http://www.seanet.com/~hgg9140/languages/index.html )
If you really, really need something like a macro, consider a template body
which is filled in and exec'd or eval'd at run time.
I've actually written a library where most of the code is generated like this
(and it works fine, because only trivial code transformation are needed that
can be easily accommodated by simple templating (no parsing/syntax tree
manipulations necessary)).
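The "template body which is filled in and exec'd at run time" approach that both posters describe can be sketched in a few lines. This is an illustrative reconstruction, not the actual library code being discussed; all names are invented:

```python
# Template-plus-exec code generation: fill in a source-text template and
# exec it to define a function at run time. Only simple string substitution
# is needed -- no parsing or syntax-tree manipulation, as the post says.
TEMPLATE = """
def {name}(x):
    return x {op} {operand}
"""

def make_function(name, op, operand):
    namespace = {}
    exec(TEMPLATE.format(name=name, op=op, operand=operand), namespace)
    return namespace[name]

double = make_function("double", "*", 2)
add_three = make_function("add_three", "+", 3)
```

(The 2003-era spelling would have used the `exec` statement and `%` formatting; the modern function form is shown here.)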

But show me how to write something like CL's series package that way (or
better yet, something similar for transforming array and matrix manipulations
from some reader-friendly representation into something efficient).


Why reimplement the series package? That is a good example of rampant
CL overkill. In Steele's CLTL2, it takes 33 pages to explain. It is
great for people who are in the language day in and day out, and can
therefore keep the whole shebang in their mental working set. For
anyone who has other commitments (e.g., me and 30 other engineers I
work with), the nuances of series are too complex for practical use.
In code reviews we have to bring out CLTL2 whenever someone uses any
of the fancy macros. You can get the same functionality with python
"for" or "while" and a few idioms.

As for array and matrix manipulation, I want a good C-based library
with python binding (e.g., gsl), at times helped by some syntactic
sugar (Numeric). What I don't need is a brand new language for matrix
manipulation (wasn't APL for that?). If you mean a human readable
treatment that can be converted to those libraries, I'd have to point
to MathML. If you mean the programming syntax itself looks like
vector math, I'd use Numeric overloads up to a point, but beyond that
people get confused and you (I at least) need explicitly named
functions anyway.
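The "overloads up to a point" trade-off is easy to see in miniature. A toy vector type (purely illustrative, not Numeric's actual implementation) can overload `+` and `*` so simple expressions read like vector math, while anything fancier is better served by a named function:

```python
# Operator overloading in the Numeric style: fine for simple arithmetic,
# but beyond a point explicitly named functions read better.
class Vec:
    def __init__(self, *xs):
        self.xs = list(xs)

    def __add__(self, other):
        return Vec(*[a + b for a, b in zip(self.xs, other.xs)])

    def __mul__(self, scalar):
        return Vec(*[a * scalar for a in self.xs])

    def __eq__(self, other):
        return self.xs == other.xs

v = Vec(1, 2) + Vec(3, 4) * 2   # reads like vector math -- up to a point
```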

'as


I'll concede that the macro issue is a personal taste sort of thing.
If you live inside a single mental world, you can afford to grow and
use fancy macros. If (like me) your day includes a dog's breakfast
of tasks, then the overhead is too great for the payoff.

--
ha************@boeing.com
6-6M31 Knowledge Management
Phone: (425) 342-5601
Jul 18 '05 #102
Roy Smith <ro*@panix.com> writes:
Here's a real-life example of how macros change the language. Some C++
code that I'm working with now, has a debug macro that is used like this:

DEBUG (cout << "I'm a debug statement\n";)

The problem is, emacs doesn't know what to do with the code above.


If you think that's a problem ... I have recently seen some code where
a DEBUG macro trampled all over an enum containing a DEBUG value. I'm
not too worried about Emacs not knowing what to do with the code, but
I get rather more upset when the compiler doesn't know what to do with
the code :-)

But this is all slightly off-topic. I don't think this part of the
thread is about C-style macros (which are fundamentally flawed and
useless <0.0001 wink>, and very boring), but Lisp-style macros, which
are to C-style macros what Emacs is to cat.
Jul 18 '05 #103
Alex Martelli <al*****@yahoo.com> writes:
Python's firmly in the "always half-open" field (pioneered in print,
to the best of my knowledge, in A. Koenig's "C Traps and Pitfalls",


I seem to recall seeing a scanned-in copy of a hand-written talk by
Dijkstra, on this.

[I suspect it predates Andrew's oeuvre ... but then probably doesn't
qualify on the "in print" requirement.]

Anyone know what I'm talking about ?
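Whatever the original citation, the "always half-open" convention itself is easy to demonstrate in Python: ranges and slices include the lower bound and exclude the upper, so adjacent ranges tile with no gap or overlap and the length is simply `stop - start`:

```python
# Half-open intervals in Python: s[a:b] takes indices a <= i < b, so
# adjacent slices partition a sequence cleanly.
s = "abcdefgh"
left, right = s[0:3], s[3:8]
assert left + right == s                 # no gap, no overlap
assert list(range(2, 5)) == [2, 3, 4]    # upper bound excluded
assert len(range(2, 5)) == 5 - 2         # length is stop - start
```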
Jul 18 '05 #104
Alex Martelli <al*****@yahoo.com> writes:
John J. Lee wrote:
...
I'd never noticed that. Greg Ewing has pointed out a similar trivial
wart: brackets and backslashes to get multiple-line statements are
superfluous in Python -- you could just as well have had:

for thing in things:
some_incredibly_long_name_so_that_i_can_reach_the_ end_of_this =
line

where the indentation of 'line' indicates line continuation.
I see somebody else already indicated why this isn't so.


Andrew Dalke? I just read that, and didn't see any contradiction of
Greg's idea, just a discussion of it. Or did you just mean 'it isn't
a wart'?

[...]
You may be right there. Guido's comment that tuples are mostly
mini-objects did clarify this for me, though.


Oh, I'm quite "clear" on the issue,


Didn't mean to imply otherwise.

[...]
In the end, though, don't you agree that any such judgement is, from
the pragmatic point of view, pointless once you realise how similar
the two languages are?


No, I entirely disagree. I do have to choose one language or the
other for each given project; I do have to choose which language
to recommend to somebody wanting to learn a new one; etc, etc.


Yes. My criterion is then simply: "Which language is more popular?"
rather than "Which is marginally better?". Well, strictly, it's
"which has better free library code, ng support, etc.", but that's
reasonably well-correlated with popularity (unless you're Japanese, in
which case, perhaps).

[...] Non-linguistic considerations such as the above may also have their
weight, in some case. But they're not huge ones, either.


I had the impression that the amount of good library code out there
for Ruby was small, which I view as more important than the language
features which have been discussed here (with the possible exception
of this retroactive builtin class modification business, if people do
use it -- still seems incredible, but from Andrew's post it seems
you're right to be repulsed by this). But maybe the Python <--> Ruby
bridge is good enough that that (library code) isn't such a problem.

about undistinguishable to anybody and interchangeable in any


*in*distinguishable. Ha! I found a bug.


You're wrong, see e.g. http://dict.die.net/undistinguishable/ :
the spellings with the initial "u" and "i" are just synonyms.


:-( Google reports > factor of 10 fewer hits for it than 'in', and
it's not in my little dictionary. I wonder if it's in the OED...

[...]
I mostly agree, but I think you could be accused of spreading FUD
about this. Does anybody *really* choose to alter string-comparison
semantics under everybody else's nose in Ruby?? That would be like


As I see others indicated in responses to you, this is highlighted
and recommended in introductory texts. So why shouldn't many users
apply such "big guns"?

[...]

That is indeed strange.
John
Jul 18 '05 #105
The anaconda was much bigger, louder and enormous than anything
the snakes ever experienced in their humble but peaceful abode.

The anaconda saw his chance to make it big in the snake community.
It began hissing violently in all directions.

....And the snakes went back to their burrows leaving the
anaconda to regurgitate its undigested food...

Waiting for the hissing to die...

-Anand
Cliff Wells <cl************@comcast.net> wrote in message news:<ma**********************************@python.org>...
On Tue, 2003-08-19 at 02:03, Brandon J. Van Every wrote:
Cliff Wells wrote:
To try to force those battles is about as reliable an indicator
of trolling as is "URGENT: YOUR IMMEDIATE ASSISTANCE REQUIRED" is a
reliable indicator of Nigerian spam. It should come as no surprise to
you that people mistake your posts as trolling. They may be
incorrect, but it is certainly a reasonable assumption.


I hate to use an Eep-ism, but it's time for people to evolve. "You sound
like a troll" is not that reasonable an assumption. Outside of politics
newsgroups, I've almost never seen actual trolls.


I've seen several, mostly on the Linux lists, but c.l.py hasn't been
immune (lumberjack, you there?).
I've seen plenty of
people insulting each other, totally unable to see the other person's point
of view. I've seen countless instances of Troll Hunting all over Usenet
that had nothing to do with me. In fact, it's clearly a popular newsgroup
sport! People are intellectually lazy, they don't have an ironclad
criterion for what is or isn't a troll. A troll is anything they don't
like. They don't use imagination or exercise benefit of the doubt when
controversial posts appear. They just enjoy naming things within a familiar
pigeonholing system. It comforts them. They are so focused on controlling
group discourse, and not letting "the wrong kind of conversation" happen,
that they don't put much thought into what was said in the first place.


Have you ever considered that perhaps it is your own expectation of a
"Troll Hunt" that makes you see such a thing when there is at least a
remote chance you're looking for something that isn't there? In my two
years on c.l.py I've never seen any such thing nor heard anyone else
complain of such until now. If you give up the fantasy that people are
out to get you then perhaps you might see how your own actions
precipitated today's events. It's certainly true that people were quick
to label your thread on Python vs. Ruby as a troll. However I also
expect much of that was due to the lingering effects of your prior thread
on Python vs C#.
Were you known to be a contributor, either with code or
knowledge, your forays into time-wasting speculation would most likely
be much better received. As it is the only "contribution" you've made
is a generous use of other people's time and bandwidth.


As I see it, my detractors are enginerds with no appreciation for or skill
at marketing.


Quite probably true. Also most of them are probably proud of it. This
isn't a marketing group, it's a programming language discussion group.
I'd venture that with few exceptions most people here probably despise
marketing or at best find it irrelevant. That is why there is a
separate ng for discussion of such things. Most people here don't want
to hear about it, even in the sense that it relates to Python.
Nor, probably, much awareness of Python marketing efforts.


But the ones that do have hopefully found the appropriate forum.
That's how I figure you can (1) blithely ignore my analysis of Reality On
The Ground, (2) merrily claim I'm a do-nothing. I've suggested you look at
the marketing-python archives several times, it's a comment I've peppered
here and there. Have you?


I browsed it and in fact considered responding to some of your claims
there, which I found to be baseless (or at least misguided), but as I am
one of the people who find marketing boring I didn't bother.
The fact that you get "kills" in every ng you spend time in probably
says more about you than other people.


Actually, it says more about where I am in life. There was a time when I
never killfiled anybody. If I was rude, and we got into fights, we'd just
keep fighting. Often we'd kiss and make up. But sometimes, I'd get an
intractable, unforgiving, obsessive compulsive fixated asshole on my ass for
years. It drove me out of comp.games.development.* for a time, the
hierarchy I created. I got fed up with being told that I was a troll,
seeing as how I'd been Newsgroup Proponent. What the hell is that, the
ultimate trolling? I've never trolled in my life. Abrasive, argumentative,
violent, sure. Trolling, never.


Fair enough. But would you find it odd if you were to slip on over to
alt.culture.black and mention that you think affirmative action was a
mistake and people got riled up? That some of them would label you
racist? Of course not. In fact, if that were your position, you would
expect such a response and preemptively address that response before the
flamewar began. You should try the same thing with less volatile
topics. Instead of assuming that your point of view is obvious (it
isn't) or that people have read your posts in other groups (they
haven't), simply provide a supporting argument for such remarks. Trust
me, it goes a long way.
Anyways, a few years later I came back. I've been summarily killfiling
people ever since. It does wonders for my long-term interaction with people
and groups. I don't have to deal with the people I can't work with.


That actually doesn't sound confrontational. In fact, IIWAP, I'd
probably label it passive aggressive.
Personally I've never
killfiled anyone (not even the ruebot[!]) and to my knowledge, I've
yet to be
killfiled (although at times I'm a bit surprised at that, today being
one of those times).


It's really very simple. Your behavior is not the same as other people's
behavior. That's why we're still talking.
Why then, are you surprised when people choose to fight with you?


Why do you say I'm surprised? This is all very predictable to me. Seen at
this moment in time, you are a static type within this system. You just
happen to be a benevolent, workable type. You are the "Well, are you going
to killfile *me* too?" type.


I'm just the type who finds the entire process rather boring. I enjoy a
good argument (heated or otherwise) a bit more than the next guy.
However, I find irrational behavior a big yawn. I see no point in
investing a lot of work trying to make someone look foolish who is going
to do the job on his own anyway. I respect people who make their
knowledge and work available to others without belittling them. I
respect people who are able to articulate their arguments whether I
agree with them or not. In general, I'm willing to give people the
benefit of doubt which is why I've never killfiled anyone.
The kind of person who doesn't behave in a way
that I would killfile, and who doesn't quite understand how his behavior is
different from the people I do killfile. But, who probably will understand
by the time I get done explaining it.


Actually, what surprises me is that several of the people you killfiled
are some of the more reasonable people on the list. I'm a little
concerned that I wasn't among them <wink>
Have you seen Enter The Matrix? The scene with The Architect is very
profound. The Matrix demands Neo. Someone else observed, this stuff is all
about young males. That's no accident, it's a feature of our genetics. Our
genetics demand certain confrontations at a certain age. In our lives, we
obey certain Grand Strategies, even if we are sapient and exercising choice
over the details.
Taking your statements together
would seem to indicate that a fight is what you want.


That's sorta true. This is really what I call a "posting binge," to avoid
other things I don't want to be doing.


So for entertainment value you stir up groups where people are trying to
help others get their jobs done? That might be a bit of a cheap shot,
as I doubt you consciously intended to initiate a wild flamewar, but
seriously, your binge took valuable time from people who others depend
on. When I post a question to c.l.py, it's usually because I've got
something I need done now and don't have a ton of time to tinker around
or RTFM.
I'm an intellectually violent
person, I like a good debating scrap. But I've got Rules Of Engagement
about what I think is appropriate, intellectually defensible debate.
"You're a troll!" instead of hearing someone else's opinion ain't it.


Ah, you've done the same, albeit in a different fashion. "You're a
troll" certainly doesn't constitute a grand argument. But your approach
is to selectively respond to arguments and sprinkle in the odd flamebait
(or, as you would label them: things people don't want to hear). This
is why I had you pegged (momentarily) as a clever troll. It appeared as
if you were using just enough reasonable debate to disguise the
unreasonable assertion.
Those
people, I want to punish. It's irrational, because intellectually I know I
can't actually change them. Rationally, I can killfile them. Possibly, I
can stimulate cumulative change in the system, but I am uncertain of it. In
10 years maybe I will have let it go.


This sounds good on the surface, and I can certainly understand phases
in one's life. Nevertheless, acknowledging a shortcoming doesn't excuse
it (please don't repeat that to my girlfriend, I don't want it used
against me later <wink>). Besides, you are still assuming that it is
others who need to change. I can assure you that even were that so, it
is an exercise in futility. Until you can cease to react to others, it
is somewhat hypocritical to explain that they shouldn't react to you.
For instance, I'm reasonably sure I can stimulate minor change in your own
static type. You understand enough of the dimensions already that you're
capable of absorbing more of them. But you'll have probably gotten there on
your own soon anyways, so am I really needed as a stimulus?


That would assume that 1) I desire such change, and 2) that such change
would be desirable.
Actually, I suspect I'm not for your benefit, if I am for anyone's benefit.
I suspect it is for someone who is lurking. Maybe someone who hasn't formed
much of a world view yet. Mine, at any rate, is in my .sig.


Well, certainly there is some benefit for someone, but I'm less than
certain that the lesson is necessarily what you would expect.
Regards,

Jul 18 '05 #106
On Tue, 2003-08-19 at 06:36, Peter van Merkerk wrote:
- GUI and tools support can end up being more important than language niceties.


Absolutely. This is one of the main reasons languages like Ruby and
Haskell aren't in my toolkit. wxRuby appears to be in the pre-alpha
stages and wxHaskell doesn't exist at all.


http://wxhaskell.sourceforge.net


Feh. I should have known =) It seems nearly every language is
gathering a wx interface. What's next, wxC#?

Regards,

--
Cliff Wells, Software Engineer
Logiplex Corporation (www.logiplex.net)
(503) 978-6726 (800) 735-0555
Jul 18 '05 #107
Brandon J. Van Every <va******@3DProgrammer.com> wrote:
- Python is not a good language for low-level 3D graphics problems. C++ is
better.
Well, not just for low-level 3D graphics. There are many other things
you would not code in Python unless you are a complete fool. Your
company's mail server, for instance. But you wouldn't do that in C#
either. Though there probably *are* people or companies who would even
manage to sell such stuff. Wonderful Windows world.
- Python isn't a rapid prototyping godsend for every problem out there.
But let me ask you, what is it that you are actually looking for? A
language for each and "every problem out there"? Can there be such a
thing? I doubt it. For me, it's just one out of about three and a half languages
I use for my daily work (that is, Python, C/C++, Bourne-Shell, in that
order, plus a bit of Fortran where it really can't be avoided). I find
that Python is very handy and I may say that I now use it for more than
50% of my work. But I wouldn't unnecessarily force myself to use it for
everything.
- Microsoft C# and .NET are strategic threats to Python's prosperity and
vitality, if not survival.
Only time will tell. But let's face it: For Python it will be difficult
to stand against the massively advertised, fashionable M$ products,
that everybody believes (s)he needs. Under Windows, of course. Unix is
a different domain.
- Yet unnamed languages are capable of taking Python's place on the food
chain.
Let's not speculate. Let them come. We'll evaluate then.
- You have to promote your language, grow its market share, to prevent that
from happening.
Granted that my motivation to use Python is different from that of many
others here. I have work to do, and to finish it as quickly as
possible. I figured that Python is a great language for people like me,
who are not full-time programmers but who need an easy to learn language
for daily quick-and-dirty scripting. Thus, it doesn't matter to me if
Python has 1 or 10 Million users. Though I am happy about every
colleague I can "convert". Because I know from experience that it's a
good thing and that for others it may be as good as it has proven for
myself. That doesn't necessarily mean that there is nothing better out
there. In the environment that I am working in (a M$ free one), Python
is a good choice. Possibly Ruby would be about as good, and despite
having looked at both, I liked Python a lot more, but that's more a
matter of my personal taste. I also used Perl from time to time but
never got to like it. In contrast to Python.
These analyses are called "sanity" in the Windows world. It's Reality On
The Ground. It's really odd to come to a developer community where people
are insular and freak out about these kinds of pronouncements. It makes
Windows people wonder if Python people are capable of industrial relevancy.
As stated, I don't care. If one day I have to learn C# or .NET, it'll
be early enough. But unless I am forced to, I happily stay away from
it.
The rest of you: wake up! Smell the Windows!


I never particularly liked that stench. Do you?

BTW: What exactly has "Smell the Windows" to do with the current Ruby
vs. Python debate? Are you actually serious about learning any of
these, or are you trying to convince yourself that it was right not to
ever try them out?

Your attitude reminds me of somebody not wanting to try the taste of
say, cheese, because it smells so bad, doesn't look great, etc. But who
always goes into the swiss restaurants asking around "What's so good
about cheese, why don't you better eat potatoes?" What will the people
tell that guy?

I am not saying that you are trolling, and in the recent discussions
you actually made some good points. But you obviously lack the sincere
will to actually become able to make a competent judgement yourself.
That's annoying.

Ramón
--
Horum omnium fortissimi sunt Costarricenses
Jul 18 '05 #108
jj*@pobox.com (John J. Lee) writes:
read everything slowly, chew the cud, *then* start writing. I read
the whole of Stroustrup and a couple of other books before writing a


I couldn't get past the first chapter of Stroustrup... Maybe because
I'd already been working with C++, it read like a pile of
justifications for what I consider a bastardization of an elegant
foundation. (So, I like C but dislike C++, what can I say?)

Nick

--
# sigmask || 0.2 || 20030107 || public domain || feed this to a python
print reduce(lambda x,y:x+chr(ord(y)-1),' Ojdl!Wbshjti!=obwAcboefstobudi/psh?')
Jul 18 '05 #109
"Alex Martelli" <al*****@yahoo.com> wrote in message
news:bh*********@enews3.newsguy.com...
Thus, the direct Python equivalent might be

import __builtins__
__builtins__.len = someotherfunction

and THAT usage is very specifically warned against e.g. in
"Python in a Nutshell" -- you CAN do it but SHOULDN'T (again, one
can hope to eventually see it "officially" deprecated...!).


I agree with your position, more or less, but I am curious - how far do you
think this deprecation should go? Just changing a function in __builtins__?
Changing a function in any module? Or changing *anything* about a module
from the outside?

Thanks,
Dave
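The rebinding Alex warns against is easy to demonstrate, and seeing it makes clear why it's dangerous: the change is process-global, visible to every module. A sketch, shown with the modern `builtins` module (Python 2 spelled it `__builtin__`; the `__builtins__` name in the quoted post is a per-module implementation detail). Don't do this in real code:

```python
# Rebinding a builtin: every lookup of `len` anywhere in the process now
# falls through to the hijacked version. Restore it immediately.
import builtins

original_len = builtins.len
builtins.len = lambda obj: -1      # globally replace the builtin
hijacked = len("abc")              # resolves via builtins: -1, not 3
builtins.len = original_len        # undo the damage
restored = len("abc")              # back to 3
```

This global reach, rather than any per-module effect, is what makes the idiom worth deprecating.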

Jul 18 '05 #110
On 18 Aug 2003 16:38:42 -0400, aa**@pythoncraft.com (Aahz) wrote:
Oh, come on, Brandon is *much* less of a troll than T*m*t*y R*e.


Dangerously close to my name -- I hope it's not a typo and meant to
/be/ my name!
Jul 18 '05 #111
On 18 Aug 2003 22:17:12 +0100, jj*@pobox.com (John J. Lee) wrote:
O'Caml, not to mention a bunch of others -- if only Python didn't do
almost everything so well, there'd be more motivation...


Yep, that's the one. I was up and running producing useful code in
Python within hours, and though I'd go to other languages for
particularly specialised tasks Python does pretty much everything I
want, does it well and does it easily. On the other hand, I've tried
a few times to learn Ruby and never got to the point at which I could
produce anything useful. I think it's because I've never used Perl
and have not used Un*x much. Ruby, it seems to me, very much shows
those roots and so seems to me to have less of a general appeal.
Jul 18 '05 #112
On Mon, 18 Aug 2003 21:48:13 -0400, Lulu of the Lotus-Eaters
<me***@gnosis.cx> wrote:
That said, I feel a little bad for Tim Rowe. He's posted many
interesting and relevant articles. But for the last year or so, my
finger is always halfway to the delete button before I realize that a
post is by this nice guy with an unfortunately similar name :-).


Phew! Thanks for that. And it's a relief, because I was unaware of my
(near) namesake and I was worried I had done something horrid! Would
it help if I configured my newsreader to put my middle initial ("G")
in?
Jul 18 '05 #113
Alex Martelli <al*****@yahoo.com> writes:
Alexander Schmolck wrote:
...
To recap: usually, if I change a class I'd like all pre-existing instances
to become updated (Let's say you develop your program with a running
If you CHANGE a class (mutate a class object), that does indeed happen.


Indeed. But see below.

You seem to have some deep level of confusion between "changing an
object" and "rebinding a name (that happened to be previously bound
to a certain object) so it's now bound to a different object".
I don't think I'm confused about this.

Rebinding a name *NEVER* has any effect whatsoever on the object that
might have, by happenstance, previously been bound to that name (except that,
when _all_ references to an object disappear, Python is free to make
that object disappear whenever that's convenient -- but that doesn't
apply here, since if a class object has instances it also has references
to it, one per instance).
Is your assumption of my ignorance regarding the difference between
mutation/rebinding maybe based on the implicit assumption that a class
statement by some sort of logical necessity *has* to create a new class object
and then bind a name to it?

Is there a reason why a class X(...):... statement can't just mutate a
preexisting, former version of the class X (provided there is one?). Maybe
there is, in which case let me state that I'm not particularly hung up on
this, what I'm hung up about is the ability to easily update the behavior of
pre-existing instances.

How exactly this happens (by mutating the class in question (by whatever
means, a new class statement, some function calls, __dict__-assignment), by
rebinding the .__class__ for all instances etc.) is of less importance to me
(although I suspect the 2nd alternative would cause all sorts of problems,
because the old (viz non-'id'entical) class object could also hang around in a
couple of other places than .__class__, .__bases__ etc.).

This rule is universal in Python and makes it trivially simple to
understand what will happen in any given occasion -- just as long
as you take the minimal amount of trouble to understand the concepts
of names, objects, binding (and rebinding) and mutation. If a language
has no such clear universal rule, you're likely to meet with either
deep, complex trouble, or serious limitations with first-classness
of several kinds of objects. In Python everything is first-class,
and yet there is no substantial confusion at all.

Repeat with me: rebinding a name has just about NOTHING to do with
mutating an object. These are VERY different concepts. Until you
grasp them, you won't be a very effective programmer.
Agreed, but I think that happened some time ago.
python session; you notice a mistake in a method-definition but don't want
to start over from scratch; you just want to update the class definition
and have this update percolate to all the currently existing instances.
I assume that by "update the class definition" you mean that you want
*necessarily* to use some:

class X: ...

statement that defines a NEW class object (which may happen to have
the same name as an existing class object -- quite irrelevant, of
course).


I simply want some effective mechanism to have changes in selected parts of my
sourcecode adequately reflected in a running interactive session, without
much fuss.

I'm happy to write some elisp and python code to do some extra work on top of
py-execute-region on a key-press. By "adequately reflected" I mean amongst
other things that I can easily change the behavior of existing instances of a
class after I made changes to the source code of that class.
OK, then, so arrange (easily done with some function or
editing macro, if you have a decent editor / IDE) to do the following:

oldX = X # save the existing classobject under a new temporary name

class X: ... # define a completely new and unrelated classobject

oldX.__dict__.clear()
oldX.__bases__ = X,

# in case you also want to change the old class's name, you might:
oldX.__name__ = X.__name__ # so you're not constrained in this sense

# optionally, you can now remove the temporary name
del oldX

There. What's so hard about this? How does this fail to meet your
desiderata?
I tried this some time ago and discovered it only worked for old style
classes; the dict-proxies of new-style classes are, I think, not directly
mutable, I'd guess for efficiency reasons.

Here is an example of the difference (update, clear, del won't work either,
BTW):
>>> class X:
...     def bar(): "bar"
...
>>> X.__dict__['bar'] = lambda: "foo"     # fine for a classic class
>>> class X(object):
...     def bar(): "bar"
...
>>> X.__dict__['bar'] = lambda: "foo"
Traceback (most recent call last):
TypeError: object does not support item assignment
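The read-only dict-proxy needn't rule out in-place class updates in today's Python, though: instead of clearing the old class's __dict__, one can copy the edited class's attributes onto the old class object with setattr(), which works even for new-style classes. A minimal, hedged sketch in modern Python (class and method names are invented for illustration):

```python
class X:
    def greet(self): return "old behaviour"

oldX = X          # keep a reference to the pre-existing class object
x = oldX()        # an instance created before the edit

class X:          # the freshly edited definition
    def greet(self): return "new behaviour"

# Copy the new attributes onto the old class object; __dict__ and
# __weakref__ descriptors are not writable on classes, so skip them.
for name, value in vars(X).items():
    if name not in ('__dict__', '__weakref__'):
        setattr(oldX, name, value)

print(x.greet())  # -> new behaviour: the old instance sees the update
```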
Wouldn't it have been easier to ask, in the first place, "I would like to
obtain this effect, is there a Python way" (being a BIT more precise in
describing "this effect", of course !-), rather than whining about "Python
can't do this" and complaining of this as being "unreasonable"?
Agreed, this was an unproductive way to describe my problem. Mea culpa.

Although I have investigated the problem before (and for the reason stated
above, settled for the suboptimal reassigning to .__class__ when I
*desperately* wanted to change important instances), on second thoughts not
only should I have phrased myself differently, I should also have spent more
time thinking about it.

I *think* it ought to be possible to effect all the necessary changes by
directly manipulating attributes, rather than going via .__dict__, even for
class/staticmethods.

It's too late now, so I haven't fully thought this through yet but I'll give
it a try tomorrow.

Anyway, I still don't see a compelling reason why class statements
couldn't/shouldn't be mutating, rather than rebinding. Is there one?

AFAIK doing this in a general and painfree fashion is pretty much
impossible in python (you have to first track down all instances -- not an
I personally wouldn't dream of doing this kind of programming in production
level code (and I'd have a serious talk with any colleague doing it -- or
changing __class__ in most cases, etc).


I'd be interested to hear your reasons. (The only argument against this style
that I can think of off the top of my head is that it might encourage sloppy
code development and unwitting reliance on irreproducible artifacts of the
session history.)

I don't think this problem occurs in practice, however. If I write
(non-throwaway) code, I always write plenty of testcode, too, (which, BTW I
also find *much* more pleasant to develop and debug interactively) and
frequently start new python processes to run this testcode in fresh
sessions to satisfy myself that things don't just appear to work because of
accumulated garbage in my interactive session.

[snipped] I don't think the fact that nobody complains is strongly connected to the
fact that most Python users grasp the possibility of updating a class by
modifying its __dict__ etc; indeed, one great thing about Python is that
most users typically tend to SIMPLICITY rather than "cleverness" -- if they
wanted to modify a class X's method Y they'd code it as a function F and
then simply do X.Y=F -- extremely simple, rapid, effective, and no need
to muck with redoing the whole 'class' statement (a concept I personally
find quite terrible).
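The X.Y=F idiom described above takes only a few lines to see in action (the names X, Y, and F are illustrative, as in the text):

```python
class X:
    def Y(self): return "old"

x = X()              # instance created before the update

def F(self):         # the replacement method, written as a plain function
    return "new"

X.Y = F              # a single assignment updates the class in place

print(x.Y())         # -> new: the pre-existing instance sees the change
```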
I think we need to sharply distinguish between two things (and I should
presumably have stressed this more in my last post):

1. Redefining methods of class during execution, as part of the solution to
the problem the program is supposed to address.

2. Redefining classes as part of an incremental development process. In this
case, you of course "redo" the whole class statement, because you edit the
source code with the class statement and everything else in it and then the
most convenient way to update your code is to send all or part of this
newly edited file to your interactive session (plus potentially some
tweaking code that mutates the pre-existing "live" version of X as you
outlined above, or somehow else effects the changes you want to take place
in existing instances).

I'm chiefly interested in 2. (to the extent that I consider it an essential
ability of a good programming language) and much less so in 1. (although
modifying instances and classes can also sometimes be useful).

Rarely does one want to modify an existing class object any more 'deeply'
than by just redoing a method or two, which this utterly simple assignment
entirely solves -- therefore, no real problem.

If your style of interactive programming is so radically different from
that of most other Pythonistas, though, no problem
(I wonder whether it really is -- quite possibly -- and if so, why? Developing code
interactively just seems like the obvious and right thing to do to me.)
put some effort in understanding how things work (particularly the
difference between *MODIFYING AN OBJECT* and *REBINDING A NAME*), Python
supports you with power and simplicity in MOST (*NOT* all) metaprogramming
endeavours.

If you DO want total untrammeled interactive and dynamic metaprogramming
power, though, *THEN* that's an area in which Ruby might support you
even better
No it doesn't, for example Ruby doesn't have docstrings which is an
unnecessary waste of my time (apart from the fact that Ruby offends my
sense of esthetics).
-- for example, this Python approach would *NOT* work if
oldClass was for example str (the built-in string class -- you CANNOT
modify ITS behavior). Personally, for application programming, I much prefer clear and
well-defined boundaries about what can and what just CANNOT change. But if
you want EVERYTHING to be subject to change, then perhaps Ruby is exactly
what you want.
Nope, in that case I'd almost certainly use smalltalk which I consider
infinitely superior to ruby (or possibly some Lisp).
Good luck with keeping track of what names are just names (and can be freely
re-bound to different objects) and which ones aren't (and are more solidly
attached to the underlying objects than one would expect...).


I don't think we really fundamentally disagree. I don't think one should screw
around with everything just because one can and I also think that languages
that set inappropriate incentives in that direction are less suitable for
production software development.

'as
Jul 18 '05 #114
Nick Vargish <na*******@bandersnatch.org> writes:
jj*@pobox.com (John J. Lee) writes:
read everything slowly, chew the cud, *then* start writing. I read
the whole of Stroustrup and a couple of other books before writing a


I couldn't get past the first chapter of Stroustrup... Maybe because
I'd already been working with C++, it read like a pile of
justifications for what I consider a bastardization of an elegant
foundation. (So, I like C but dislike C++, what can I say?)


I agree. In C++, where you see a book section heading like "Feature
X, Feature Y and Feature Z", it means that X, Y and Z have some
horrendous interactions that you have to commit to memory. In Python,
it means that X, Y and Z are so simple they all fit in one section ;-)

I do wonder if the tight constraint on C++ of being C+extra bits was
ever really justified.
John
Jul 18 '05 #115
"Andrew Dalke" <ad****@mindspring.com> writes:
Alexander Schmolck:
No it isn't. Like every other language I know python sucks in a variety of
ways (only on the whole, much less so), but I don't claim I know how to fix
this with a macro system.


What about the other way around? Make a macro for Lisp or
Scheme which converts Python into the language then evals
the result?


Actually, at least one person has been working on this for scheme (I've never
heard about it again and he targeted about the most useless scheme
implementation around).

One thing that makes such an attempt fairly unattractive for anyone with
finite amounts of time is that python isn't that much use without its
supporting C/C++ modules, and schemes/lisps suck in the FFI department (every
lisp/scheme has its own way of interfacing to C).

Given how easy it is to parse Python (there are several Python
parsers for Python) and the number of people who have popped
up with Lisp background, I'm surprised no one has done that
for fun. After all, there is Python for C, Java, .Net, and for
Python (PyPy) and variations like Pyrex and Vyper. But
none for Lisp?
Would certainly be interesting.

(I think I remember mention of one some years ago, .. I think
*I* posted that link to c.l.py, but I don't remember when and
can't find it via Google.)
But show me how to write something like CL's series package that way (or
better yet, something similar for transforming array and matrix manipulations
from some reader-friendly representation into something efficient).


The Boost code for C++ suggests a different way to do the latter.
(I don't think templates are the same as hygienic macros.)


Could you say a little bit more about it? In python I think one could to some
extent use operator overloading and a special expression class that
simplifies the literal expression the programmer stated, but I guess one
would then have to explicitly request evaluation (and operator overloading isn't
quite powerful enough to easily incorporate 'alien' class-instances, too).

Is the C++ code something along those lines, or different?
'as
Jul 18 '05 #116
John J. Lee, refering to Alex, refering to John J. Lee.:
where the indentation of 'line' indicates line continuation.


I see somebody else already indicated why this isn't so.


Andrew Dalke? I just read that, and didn't see any contradiction of
Greg's idea, just a discussion of it. Or did you just mean 'it isn't
a wart'?


I'm pretty sure he meant me. The use of a ":" is not required by the
language. Guido said it's there to provide extra visual clue about
the end of the line, and Tim added that it makes python-mode and
other such tools easier to write because that makes it easy to
tell when a block starts.

My complaint is that that option means it's harder to tell

if abcdefghijkl+
f()
g()

goes over several lines. Computationally tractable, but
poor for readability.
You're wrong, see e.g. http://dict.die.net/undistinguishable/ :
the spellings with the initial "u" and "i" are just synonyms.


:-( Google reports > factor of 10 fewer hits for it than 'in', and
it's not in my little dictionary. I wonder if it's in the OED...


I would shy away from the "undistinguishable" form because

"He lead an undistinguished life"

is more synonymous (in my head - dictionaries notwithstanding)
to "bland", "boring", and "uninteresting" while "indistinguishable"
is "cannot be told apart from other items."
As I see others indicated in responses to you, this is highlighted
and recommended in introductory texts. So why shouldn't many users
apply such "big guns"?

[...]

That is indeed strange.


Additionally, I think back to the Smalltalk 80 book I
hazily remember reading, and I recall it also suggesting that
'open' class definitions are a good thing. Ruby's blocks have
strong Smalltalk heritage, which I believe also influenced
Ruby's choice in this.

Andrew
da***@dalkescientific.com
Jul 18 '05 #117
On Tue, 2003-08-19 at 11:50, Alex Martelli wrote:
X.test is not a function -- it's an unbound method; to call it, you
MUST therefore pass it at least one argument, and the first argument
you pass to it must be an instance of X (or of any subclass of X).


Too true, I think I thought of classmethods/staticmethods when I posted
that... Guess that makes much more sense... It was early in the morning,
forgive me... ;)

And for the rest, I don't find it bad to specify self as a first
parameter, anyway, as I can then always grab it using *args.

But, okay, I'll refrain from posting any code tonight, guess I'll get it
all wrong again. ;)

Heiko.
Jul 18 '05 #118
Alexander Schmolck, responding to me:
from Q import X

class X:
def amethod(self): return 'ah, now THIS is better!'

Should it refine the X in the local namespace or make a new
class?


I'd say redefine X in the local namespace. Have you spotted a problem with it?

I'm thinking about problems like this

# module a.py
class X:
    def a(self): print "qwerty"

# module b.py
from a import X

def redo():
    global X
    class X:
        def a(self): print "asdfg"

# interactively
from b import X, redo

x = X()
redo()
print x.a()
print X().a()
Does the interactive loading of X get a copy of b.X,
which is a copy of a.X?

Because of the 'global X', will the redefinition
of X in redo() change the module definition?
What's the output from the print statements?

In short, it would be the only thing in Python which
rebinds a local variable in-place, and I don't have
good feelings about the consequences.
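For contrast, under Python's actual rebinding semantics the answers to these questions are unambiguous. Here is a hedged single-file simulation of the two-module puzzle above, using types.ModuleType in place of real files and a plain attribute assignment in place of 'global X' (all names mirror the puzzle; the mechanics are a stand-in for real imports):

```python
import types

# build module 'a' containing class X
a = types.ModuleType("a")
exec("class X:\n    def a(self): return 'qwerty'", a.__dict__)

# build module 'b': 'from a import X' plus redo()
b = types.ModuleType("b")
b.X = a.X                      # 'from a import X' copies the binding
def redo():
    class NewX:
        def a(self): return 'asdfg'
    b.X = NewX                 # rebinds only the name b.X
b.redo = redo

# 'from b import X, redo' copies b's bindings into this namespace
X, redo = b.X, b.redo
x = X()
redo()
print(x.a())      # 'qwerty' -- the instance still references the old class
print(X().a())    # 'qwerty' -- our local name X was never rebound
print(b.X().a())  # 'asdfg'  -- only b's own binding changed
```

Because class statements rebind names rather than mutate objects, no binding changes behind anyone's back; the in-place proposal would give different answers.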

Andrew
da***@dalkescientific.com
Jul 18 '05 #119
Alexander Schmolck:
One thing that makes such an attempt fairly unattractive for anyone with
finite amounts of time is that python isn't that much use without its
supporting C/C++ modules schemes/lisps suck in the FFI department (every
lisp/scheme has its own way of interfacing to C).
If Parrot gets enough Python working for the challenge, then someone
could write a Lisp-y language targeting Parrot, and take advantage
of the work of others.

For that matter, there's Lisps on the JVM, no? Could support
Jython.
Could you say a little bit more about it? In python I think one could to some
extent use operator overloading and a special expression class that simplifies
the literal expression the programmer stated, but I guess one would then have
to explicitly request evaluation (and operator overloading isn't quite
powerful enough to easily incorporate 'alien' class-instances, too).

Is the C++ code something along those lines, or different?


Different. The problem with the operator overloading approach is
the creation and description of intermediate objects. If you do

a = b + c * d

then "c*d" makes an intermediary temporary.

Template expressions solve it by providing a description of
how to do the add, early enough that the compiler can optimize
based on the whole expression. Eg, for vectors, the compiler
could generate code equivalent to

for (i=0; i<n; i++) {
a[i] = b[i] + c*d[i];
}

Here's an old reference
http://infm.cineca.it/infm_help/parallel/poop/KAY.html
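For comparison, here is a rough Python analogue of the expression-template idea: operator overloading builds up a description of the whole expression, and the elementwise loop runs once at the end, with no intermediate vectors materialized. All class names here (Expr, Vec, Add, Mul) are invented for illustration:

```python
class Expr:
    def __add__(self, other):
        return Add(self, other)          # record the add, don't do it
    def __rmul__(self, scalar):
        return Mul(scalar, self)         # record the scaling

class Vec(Expr):
    def __init__(self, data): self.data = list(data)
    def __getitem__(self, i): return self.data[i]
    def __len__(self): return len(self.data)

class Add(Expr):
    def __init__(self, left, right): self.left, self.right = left, right
    def __getitem__(self, i): return self.left[i] + self.right[i]
    def __len__(self): return len(self.left)

class Mul(Expr):
    def __init__(self, scalar, vec): self.scalar, self.vec = scalar, vec
    def __getitem__(self, i): return self.scalar * self.vec[i]
    def __len__(self): return len(self.vec)

b, d = Vec([1, 2, 3]), Vec([4, 5, 6])
c = 10
expr = b + c * d    # builds Add(b, Mul(10, d)); nothing is computed yet
a = Vec(expr[i] for i in range(len(expr)))   # one loop, one result vector
print(a.data)       # -> [41, 52, 63]
```

Unlike C++ template expressions, this trades compile-time code generation for run-time dispatch, so it avoids the temporaries but not the interpreter overhead.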

Andrew
da***@dalkescientific.com
Jul 18 '05 #120
"Fredrik Lundh" <fr*****@pythonware.com> writes:
Jacek Generowicz wrote:
Python's firmly in the "always half-open" field (pioneered in print,
to the best of my knowledge, in A. Koenig's "C Traps and Pitfalls",


I seem to recall seeing a scanned-in copy of a hand-written talk by
Dijkstra, on this.


this one?

"Why numbering should start at zero"
http://www.cs.utexas.edu/users/EWD/ewd08xx/EWD831.PDF


Bingo !

Thanks.
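The invariants Dijkstra argues for in that note can be checked directly on Python's half-open slices:

```python
s = list(range(10))
i, j, k = 2, 7, 5
assert len(s[i:j]) == j - i      # length is just the difference of bounds
assert s[:k] + s[k:] == s        # adjacent half-open ranges abut exactly
assert list(range(0, 0)) == []   # the empty range needs no special case
print("half-open invariants hold")
```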
Jul 18 '05 #121
On Mon, 18 Aug 2003 18:16:07 -0600, "Andrew Dalke"
<ad****@mindspring.com> wrote:
Doug Tolton
I don't agree at all. Yes when you are defining a macro you are in
essence defining a new mini-language. This is perhaps one of the most
powerful features of Lisp. Programming closer to the application
domain, *greatly* enhances both the readability and the reusability of
code.
For that domain. And rarely does the author of a package,
much less a macro, understand "the domain as understood by other
people" vs. personal understanding.

It depends what you are talking about. If you are talking about
making some large cross industry library I might be inclined to agree,
but when it comes to building good high level abstractions within a
company, this argument doesn't make sense. Any feature has to be used
in the proper context for it to be useful, Macros are also this way.
This topic has come up before. Laura Creighton made several
comments on macros, the most notable of which is:

lac:
] Writing your own Lisp Macro System is better than sex. I
] _know_ -- 18 year old me turned down _lots_ of opportunities
] for sex to go hack on her macro system. Thus if we introduce
] this to the language, I think that it is _inevitable_ that we will
] fragment the Python community into a plethora of mutually
] unintelligble dialects. I don't want this. Thus I don't want a
] macro facility in the language _because_ it would be so cool.
I just don't find that argument compelling. By that logic we should
write the most restrictive language possible on the most restrictive
platform possible (ie VB on Windows) because allowing choice is
clearly a bad thing.

Don't introduce a feature because it would be so cool that everyone
would use it? That's just plain weird.

The Python Core language would still always be controlled by Guido,
but I don't see the problem with a community of people writing cool
macros for python.

Linux is based on this concept of allowing people to extend the
system, it doesn't seem to have suffered from it.

That doesn't mean it *shouldn't* be available [in Python].
Python is Open Source, how would someone writing a
Macro lock you in? Just don't use the macro.


Another writing from Laura seems relevant:
http://mail.python.org/pipermail/pyt...ay/042102.html

My interpretation - I don't customize my apps, nor even
my .cshrc (except for one alias (alias ls 'ls -l \!* | grep ^d')
an 'unset noclobber', 'set ignoreeof', and the PATH and
LD_LIBRARY_PATH - and I wish I didn't need those)
I don't, because I don't like to think. At least not spend my
time puzzling out slight changes. I like my changes either
none or a lot, that is, use Python as-is or write a converter
(or use another language).


Same argument as above, I don't agree with this logic. Python is a
great language; that doesn't mean it couldn't be better, though. If
that were the case, development would cease.

Why do we allow people to write functions even, I mean you have to
learn the syntax for calling them, what the variables are and what
they do. Bah, we should make everyone use only built in functions, if
they want a different one, use a different language. What? It makes
no sense to me.
Just like anything else, Macros can be overused and abused. However
I maintain that if you don't see the usefulness of macros, you don't
really understand them.


That's not the argument against them. It's that they are too useful,
each person makes their own dialect, the community breaks down
as the different branches do their own thing, and one person's so-
called "Python" code looks different than another's.

So don't allow people to customize the system huh? Then why is Python
Open Source? That's the *entire* point of Open Source, so that people
can tweak and customize to their own environment. Do you have any
specific examples that are comparable where customization broke a
community down? This sounds like baseless hypothetical speculation to
me.
I know I am nowhere near as good a language designer as Guido,
Larry Wall, Matz, and the others, though I think I'm pretty decent.
I don't have the essential hubris to say that I know better how
to tweak Python-the-language to fit my own domain.

You are saying you don't know how to tweak a language to fit your
specific domain better than a general puprose language? And you are
saying you are a pretty good language designer? If you don't know
your specific domain well enough to adapt a general purpose language
to it better than it is already written there are several
possibilities:
1) You don't know your domain that well
2) You work in a very general purpose domain
3) You aren't a very good language designer

Designing a good language is all about designing the right high level
abstractions. Even a medium skilled designer should be able to design
a language that maps better to their specific domain than a general
purpose domain (actually implementing is of course a vastly different
story). The whole point of Macros, though, is to allow you to leverage
the facilities the language provides while at the same time abstracting
the common idioms.
Essentially using Python over Machine
language is just using one big ass macro language.


You confuse two meanings of the word 'macro' here.
Any assembly language worth its salt has "macros", which
are pre-assembled sets of code. Use the macro and it
generates the code. But you can't use those macros to
rewrite the actual language like you can with hygienic
macros. It doesn't have the proper tail-biting recursive nature.


I am not talking about Assembly Macros. I was comparing hygienic
macros to the ability to make useful high level abstractions. Python
is an abstraction of Machine Language whereas Macros would allow you
to abstract Python.

You are in essence saying that Python is perfect, that no one could
make a more useful abstraction than it already has, and that saying
that one could do so is hubristic. I reject your argument and your
logic as specious. I think what makes Python so useful is the high
level abstractions it offers. The fact that it lets me do things
that *I* know are right for my domain. That it doesn't make the
assumption that Guido knows best for my domain (because I guarantee
you I know my domain better than Guido does). Python doesn't treat me
like the idiot programmer who can't be given a powerful tool because
it might hurt me. Ultimately this is the basis of Java / C# / Visual
Basic. Don't give the programmer room, he might hurt himself, or
abuse something. That paradigm is filled, there are many languages
that restrict programmers because they might misuse a feature, or they
are just too dumb to get it right. I say fine, leave the languages
like Java / C# / VB to those people, but let's make Python a language
that allows people the room to do it the way it needs to be done, not
so much the way Guido or whoever thinks it should be done.

Just my 2p

Doug Tolton
(format t "~a@~a~a.~a" "dtolton" "ya" "hoo" "com")
Jul 18 '05 #122
me***@gnosis.cx (David Mertz) writes:
Alexander Schmolck <a.********@gmx.net> wrote previously:
|Anyway, I still don't see a compelling reason why class statements
|couldn't/shouldn't be mutating, rather than rebinding. Is there one?

I don't think there is any reason why the statement 'class' COULD NOT do
what you describe. But doing so seems much less Pythonic to me.

In Python, there are a bunch of statements or patterns that ALWAYS and
ONLY binds names. Having something that sometimes binds names, and
other times mutates objects... and you cannot tell the difference
without finding something that may be far away in the code.... well,
that's not how Pythonistas like to think about code.


I agree this argument has merit but I'm not sure I find it completely
compelling.

What's worse, looking at the top of a file and thinking "Ah so that's what
class Foo does" and

a) overlooking that somewhere below a 2nd class Foo gets created (and one
assumes both with instances that insidiously claim to be 'Foos').
b) overlooking that somewhere below a 2nd class Foo gets mutated (and thus all
prexisting instances of it)

?

I think we'd agree that both options are generally undesirable (so one could
conceivably even make class statements with the same name in module scope raise
an exception and have the user jump through some hoops if he *really* wants
something along those lines, like having to define Foo #2 in a function and
have it returned).

But at least for b) I can think of one use-case I find rather compelling,
namely interactive development. I'd also think that the nasty surprise in a)
might well be worse than in b), because at least all instances that claim to
be Foos will behave the same.

As I said, I'm not hung up on this since admittedly b) isn't necessary for
convenient interactive development, which is what I'm really keen on. It just
means that one could paste/send code directly to the shell to achieve the
desired effect, but as long as there is *one general way* to mutate classes,
the IDE/editor etc. could just be made to do the extra work of silently
inserting the required python code for class mutation before sending to the
shell.
'as

Jul 18 '05 #123
On Tue, 19 Aug 2003 13:43:11 +0200, Alex Martelli <al*****@yahoo.com>
wrote:
Doug Tolton wrote:
...
abstractions and better programmer productivity. Why not add Macros,
allowing those of us who know how to use them and like them to use
them. If you hate macros, or you think they are too slow, just don't
use them.
"No programmer is an island"! It's not sufficient that *I* do not use
macros -- I must also somehow ensure that none of my colleagues does,
none of my clients which I consult for, nobody who posts code asking
for my help to c.l.py or python-help -- how can I possibly ensure all
of this except by ensuring that macros ARE NOT IN PYTHON?! "Slow" has
nothing to do with it: I just don't want to find myself trying to debug
or provide help on a language which I cannot possibly know because it
depends on what macros somebody's defined somewhere out of sight. Not
to mention that I'd have to let somebody else write the second edition
of the Nutshell -- if Python had macros, I would have to cover them in
"Python in a Nutshell".


Sadly, I would say that is something that would sway me. I love my
Python in a Nutshell book. I have that baby on my desk and I refer to
it *daily*. Although I'd say that's more extortion than a winning
argument!! :-p

I just don't see Macros as the huge threat you do. As someone
mentioned in a previous post, the option would require one to declare
explicitly that you are using Macros. Why is that such a *huge*
issue? Why is it that the very *existence* of a Macro in a program
would offend your sensibilities?
They don't change the underlying language, they just add a
more useful abstraction capability into the language.
They may not change the "underlying" language but they sure allow anybody
to change the language that is actually IN USE. That is definitely NOT
what I want in a language for writing production-level applications.


I tend to disagree on this point. I like the idea of being able to
treat commonly reused idioms as if they are a part of the language.

I dearly hope that, if and when somebody gives in to the urge of adding
macros to Python, they will have the decency to FORK it, and use an
easily distinguishable name for the forked Python-with-macros language,
say "Monty". This way I can keep editing future editions of "Python in
a Nutshell" and let somebody else write "Monty in a Nutshell" without
any qualms -- and I can keep programming applications in Python, helping
people with Python, consulting about Python, and let somebody else worry
about the "useful abstraction" fragmentation of "Monty". If that "useful
abstraction" enters the Python mainstream instead, I guess the forking
can only be the last-ditch refuge for those of us (often ones who've seen
powerful macros work in practice to fragment language communities and
end up with "every programmer a different language"... do NOT assume that
fear and loathing for powerful macro systems comes from LACK of experience
with them, see also the Laura Creighton posts whose URLs have already
been posted on this thread...) who'd much rather NOT have them. But maybe
moving over to Ruby might be less painful than such a fork (assuming THAT
language can forever stay free of powerful macro systems, of course).
That certainly is one way for it to happen. I must say I certainly am
surprised at your vehemence. I don't think the natural state of human
beings is singular in purpose or design. I think the natural state of
humans is to fracture into competing camps / factions. *Every* human
system of any size has factions of some sort or another. I think the
power of Unix / Linux in general has been in working to allow these
factions to Co-exists peacefully. By trying to prevent factioning
within the community, I think you will ultimately only be successful
in driving people with different viewpoints out of the community.
I have nothing against macros *IN GENERAL*. I just don't want them *in
my general-purpose language of choice for the purpose of application
programming*: they add NOWHERE NEAR ENOUGH PRODUCTIVITY, in application
programming, to even START making up for the risks of "divergence" of
dialects between individuals, groups, and firms. If I was focused on
some other field than application programming, such as experimental
explorations, tinkering, framework-writing, etc, I might well feel quite
otherwise. But application programming is where the big, gaping hole
of demand in this world is -- it's the need Python is most perfectly
suited to fulfil -- and I think it's the strength it should keep focus
and investments on.
Again, I disagree. It appears to me as though you've had some near
death experience with Macros that left a sour taste in your mouth.
Could you elaborate some on what your experience has been that turned
you so definitively sour on Macros?
Alex


Doug Tolton
(format t "~a@~a~a.~a" "dtolton" "ya" "hoo" "com")
Jul 18 '05 #124
In article <s4********************************@4ax.com>,
Doug Tolton <dt*****@yahoo.com> wrote:

I just don't find that argument compelling. By that logic we should
write the most restrictive language possible on the most restrictive
platform possible (ie VB on Windows) because allowing choice is
clearly a bad thing.
It's all about maintaining balance. After all, Python forces you to
format your code in a specific way rather than allowing the freedom of
C/C++ or Perl.
Don't introduce a feature because it would be so cool that everyone
would use it? That's just plain weird.

The Python Core language would still always be controlled by Guido,
but I don't see the problem with a community of people writing cool
macro's for python.


Guido's mantra is readability. If you can come up with a concrete
suggestion for a macro system that won't affect Python's readability,
please propose a PEP. Otherwise, there's not much use arguing about it,
because people won't pay attention.
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

This is Python. We don't care much about theory, except where it intersects
with useful practice. --Aahz
Jul 18 '05 #125
Tim Rowe <tim@remove_if_not_spam.digitig.co.uk> writes:
On 18 Aug 2003 16:38:42 -0400, aa**@pythoncraft.com (Aahz) wrote:
Oh, come on, Brandon is *much* less of a troll than T*m*t*y R*e.


Dangerously close to my name -- I hope it's not a typo and meant to
/be/ my name!


Rest assured, *definitely* no relation. :-)
John
Jul 18 '05 #126
Doug Tolton:
It depends what you are talking about. If you are talking about
making some large cross industry library I might be inclined to agree,
but when it comes to building good high level abstractions within a
company, this argument doesn't make sense. Any feature has to be used
in the proper context for it to be useful, Macros are also this way.
As a consultant, I don't have the luxury of staying inside a singular
code base. By your logic, I would need to learn each different
high level abstraction done at my clients' sites. And given the usual
software engineering experience a chemist or biologist has, those
are unlikely to be good.
I just don't find that argument compelling. By that logic we should
write the most restrictive language possible on the most restrictive
platform possible (ie VB on Windows) because allowing choice is
clearly a bad thing.
The inference is that programming language abstractions should
not be more attractive than sex. Classes, functions, and modules
are not. Arguing as you do is an all-or-nothing approach which
overly polarizes the discussion.
Linux is based on this concept of allowing people to extend the
system, it doesn't seem to have suffered from it.
I don't use Linus's kernel. The machine I have with Linux on it
runs a modified version distributed by a company and with all
the other parts needed to make a useful environment for my
work. And I loathe the times I need to recompile the kernel,
even though I actually have done kernel mods on Minix in OS
class back in school.

In a similar vein, the different distributions once upon a time were
very divergent on where files were placed, how startup scripts
worked, which libraries were included, and how things were
configured in general (eg, which libc to use?). If I wanted to
distribute precompiled binaries, I was in a bind because I
would need to ship all the variations, even though it's just for
"Linux".

There's more consensus now, but it took a lot of time.

In short, my comment is that Linux does allow the diversity,
it did cause problems, and people now decide that that
diversity isn't worth it, at least for most uses. For me as an
applications developer, that diversity just makes my life more
complicated.
Same argument as above, I don't agree with this logic. Python is a
great language, that doesn't mean it couldn't be better though. If
that were the case, development would cease.
What if Python had a marker so people could tell the intepreter that
the next few lines are Lisp code, or Perl, or Tcl, or any other language.
Would the result be more flexible? Yes? More powerful? Yes.
Better? I think not.
Why do we allow people to write functions even, I mean you have to
learn the syntax for calling them, what the variables are and what
they do. Bah, we should make everyone use only built in functions, if
they want a different one, use a different language. What? It makes
no sense to me.
Your argument of an extreme has no weight because it's different
than what I'm saying.

Extra power and flexibility can have bad effects, not just on the
language but on the community built around the language. Software
development is rarely a singleton affair, so a good language should
also optimize the ability for different people to use each others'
libraries.

Functions and modules and objects, based on experience, promote
code sharing. Macros, with their implicit encouragement of domain
specific dialect creation, do not.
So don't allow people to customize the system huh? Then why is Python
Open Source? That's the *entire* point of Open Source, so that people
can tweak and customize to their own environment.
Err, no. I use open source because it's cheap, because the tools
are of good quality, because if something breaks I can track down
the problem, and as a risk management bonus, if the project ever
dies, I can still maintain things on my own.

I never, ever, ever, want to get into the case where I'm maintaining
my own private, modified version of Python.
Do you have any
specific examples that are comparable where customization broke a
community down? This sounds like baseless hypothetical speculation to
me.
Lisp.

A language which allows very smart people the flexibility to
customize the language, means there will be many different flavors,
which don't all taste well together.

A few years ago I tested out a Lisp library. It didn't work
on the Lisp system I had handy, because the package system
was different. There was a comment in the code which said
"change this if you are using XYZ Lisp", which I did, but
that's a barrier to use if I ever saw one.
You are saying you don't know how to tweak a language to fit your
specific domain better than a general puprose language? And you are
saying you are a pretty good language designer? If you don't know
your specific domain well enough to adapt a general purpose language
to it better than it is already written there are several
possibilities:
1) You don't know your domain that well
2) You work in a very general purpose domain
3) You aren't a very good language designer
4) a small change in a language to better fit my needs has
subtle and far-reaching consequences down the line. Instead,
when I do need a language variation, I write a new one
designed for that domain, and not tweak Python.
Designing a good language is all about designing the right high level
abstractions. Even a medium skilled designer should be able to design
a language that maps better to their specific domain than a general
purpose domain (actually implementing is of course a vastly different
story).
But if Python is HERE ............................... and my domain is HERE
I'm not going to try to force them together.
You are in essence saying that Python is perfect, that no one could
make a more useful abstraction than it already has, and that saying
that one could do so is hubristic.


I looked up 'hubris' just now. It's the wrong word for me to use.

http://dictionary.reference.com/search?q=hubris
hubris: Overbearing pride or presumption; arrogance

I don't mean 'overbearing', I mean perhaps 'confidence'.
'arrogance' is also the wrong word. Something without the
negative overtones.

Andrew
da***@dalkescientific.com
Jul 18 '05 #127
Alexander Schmolck wrote:
me***@gnosis.cx (David Mertz) writes:
Alexander Schmolck <a.********@gmx.net> wrote previously:
|Anyway, I still don't see a compelling reason why class statements
|couldn't/shouldn't be mutating, rather than rebinding. Is there one?

I don't think there is any reason why the statement 'class' COULD NOT do
what you describe. But doing so seems much less Pythonic to me.

In Python, there are a bunch of statements or patterns that ALWAYS and
ONLY binds names. Having something that sometimes binds names, and
other times mutates objects... and you cannot tell the difference
without finding something that may be far away in the code.... well,
that's not how Pythonistas like to think about code.
I agree this argument has merit but I'm not sure I find it completely
compelling.


The counter-arguments you present in the following do not affect Mertz's
argument in the least, and thus cannot indicate why you don't find it
completely compelling. To rephrase David's argument: simplicity suggests
that a 'class' statement should always have the same fundamental semantics.
Since it SOMETIMES needs to bind a name, therefore, letting it ALWAYS
bind a name -- rather than sometimes yes, sometimes no, depending on
context -- is the only Pythonic approach. In other words,

class X: ...

ALWAYS means the same as:

X = <some metaclass>('X', (), <some dict>)

(for appropriate values of <some metaclass> and <some dict>) -- no ifs,
no buts. Can't be simpler. In particular, it's a LOT simpler than the
alternative semantics of

if <name X already defined in this scope>:
    ...something...
else:
    ...something totally different...

and I don't see you addressing this claim of greater simplicity in the
rest of your post. So, since you say you DON'T agree with this, could
you perhaps elaborate?
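The always-rebind semantics can be seen directly at the interpreter; a
small sketch (the names are illustrative):

```python
class Foo(object):
    def greet(self):
        return 'old'

old_instance = Foo()
old_class = Foo

class Foo(object):  # rebinds the name Foo; the first class object is untouched
    def greet(self):
        return 'new'

assert old_instance.greet() == 'old'   # still uses the first class object
assert Foo().greet() == 'new'
assert old_class is not Foo            # two distinct class objects

# Mutating the surviving first object, by contrast, does reach old instances:
old_class.greet = lambda self: 'patched'
assert old_instance.greet() == 'patched'
```

The second 'class Foo' statement is exactly a fresh call to the metaclass,
bound to the same name -- no ifs, no buts.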
Now, regarding the rest of your post:
What's worse, looking at the top of a file and thinking "Ah so that's what
class Foo does" and

a) overlooking that somewhere below a 2nd class Foo gets created (and one
assumes both with instances that insidously claim to be 'Foos').
b) overlooking that somewhere below a 2nd class Foo gets mutated (and thus
all preexisting instances of it)

?

I think we'd agree that both options are generally undesirable (so one
It's definitely undesirable to totally overlook either a rebinding of
a name, OR a mutation of a mutable object -- it makes no difference if
you're overlooking these crucial things for a class, a function, or what.
So, if you believe such distractions are frequent:
could conceivably even make class statments with the same name in module
scope raise an exception and have the user jump through some hoops if he
*really* wants something along those lines, like having to define Foo #2
in a function and have it returned).
....you're basically making a case for functional programming (perhaps of
the "single-assignment" variety): no object ever mutates, and each name
is bound ONCE only -- no rebinding. Undoubtedly such languages do in
fact eliminate the mistake of somebody thinking that...:

x = [1, 2, 3]
...much code overlooked...
if len(x) > 3:
    <code the guy thinks will never execute>

in that either a rebinding of name x, such as
x = [6, 7, 4, 3]
or a mutation of the object, such as
x.append(23)
could in fact cause the code guarded by that 'if' to execute.
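Both routes to firing that guard can be shown concretely; a minimal sketch:

```python
x = [1, 2, 3]
original = x            # keep a second name on the same list object

x.append(23)            # mutation: same object, now longer
assert x is original and len(original) == 4

x = [6, 7, 4, 3]        # rebinding: the name now points at a new object
assert x is not original
assert len(x) > 3       # either route makes the guarded code run
```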

On the other hand, many people believe that functional programming is
hard -- that mutating data and rebinding names leads to a much easier
and free-flowing "style". Python (like most mainstream languages)
definitely accepts the "many people"'s opinion, and does allow mutation
of mutable objects (such as lists, classes, &c) and rebinding of names.
So does Ruby, etc, etc. It WOULD seriously and deleteriously impact
the concept of *FIRST-CLASSNESS OF ALL OBJECTS*, of course, if it was
any harder to rebind a name that happens to be bound to a class object
at some point, that one which happens instead to be bound to a list,
a function, or what-have-you; *THAT* wishy-washy choice would be a
terrible language design, while both Python's current design AND that
of (e.g.) Erlang are quite self-consistent even though opposites.

But this has nothing to do with any presumed desirability of statements
that do quite different things depending on what names happen to
be bound at the time said statements execute, of course.

But at least for b) I can think of one use-case I find rather compelling,
namely interactive development. I'd also think that the nasty surprise in
a) might well be worse than in b), because at least all instances that
claim to be Foos will behave the same.


Not really, unless you want to remove the limitation about "same scope"
too -- and get into a REAL mess. Each Foo can STILL refer to a
totally different object, just in different scopes, e.g.:

def aFoo(x):
    if x%2:
        class Foo:
            def __str__(self): return 'odd!'
    else:
        class Foo:
            def __str__(self): return 'even!'
    return Foo()

someFoos = [aFoo(x) for x in range(8)]
for foo in someFoos:
    print foo.__class__, foo

There -- how would you complicate the semantics of the class statement
in order to ensure the property you assert holds, namely "all instances
that claim to be Foos will behave the same"?! No, unless you redesign
and complicate the whole language to VASTLY deeper extent there is just
no way to ensure THAT. And as you go on to admit, that "one use-case"
is anything BUT "compelling" anyway -- modifying existing class objects
is quite sufficient to support an interactive environment that works
the way you'd like, if anybody's interested in doing so.
Alex

Jul 18 '05 #128
Doug Tolton wrote:
...
Linux is based on this concept of allowing people to extend the
system, it doesn't seem to have suffered from it.
Linus Torvalds sits at the center and rejects a LOT of proposed
modifications to the kernel -- everybody's free to distribute such
patches separately, but they're NOT part of Linux. You can do
exactly the same with Python: send patches to Guido, have him
reject them, distribute them separately with no claim they're part
of Python (e.g., Stackless is in exactly this situation).

Powerful macros in Python would *BYPASS* the crucial filtering
role of the chief architect -- Linus or Guido in these cases. And
here's where the "attractive nuisance" side of powerful macros,
the allure that would make oodles of youngsters flock to them,
amplifies their danger: as it's much easier -- and satisfying to
many -- to "play the amateur language designer" by coding macros,
rather than (e.g.) to write device drivers for Linux, the flood
would quite likely be huge.

Same argument as above, I don't agree with this logic. Python is a
great language, that doesn't mean it couldn't be better though. If
that were the case, development would cease.
But WHO will make it better: Guido, whose skill as a language
designer is proven, or a hundred authors of sets of macros? It
is just too easy to "play language designer" -- far more people
will do it than are actually GOOD at language design.

Why do we allow people to write functions even, I mean you have to
Because "once and only once" is the mantra of programming: in a
language that lacks user-written functions or procedures, application
programmers have to copy-and-paste code, a horrid way to program.
learn the syntax for calling them, what the variables are and what
they do. Bah, we should make everyone use only built in functions, if
they want a different one, use a different language. What? It makes
no sense to me.
It doesn't, because the number of different SYNTAX FORMS needed for
powerful expression is incredibly tiny (so that even Python, which is
a small language, can easily afford redundance there!) while the
number of different COMPUTATIONS (functions and procedures) needed for
even a small application exceeds the numbers that can be reasonably
provided as built-ins. In other words, it makes no sense because you
are comparing, not even apples and oranges, but rather cantaloupes and
termites -- completely different things.

So don't allow people to customize the system huh? Then why is Python
Open Source? That's the *entire* point of Open Source, so that people
can tweak and customize to their own environment. Do you have any
People can "tweak and customize" "their own environment" in Windows,
too (ever seen TweakUI and friends?!), so if your point was well taken
open-source would not exist. Since it does, it proves your point is
deeply mistaken.

Designing a good language is all about designing the right high level
abstractions. Even a medium skilled designer should be able to design
a language that maps better to their specific domain than a general
I entirely, utterly, totally and completely disagree with this point.

This is like saying that even a medium skilled musician should be
able to write music that plays better to their specific audience than
great music written by a genius who's never personally met any of
the people in the audience: it's just completely false. I want to
use a language designed by a genius, and I want to listen to music
written by Bach, Haendel, Mozart, and the like.

Moreover, judging by the way BY FAR most languages around are in
fact designed, it's abundantly clear that "medium skilled language
designers" are a VERY scarce breed indeed. And yet with powerful
macros everybody and their cousin WILL play the amateur language
designer. No thanks. If you want Dylan, Common Lisp, or Scheme,
you know where to find them. Please leave *ONE* language alone,
with the integrity and conceptual beauty AND usefulness that can
only come from having *ONE* designer -- a genius-level one --
firmly at the helm.

abuse something. That paradigm is filled, there are many languages
that restrict programmers because they might misuse a feature, or they
are just too dumb to get it right. I say fine, leave the languages
like Java / C# / VB to those people, but let's make Python a language
that allows people the room to do it the way it needs to be done, not
so much the way Guido or whoever thinks it should be done.


Let's leave Python as *ONE* language, WITHIN which everything does
work as you say -- not a *MYRIAD* subtly incompatible languages, each
partly designed by a different guys, mostly mediocre at language
design. Just as many languages are overly restrictive, so many
others are overly permissive (see the above mentioned examples)
thanks to powerful macro systems. PLEASE leave Python alone at the
SWEET SPOT, at JUST THE RIGHT COMPROMISE -- neither too permissive
nor too restrictive. GvR's genius (and/or luck) made it that way;
don't branch the language into a zillion mediocre ones.
Alex

Jul 18 '05 #129
In article <87************@pobox.com>, John J. Lee <jj*@pobox.com> wrote:

Rest assured, *definitely* no relation. :-)


No relation to what?
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

Let's see if anyone can identify the source of this joke....
Jul 18 '05 #130
John J. Lee wrote:
Nick Vargish <na*******@bandersnatch.org> writes:
jj*@pobox.com (John J. Lee) writes:
> read everything slowly, chew the cud, *then* start writing. I read
> the whole of Stroustrup and a couple of other books before writing a


I couldn't get past the first chapter of Stroustrup... Maybe because
I'd already been working with C++, it read like a pile of
justifications for what I consider a bastardization of an elegant
foundation. (So, I like C but dislike C++, what can I say?)


I agree. In C++, where you see a book section heading like "Feature
X, Feature Y and Feature Z", it means that X, Y and Z have some
horrendous interactions that you have to commit to memory. In Python,
it means that X, Y and Z are so simple they all fit in one section ;-)

I do wonder if the tight constraint on C++ of being C+extra bits was
ever really justified.


I think it was: it allowed C++ to enter into MANY places that just
wouldn't have given it a thought otherwise, and to popularize OO
in this -- albeit indirect -- way.
Alex

Jul 18 '05 #131
Doug Tolton:
explicitly that you are using Macros. Why is that such a *huge*
issue? Why is it that the very *existence* of a Macro in a program
would offend your sensibilities?
Since you reject the reasons both Alex and I give against macros,
it's kinda hard to come up with another one which you will accept.
I tend to disagree on this point. I like the idea of being able to
treat commonly reused idioms as if they are a part of the language.
A basic idea of Python is that Python is not enough for everything.
I need to write some extensions for Python in C.

If you accept that Python is not a language unto itself, then you
open yourself to the idea that if you want a new idiom you could
add it to your implementation of Python. If you really, really think
that list.sort() should return the sorted list, then it's an easy change
to the C code. If you want a if-the-else expression, then you can
get the patch for just that from sourceforge and apply it to your
copy.

C, then, is CPython's macro language.

Why not take that approach? If your idiom is that commonly
used, won't it be worth your while?
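For the list.sort() example specifically, the idiom can of course also be
had at the Python level without touching the C; a minimal sketch (the
function name is illustrative):

```python
# A copying sort that returns its result, built on list.sort() itself.
def sort(seq):
    copy = list(seq)
    copy.sort()
    return copy

data = [3, 1, 2]
assert sort(data) == [1, 2, 3]
assert data == [3, 1, 2]   # the original is left alone
```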

I don't think the natural state of human
beings is singular in purpose or design. I think the natural state of
humans is to fracture into competing camps / factions. *Every* human
system of any size has factions of some sort or another. I think the
power of Unix / Linux in general has been in working to allow these
factions to Co-exists peacefully. By trying to prevent factioning
within the community, I think you will ultimately only be successful
in driving people with different viewpoints out of the community.
The 'natural' size of those groups is a tribe, with about 30-50 people.
Humans lived that way for a long, long time. As I understand it, the
hunter/gatherer cultures were pretty violent. If you aren't a relative
you're the enemy.

That changed, and we developed ways to live together, though we
are no longer free to kill people who diss us. Restrictions have proved
helpful.

But computers are different in some respect. There is great freedom,
because there are few negative consequences. (Negative for life or
death, that is.) People are free to band together in small groups,
forgetting that by doing so they ignore certain advantages of scale.

Unix was definitely one of those, despite your assertion. Or have
you forgotten the Unix wars of the 80s? Linux as well, unless you've
forgotten the huge number of Linux distribution companies in the mid-
90s? (I remember the confusion of having to select from Slackware
vs. RedHat vs. ...)

Not having to choose, and depending on certain assumptions, lets
you focus on new things instead of having to think about the basics
all the time.

Of course it's wrong to ignore the basics. But computers are
wonderful (just like math) precisely because it is still possible for
a few people or even one to explore new or forgotten paths.
If you want macros, there's Lisp or Scheme or Dylan. If you want
other idioms, you could add them yourself. And if they are
useful enough, others will (slowly, all too slowly) use them.

Personally, I am rather annoyed that there hasn't been the next
great language. Back in the 80s and 90s I learned quite a few
languages: BASIC -> Pascal -> C -> "unix" (shell&awk&...) ->
Tcl -> Perl -> Python. Each one seemed more and more
powerful and flexible and usable. But I've not seen any new
language since 1995 (when I looked at Python) to tweak my
interest.

I will grant that Lisp or Scheme is the end-all and be-all of
languages. Where's the language between those and Python?
Again, I disagree. It appears to me as though you've had some near
death experience with Macros that left a sore taste in your mouth.
Could you elaborate some on what your experience has been that turned
you so definitively sour on Macros?


Or rather, it appears that Alex has observed problems with macros
(possibly in his projects, possibly from learning about other's
experiences) and decided that the advantages are not worth it.
That doesn't mean he himself needed a near-death experience.

It is possible to have a well-formed opinion that grabbing a live
power line which fell during a storm is a bad idea without having
done it before. That's another one of those things that let us have
a multi-billion population civilization ;)

Andrew
da***@dalkescientific.com

Jul 18 '05 #132
Doug Tolton wrote:
On Tue, 19 Aug 2003 13:43:11 +0200, Alex Martelli <al*****@yahoo.com>
wrote:
Doug Tolton wrote:
...
abstractions and better programmer productivity. Why not add Macro's
allowing those of us who know how to use them and like them to use
them. If you hate macros, or you think they are too slow, just don't
use them.


"No programmer is an island"! It's not sufficient that *I* do not use
macros -- I must also somehow ensure that none of my colleagues does,
none of my clients which I consult for, nobody who posts code asking
for my help to c.l.py or python-help -- how can I possibly ensure all
of this except by ensuring that macros ARE NOT IN PYTHON?! "Slow" has
nothing to do with it: I just don't want to find myself trying to debug
or provide help on a language which I cannot possibly know because it
depends on what macros somebody's defined somewhere out of sight. Not
to mention that I'd have to let somebody else write the second edition
of the Nutshell -- if Python had macros, I would have to cover them in
"Python in a Nutshell".


Sadly, I would say that is something that would sway me. I love my
Python in a Nutshell book. I have that baby on my desk and I refer to
it *daily*. Although I'd say that's more extortion than a winning
argument!! :-p

I just don't see Macros as the huge threat you do. As someone
mentioned in a previous post, the option would require to do declare
explicitly that you are using Macros. Why is that such a *huge*
issue? Why is it that the very *existence* of a Macro in a program
would offend your sensibilities?


I'm curious. Why do you feel such a need for macros? With metaclasses,
etc., etc., what significant advantage would macros buy you? Do you have
any examples where you think you could make a significantly crisper and
easier to read and understand program with macros than without.
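For concreteness, one small example of the language-bending metaclasses
already allow without macros -- automatic registration of every subclass
(all names here are illustrative):

```python
registry = {}

class Registered(type):
    """Metaclass that records every class created with it."""
    def __init__(cls, name, bases, namespace):
        super(Registered, cls).__init__(name, bases, namespace)
        registry[name] = cls

# Version-neutral spelling of a class whose metaclass is Registered:
Base = Registered('Base', (object,), {})

class Shape(Base):
    pass

class Circle(Shape):
    pass

assert registry['Shape'] is Shape and registry['Circle'] is Circle
```

Any class derived from Base inherits the metaclass, so the registration
happens with no per-class boilerplate at all.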

Chris

Jul 18 '05 #133
Doug Tolton:
You are saying you don't know how to tweak a language to fit it your
specific domain better than a general puprose language? And you are
saying you are a pretty good language designer?


I've been thinking about that.

Any language designer can design a language.

A good designer can add useful features.

A great designer knows when to keep features out of
the language.
I maintain that I am a good language designer, but not a
great one. There are enough average or mediocre designers
(law of large numbers almost guarantees a bell curve) that
I prefer ways to keep them from meddling into what I do.
Only somewhat facetiously: "Go play somewhere else
and don't bother me until you've learned something."

Facetious because anyone who knows me knows that
I enjoy explaining how things work and how they got to
be that way. It's the inability to get into another's shoes -
to understand views different from one's own - which annoys
me the most. A deficiency sadly typical of all too many
enthusiastic new language designers.

Andrew
da***@dalkescientific.com
Jul 18 '05 #134
Alex:
Similarly, you do not tend to see in the
Linux community people who are convinced with sufficiently high
intensity that case sensitivity in a filesystem is idiocy (I do believe
that, but not intensely enough to drop Linux's other advantages:-).
And fewer who accept, much less understand, Jef Raskin's views
against a hierarchical file system at all. LEAP-LEAP!
To put it another way: I _DO_ have a different viewpoint from the
majority of Python users regarding case sensitivity -- I think it's
a wart in the language.


While I agree that the file system should be case insensitive but
case preserving (making internationalization all that more fun),
I confess to being fond of the

atom = Atom()

idiom. I know it breaks down, eg, for a function which returns
a newly created class, but it's too ingrained in me.

OTOH, I just thought about it some more. It's rare that I
only make a single instance, so

atom1 = Atom()
atom2 = Atom()
new_atom = Atom()

are more common than the 'atom =' version. I'll think
about it as I write new code.

Andrew
da***@dalkescientific.com
Jul 18 '05 #135
aa**@pythoncraft.com (Aahz) writes:
In article <87************@pobox.com>, John J. Lee <jj*@pobox.com> wrote:

Rest assured, *definitely* no relation. :-)


No relation to what?


T*m*t*y R*e.

Did I miss something?
John
Jul 18 '05 #136


Chris Reedy wrote:
Doug Tolton wrote:
On Tue, 19 Aug 2003 13:43:11 +0200, Alex Martelli <al*****@yahoo.com>
wrote:
Doug Tolton wrote:
...

abstractions and better programmer productivity. Why not add Macro's
allowing those of us who know how to use them and like them to use
them. If you hate macros, or you think they are too slow, just don't
use them.


.....

I'm curious. Why do you feel such a need for macros? With metaclasses,
etc., etc., what significant advantage would macros buy you? Do you have
any examples where you think you could make a significantly crisper and
easier to read and understand program with macros than without.


This macro:

(defmacro c? (&body code)
  `(let ((cache :unbound))
     (lambda (self)
       (declare (ignorable self))
       (if (eq cache :unbound)
           (setf cache (progn ,@code))
           cache))))

Let's me write this (for some slot of an instance):

(c? (+ 10 (left self))) ;; self ala smalltalk

Instead of this:

(let ((cache :unbound))
  (lambda (self)
    (declare (ignorable self))
    (if (eq cache :unbound)
        (setf cache (+ 10 (left self)))
        cache)))

The above macro is a toy version of the real thing, which expands to this:

(make-c-dependent
 :code '((+ 10 (left self)))
 :rule (lambda (c &aux (self (c-model c)))
         (declare (ignorable self c))
         (+ 10 (left self))))

Clearly (c? (+ 10 (left self))) is more readable; all the dataflow
wiring is hidden. And the application is more maintainable should I
decide to change the implementation of my dataflow hack.
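For comparison, a rough Python counterpart of the caching wiring (names
hypothetical) -- note what is lost: the rule body must be written as an
explicit function, since Python cannot capture the code symbolically the
way the macro does:

```python
_UNBOUND = object()   # sentinel meaning "not yet computed"

def lazily(compute):
    """Wrap a one-argument rule so it runs once per wrapper, then caches."""
    cell = [_UNBOUND]
    def rule(self):
        if cell[0] is _UNBOUND:
            cell[0] = compute(self)
        return cell[0]
    return rule

class Box(object):
    left = 5

calls = []
def width_rule(self):
    calls.append(1)           # count how often the body really runs
    return 10 + self.left

width = lazily(width_rule)
b = Box()
assert width(b) == 15
assert width(b) == 15
assert len(calls) == 1        # the body ran once; the cache answered after
```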

This kinda thing is when Lispniks use macros, to silently wrap code with
infrastructure necessary to satisfy some frequently arising requirement.

Especially cool above is that I capture the code in symbolic form in a
separate slot for debugging purposes, as well as hand it to the compiler
as the body of the lambda function. Took me way too long to think of
that when I had no idea what lambda was backtracing. But as soon as I
changed the macro, every (c? ) form (and there are hundreds) was debuggable.

I think macros are no harder to learn than an API, and most Lispniks
won't stray to any language that lacks procedural macros (ie, they are
useful), so maybe it comes down to what someone else said in this
thread: Python is not trying to be everything. Fair enough. Let Python
be Python, let Lisp be Lisp.

ie, If someone wants macros, they probably would also like special
variables and closures and lexical scope and multi-methods and they may
as well get it over with and learn Lisp and stop trying to make Python more
than it wants to be.
--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #137
In article <87************@pobox.com>, John J. Lee <jj*@pobox.com> wrote:
aa**@pythoncraft.com (Aahz) writes:
In article <87************@pobox.com>, John J. Lee <jj*@pobox.com> wrote:

Rest assured, *definitely* no relation. :-)


No relation to what?


T*m*t*y R*e.
Did I miss something?


Yes, you did, but don't worry about it -- I delight in obscure jokes.
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

This is Python. We don't care much about theory, except where it intersects
with useful practice. --Aahz
Jul 18 '05 #138
Ramon Leon Fournier <mo*********@gmx.net> wrote in message news:<bh************@ID-114614.news.uni-berlin.de>...
Brandon J. Van Every <va******@3DProgrammer.com> wrote:
- Python is not a good language for low-level 3D graphics problems. C++ is
better.


Well, not just for low-level 3D graphics. There are many other things
you would not code in Python unless you are a complete fool. Your
company's mail server, for instance.


Actually, that's not a very good example - Python is *very* well
suited for many types of servers, mail servers included. The I/O heavy
nature of many servers lessens the significance of Python being slow
in terms of raw CPU speed. Lots of I/O can also mean the effects of
the GIL less of a factor on multi-CPU boxes than they otherwise would
be. Finally, given the fact that you don't see too many buffer
overruns and other similar security holes in Python, I'd sleep
*better* at night implementing my server in Python than in C++.

But I do agree with the notion that Python isn't good for *all*
problems, as does everyone else it seems. ;-)

-Dave
Jul 18 '05 #139
jj*@pobox.com (John J. Lee) schreef:
Mind you, I am the type who, when faced with a new language, tends to
read everything slowly, chew the cud, *then* start writing.


Sounds familiar, you're not the only "sick puppy"... ;-)

--
JanC

"Be strict when sending and tolerant when receiving."
RFC 1958 - Architectural Principles of the Internet - section 3.9
Jul 18 '05 #140
"Brandon J. Van Every" <va******@3DProgrammer.com> schreef:
You and I have different social theories. My social theory is, people
are very stubborn. Nobody will engage in Right behavior the minute
you tell them to. But if I killfile people, and tell them why (i.e.
"Because you are a Troll Hunter, and such people are useless."), then
someday they may wake up and figure it out. It may be 6 months from
now, it may be 2 years from now. The point is to have a cumulative
effect on people's newsgroup behavior.


You know this proverb: "change the world, start with yourself" ?

--
JanC

"Be strict when sending and tolerant when receiving."
RFC 1958 - Architectural Principles of the Internet - section 3.9
Jul 18 '05 #141
Alex Martelli <al***@aleax.it> wrote in message news:<Ex********************@news1.tin.it>...
Doug Tolton wrote:
...
Linux is based on...

(...)
... a zillion mediocre ones.
Alex

Macros, as found in Common Lisp, do not change the underlying language
at all! Common Lisp macros, when run, always expand into 100% ANSI
Common Lisp code! Using macros to become more productive is no
different from using function abstractions or class hierarchies to
become more productive. They all require that you, the programmer,
become familiar with them. Macros don't cause Common Lisp to fork
any more than function or class abstractions do. They only alter the
readability of "program code" (usually for the better), just like
function or class abstractions do.

Saying that all hell will break loose in the Python community seems
rather unfounded and a bit knee-jerk. None of what you claim would
eventually happen within Python circles is currently happening within
the Common Lisp community. After years of macro use, ANSI Common Lisp
is still the same. Macros don't bypass ANSI committees any more than
they would the Guidos of this world. On the contrary, they preclude
the need to bypass them in the first place, and all parties end up
getting what they need: on the one hand, a static base language, and
on the other, much greater expressiveness.

Speaking of expressiveness, someone asked on comp.lang.lisp.fr what
macros were good for, concretely, and what quantitative difference
they made in commercial applications (cf. "Macros: est-ce utile ?
(attn Marc)"). The responses (in French) were quite enlightening. It
boils down to using multiple macros, in multiple instances, thus
allowing one to reduce total code size (of otherwise pure CL code) by VERY
significant margins. You can think of it as reuse (as per OOP) or as
code compression.
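Since this thread's language is Python, a rough (and admittedly far weaker) analogue of such "code compression" is the decorator: boilerplate written once and applied to many functions. This is only an illustrative sketch -- the names `timed` and `total` are invented here, and decorator syntax arrived in Python 2.4, after this thread:

```python
import functools
import time

def timed(func):
    """Wrap func so every call records its duration -- the timing
    boilerplate is written once instead of inside each function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        wrapper.last_duration = time.perf_counter() - start
        return result
    return wrapper

@timed
def total(n):
    return sum(range(n))

print(total(1000))   # 499500, with the call timed transparently
```

Unlike a Lisp macro, this works only at run time and cannot introduce new syntax; it compresses repeated call-wrapping code, not arbitrary code shapes.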

Macros do not have to be used all the time or at all. There are times
when a macro should not be used, e.g. when a function would do just
fine. But they are very powerful. As Paul Graham put it, macros allow
you to program up towards the problem at hand, as opposed to adapting
the problem to fit the language specification. They allow greater
expressiveness, when you need it. They allow you to "use" many lines
of code you no longer have to write. And the lines of code you don't
have to write are also the lines of code you don't have to debug (as
it were).

Cheers.
Jul 18 '05 #142
Kenny Tilton wrote:
...
thread: Python is not trying to be everything. Fair enough. Let Python
be Python, let Lisp be Lisp.

ie, If someone wants macros, they probably would also like special
variables and closures and lexical scope and multi-methods and they may
as well get it over with and learn Lisp and stop trying make Python more
than it wants to be.


Hear, hear! Or if you just can't stand the nested-parentheses idea,
then that's what Dylan was designed for: much the same semantics
and power as Lisp, including all of the features you mention, but with
infix syntax.
Alex

Jul 18 '05 #143
Andrew Dalke wrote:
...
I confess to being fond of the

atom = Atom()

idiom. I know it breaks down, eg, for a function which returns
a newly created class, but it's too ingrained in me.


Back when I gave Eiffel a serious try, I easily slid into [the
equivalent of]:
itsAtom = Atom()
[for an instance member variable -- anAtom for a local,
theirAtom for a class-variable, etc -- Robert Martin's idea
to distinguish those lexically in languages which confuse
the scopes]. In other words, naming a basically-anonymous
"generic instance of class Atom" hardly requires a case
sensitive language, IMHO.
Alex

Jul 18 '05 #144
Andrew Dalke wrote:
...
be that way. It's the inability to get into another's shoes -
to understand views different from one's own - which annoys
me the most. A deficiency sadly typical of all too many
enthusiastic new language designers.


Just as prevalent is the wish to please EVERYone -- that's
how one gets, say, PL/I, or perl... by NOT deliberately refusing
to "get into other's shoes" and rejecting their "different views"
for purposes of inclusion into the new language. Even GvR
historically did some of that, leading to what are now his mild
regrets (lambda, map, filter, ...).
Alex

Jul 18 '05 #145
Andrew Dalke wrote:
...
(Need to use the function instead of a class with __call__
so that the method gets bound correctly. And I believe


You could use a class if its instances were descriptors, i.e.
supported __get__ appropriately -- that's basically what
functions do nowadays. See Raymond Hettinger's HOWTO
on descriptors -- a *wonderful* treatment of the subject.
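A minimal sketch of that suggestion, following the descriptor protocol the HOWTO describes (the `CachedMethod` name and the `Atom.mass` example are invented here, not Andrew's actual code): a class whose instances implement `__get__` can play the same binding role a plain function does, so the cache can live in the descriptor instead of a closure:

```python
class CachedMethod:
    """Descriptor caching the result of a no-argument method, per instance."""
    def __init__(self, func):
        self.func = func
        self.name = func.__name__
    def __get__(self, obj, objtype=None):
        if obj is None:          # accessed on the class itself
            return self
        def bound():             # plays the role of the bound method
            cache = obj.__dict__.setdefault('_cache', {})
            if self.name not in cache:
                cache[self.name] = self.func(obj)
            return cache[self.name]
        return bound

class Atom:
    computed = 0                 # counts real computations, for demonstration
    @CachedMethod
    def mass(self):
        Atom.computed += 1
        return 12.011

a = Atom()
a.mass(); a.mass()
print(Atom.computed)   # 1 -- the underlying method ran only once
```

Each attribute access builds a fresh closure, which a production version would avoid, but it shows the binding: `a.mass` triggers `__get__`, exactly as with an ordinary function-valued attribute.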
Alex

Jul 18 '05 #146
Dave Benjamin wrote:
"Alex Martelli" <al*****@yahoo.com> wrote in message
news:bh*********@enews3.newsguy.com...
Thus, the direct Python equivalent might be

import __builtins__
__builtins__.len = someotherfunction

and THAT usage is very specifically warned against e.g. in
"Python in a Nutshell" -- you CAN do it but SHOULDN'T (again, one
can hope to eventually see it "officially" deprecated...!).


I agree with your position, more or less, but I am curious - how far do
you think this deprecation should go? Just changing a function in
__builtins__? Changing a function in any module? Or changing *anything*
about a module from the outside?


Good question! I don't have any preset answers yet. We definitely do
want to impose those restrictions that have "multiple" benefits, i.e.
that both let speed increase (by allowing the compiler to 'inline' the
calls to built-in functions, once it does know they're built-ins) AND
encourage correctness -- but we don't want to cripple ordinary useful
usage, particularly not when the benefits are uncertain. Where the
line should consequently be drawn is not 100% obvious -- which is in
part why nothing about this ended up in Python 2.3, but rather it's all
scheduled for consideration in 2.4.
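In today's spelling the importable module is `__builtin__` in Python 2 and `builtins` in Python 3 (`__builtins__` itself is a CPython implementation detail), so the rebinding being warned against looks like the sketch below. It also shows why the warning exists: the change is global, visible to every module at once.

```python
import builtins

original_len = builtins.len
builtins.len = lambda seq: -1      # the discouraged global rebinding
try:
    hijacked = len("abc")          # *every* module now sees the new len
finally:
    builtins.len = original_len    # always restore the real builtin

print(hijacked, len("abc"))        # -1 3
```

A compiler that wants to inline `len` can do so safely only if such rebinding is ruled out, which is exactly the restriction under discussion for 2.4.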
Alex

Jul 18 '05 #147
Alex:
Even GvR
historically did some of that, leading to what are now his mild
regrets (lambda, map, filter, ...).


and != vs. <>

Can we have a deprecation warning for that? I've never
seen it in any code I've reviewed.

Andrew
da***@dalkescientific.com
Jul 18 '05 #148
Alex:
You could use a class if its instances where descriptors, i.e.
supported __get__ appropriately ... See Raymond Hettinger's
HOWTO on descriptors -- a *wonderful* treatment of the subject.


I'm slowly easing my way into this new stuff. I read it through
once, but it didn't all sink in. Could you post an example of how
to use it for my cache example?

Andrew
da***@dalkescientific.com
Jul 18 '05 #149
Ramon Leon Fournier wrote:
Brandon J. Van Every <va******@3DProgrammer.com> wrote:
- Python is not a good language for low-level 3D graphics problems. C++
is better.
Well, not just for low-level 3D graphics. There are many other things
you would not code in Python unless you are a complete fool. Your
company's mail server, for instance. But you wouldn't do that in C#


I deeply disagree: Twisted, coded in Python, lets you do a GREAT job
coding such applications as mail servers -- highly scalable, etc etc.
And even without Twisted, Python is splendid for such I/O bound jobs.

Granted that my motivation to use Python is different from that of many
others here. I have work to do, and to finish it as quickly as
possible. I figured that Python is a great language for people like me.
I think that's the TYPICAL motivation for using Python: it lets you do
your job with great productivity.

BTW: What exactly has "Smell the Windows" to do with the current Ruby
vs. Python debate? Are you actually serious about learning any of


I dunno: back when I checked carefully, Python was MUCH better integrated
with Windows, thanks to win32all etc etc, while Ruby perched precariously
on cygwin &c. A Windows-centric view would thus appear to favour Python
over Ruby (unless Ruby's Windows implementation has made great strides).
Alex

Jul 18 '05 #150
