Bytes | Software Development & Data Engineering Community

What's better about Ruby than Python?

What's better about Ruby than Python? I'm sure there's something. What is
it?

This is not a troll. I'm language shopping and I want people's answers. I
don't know beans about Ruby or have any preconceived ideas about it. I have
noticed, however, that every programmer I talk to who's aware of Python is
also talking about Ruby. So it seems that Ruby has the potential to compete
with and displace Python. I'm curious on what basis it might do so.

--
Cheers, www.3DProgrammer.com
Brandon Van Every Seattle, WA

20% of the world is real.
80% is gobbledygook we make up inside our own heads.

Jul 18 '05
Andrew Dalke wrote:
Olivier Drolet:
Macros, as found in Common Lisp, do not change the underlying language
at all! Common Lisp macros, when run, always expand into 100% ANSI
Common Lisp code!

I've created a new language, called Speech. It's based on the core
primitives found in the International Phonetic Alphabet. I've made some
demos using Speech. One is English and another is Xhosa. This just
goes to show how powerful Speech is because it can handle so many
domains. And it's extensible! Anything you say can be expressed in
Speech!


I believe you have unwittingly put your finger on an issue. Machine translation
of human languages has been an inescapable project for computer scientists, a
challenge that has consistently proved harder than expected. Idiomatic
machine translation of *programming* languages, by comparison, looks like a
toy problem, an appetizer. But all the endless debates in the p.l. newsgroups
certainly show one thing: we don't expect idiomatic translation between
computer languages to solve our problems. While it clearly could.

I believe our reasons for not doing it boil down to: (a) the issue of
*conserving* the *biodiversity* of programming languages not having gained
attention as the key issue it is, and (b) a lack of imagination by programmers
too engrossed in pet-language advocacy.

What I mean is that the metaphor you use turns the joke on you (or us). You
should really distinguish between the case of translating between *existing*
"sibling" languages (be they human languages or programming languages) and,
on the other hand, the case of translating between a newly bred variant of a
language and its parent language.

Isn't it the case that most objections to macros fail to survive if we set the
purpose of our "macro" facility to that of *grafting one language's surface
syntax and basic control structures onto another language's objects and
libraries*? To illustrate and set a positive criterion: doing this well enough
that we could, say, edit under (basically) the first language's Emacs syntax
mode what will really run as code in the second language.
Jul 18 '05 #151
Chris Reedy wrote:
I'm curious. Why do you feel such a need for macros? With metaclasses,
etc., etc., what significant advantage would macros buy you? Do you have
any examples where you think macros would let you write a significantly
crisper, easier-to-read, and easier-to-understand program than you could
without them?


The first thing I would do with macros in Python is build an inline facility
which allows certain functions to be expanded wherever they are called.
This would hopefully increase speed at the expense of space efficiency.
Historically, macros have been used for building Prolog-like languages,
constrained-programming languages, lazy languages, very flexible object
oriented languages, etc. on top of Lisp.

More recently, design patterns can be coded into the language via macros.
This lets you program at a higher level of abstraction than classes allow.
You can express your patterns more clearly and much more concisely than
without them.

Indeed, the C++ community, which does not have macros, uses (some would say:
misuses) its template facility to do metaprogramming. They implement
design patterns, inline numerical code, etc. The resulting template code
is ugly. Compiler messages are unreadable. Programming is painful. But
developers feel it is worth doing because the abstractions they build are
powerful and run fast.

Having said that, I totally agree with earlier posters who said that macros
in the hands of bad programmers are catastrophic. Where I don't agree is
that the language should therefore be more limited than necessary.

Matthias
Jul 18 '05 #152
Alex Martelli:
... def __get__(self, obj, cls):
... self.obj = obj
... return self.cached_call
That's the part where I still lack understanding.

class Spam:
    def f(self):
        pass
    f = CachedCall(f)

obj = Spam()
obj.f()

Under old-style Python
obj.f is the same as getattr(obj, "f")
which fails to find 'f' in the instance __dict__
so looks for 'f' in the class, and finds it
This is not a Python function, so it does not
get bound to self. It's simply returned.

obj.f() takes that object and calls it. In my original
code (not shown) I tried implementing a __call__
which did get called, but without the instance self.
Under new-style Python
obj.f is the same as getattr(obj, "f")
which fails to find 'f' in the instance __dict__ so
looks for 'f' in the class, and finds the CachedCall.

Python checks if the object implements __get__,
in which case it's called a descriptor. If so, it's
called with the 'obj' as the first parameter. The
return value of this call is used as the value for
the attribute.

Is that right?
should closely mimic your semantics, including ignoring
what I call obj and you call self in determining whether
a certain set of arguments is cached.


Why should obj make a difference? There's only
one CachedCall per method per .... Ahh, because it's
in the class def, not the instance. Adding support for
that using a weak dict is easy.
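In today's Python, that "weak dict" idea might look something like this. This
is only a sketch under my own assumptions (the class and attribute names are
made up); it gives each instance its own cache, held in a WeakKeyDictionary so
the cached results die with the instance:

```python
import weakref

class CachedCall(object):
    def __init__(self, f):
        self.f = f
        # instance -> {args: result}; entries vanish when the instance dies
        self.caches = weakref.WeakKeyDictionary()

    def __get__(self, obj, cls):
        def cached_call(*args):
            cache = self.caches.setdefault(obj, {})
            if args not in cache:
                cache[args] = self.f(obj, *args)
            return cache[args]
        return cached_call

class Spam(object):                # new-style class, as Alex recommends
    def f(self, x):
        return x * 2
    f = CachedCall(f)

obj = Spam()
print(obj.f(3))                    # computed once; later calls hit the cache
```

Each Spam instance gets an independent cache, and no cache entry can keep a
dead instance alive.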

Yeah, and my approach won't work with kwargs nor any
other unhashable element. Since I didn't know what the
Lisp code did nor how Lisp handles unhashable elements,
I decided just to implement the essential idea.

Andrew
da***@dalkescientific.com
Jul 18 '05 #153
In article <ql****************@newsread4.news.pas.earthlink.net>,
Andrew Dalke <ad****@mindspring.com> wrote:
Alex:

Even GvR
historically did some of that, leading to what are now his mild
regrets (lambda, map, filter, ...).


and != vs. <>

Can we have a deprecation warning for that? I've never
seen it in any code I've reviewed.


We will, probably 2.4 or 2.5. (Whenever 3.0 starts getting off the
ground.)
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

This is Python. We don't care much about theory, except where it intersects
with useful practice. --Aahz
Jul 18 '05 #154
Andrew Dalke wrote:
Alex Martelli:
... def __get__(self, obj, cls):
... self.obj = obj
... return self.cached_call
That's the part where I still lack understanding.

class Spam:
    def f(self):
        pass
    f = CachedCall(f)


That's an oldstyle class -- use a newstyle one for smoothest
and most reliable behavior of descriptors

obj = Spam()
obj.f()

Under old-style Python
obj.f is the same as getattr(obj, "f")
This equivalence holds today as well -- the getattr
builtin has identical semantics to direct member access.
which fails to find 'f' in the instance __dict__
so looks for 'f' in the class, and finds it
This is not a Python function, so it does not
get bound to self. It's simply returned.

obj.f() takes that object and calls it. In my original
code (not shown) I tried implementing a __call__
which did get called, but without the instance self.
Sure.
Under new-style Python
obj.f is the same as getattr(obj, "f")
Yes.
which fails to find 'f' in the instance __dict__ so
looks for 'f' in the class, and finds the CachedCall.
Sure.
Python checks if the object implements __get__,
in which case it's called a descriptor. If so, it's
Exactly.
called with the 'obj' as the first parameter. The
return value of this call is used as the value for
the attribute.

Is that right?
Yes! So what is it that you say you don't get?

should closely mimic your semantics, including ignoring
what I call obj and you call self in determining whether
a certain set of arguments is cached.


Why should obj make a difference? There's only
one CachedCall per method per .... Ahh, because it's
in the class def, not the instance. Adding support for
that using a weak dict is easy.


If obj is such that it can be used as a key into a dict
(weak or otherwise), sure. Many class instances of some
interest can't -- and if they can you may not like the
result. Consider e.g.

class Justanyclass:
    def __init__(self, x): self.x = x
    def compute(self, y): return self.x + y

pretty dangerous to cache THIS compute method -- because,
as a good instance method should!, it depends crucially
on the STATE of the specific instance you call it on.
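The danger Alex describes is easy to demonstrate. Here is a sketch (the
`cached` decorator is hypothetical, standing in for Andrew's CachedCall) of a
cache keyed only on the arguments, silently ignoring instance state:

```python
def cached(f):
    cache = {}
    def wrapper(self, *args):
        if args not in cache:      # note: 'self' is not part of the key
            cache[args] = f(self, *args)
        return cache[args]
    return wrapper

class Justanyclass:
    def __init__(self, x): self.x = x
    @cached
    def compute(self, y): return self.x + y

a = Justanyclass(1)
b = Justanyclass(100)
print(a.compute(5))    # 6
print(b.compute(5))    # 6 again -- stale; b.x was never consulted
```

The second instance gets the first instance's answer, exactly because the
method's result depends crucially on the state of the instance it is called on.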

Yeah, and my approach won't work with kwargs nor any
other unhashable element. Since I didn't know what the
Lisp code did nor how Lisp handles unhashable elements,
I decided just to implement the essential idea.


An automatically cachable method on general objects is
quite tricky. I don't think the Lisp code did anything
to deal with that trickiness, though, so you're right
that your code is equivalent. Anyway, I just wanted to
show how the descriptor concept lets you use a class,
rather than a function, when you want to -- indeed any
function now has a __get__ method, replacing (while
keeping the semantics of) the old black magic.
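Alex's closing remark can be checked directly in a modern Python session (a
small sketch; the class is made up for illustration): a plain function is
itself a descriptor, so attribute lookup on an instance just calls its
`__get__`.

```python
class Spam(object):
    def f(self):
        return "spam"

obj = Spam()
func = Spam.__dict__['f']            # the raw function, no binding yet
bound = func.__get__(obj, Spam)      # what obj.f does under the hood
print(bound())                       # 'spam'
print(bound == obj.f)                # the very same bound method
```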
Alex

Jul 18 '05 #155
"Andrew Dalke" <ad****@mindspring.com> wrote:

[defines cognitive macro called Speech]
I've created a new language, called Speech. It's based on the core
primitives found in the International Phonetic Alphabet. I've made some
demos using Speech. One is English and another is Xhosa. This just
goes to show how powerful Speech is because it can handle so many
domains. And it's extensible! Anything you say can be expressed in
Speech!
[snip lots of examples using it, trying to make macros look bad?]
In short, no one is denying that the ability to create new macros is
a powerful tool, just like no one denies that creating new words is
a powerful tool. But both require extra training and thought for
proper use, and while they are easy to write, it takes more effort
for others to understand you. If I stick to Python/English then
more people can understand me than if I mixed in a bit of Erlang/
Danish, *even* *if* the latter makes a more precise description
of the solution.
You have found a wonderful analogy; however, you seem to assume that
your prejudices are so self-explanatory that the conclusion that
macros are bad follows naturally.

I am not a native English speaker, and so my expressiveness in this
language is severely handicapped, while I consider myself a person
with good linguistic abilities.

Obviously there are people from other linguistic backgrounds
participating in discussions in this newsgroup who have the same
problems. Maybe they are better speakers than I am and yet still have
more problems using English.

However this does not in the least cause this newsgroup to deviate
substantially (or maybe it does but as a non-native speaker I can not
discern the difference) from English. Rather, we all strive to speak
the same language in order to make as many people as possible
understand what we are saying.

While using Python as a programming language we strive for Pythonicity
and for combining elegance with concision and readability. We
use docstrings to document our code and answer questions about it in
the newsgroup. Helpful people debug our code and assist in formulating
our algorithms in Python.

IMO there is a strong tendency towards unification and standardization
among the readers of this newsgroup and the need to conform and the
rewards this brings are well understood.

Seeing all this it would be a missed chance not to give the community
the freedom of redefining the language to its advantage.

Of course there are risks that the community would dissolve in
mutually incompatible factions and it would be wise to slow down the
process according to the amount of responsibility the group can be
trusted with.

The rewards would be incomparably great, however, even to the point
that I would be ready to sacrifice Python just to give this thing a
tiny chance. Suppose you could make a bet for a dollar with an
expected reward of a thousand dollars? Statistically it doesn't matter
whether you get a .999 chance of getting a thousand dollars or a
.00999 chance of getting a million dollars.

Therefore, the only thing pertinent to this question seems to be the
risk and gain assessments.
By this analogy, Guido is someone who can come up with words
that a lot of people find useful, while I am someone who can come
up with words appropriate to my specialization, while most
people come up with words which are never used by anyone
other than close friends. Like, totally tubular dude.


Another relevant meme that is running around in this newsgroup is the
assumption that some people are naturally smarter than other people.
While I can certainly see the advantage for certain people for keeping
this illusion going (it's a great way to make money, the market
doesn't pay for what it gets but for what it thinks it gets) there is
not a lot of credibility in this argument.

The "hardware" that peoples minds are running on doesn't come in
enough varieties to warrant such assumptions. For sure, computer
equipment can vary a lot, but we as people all have more or less the
same brain.

Of course there is a lot of variation between people in the way they
are educated, and some of them have come to be experts in certain
fields. However, no one is an island, and one person's thinking process
is interconnected with a lot of other people's thinking processes. The
idea that some kind of "genius" is solely responsible for all this
progress is absurd, a shameful descent into the kind of
"leadership" philosophy whose atrocities have caused many wars.

To come back to linguistic issues, there's a lot of variation in the
way people use their brain to solve linguistic problems. There are
those that first read all the prescriptions before uttering a word and
there are those that first leap and then look. It's fascinating to see
"look before you leap" being deprecated in favor of "easier to ask
forgiveness than permission" by the same people who would think twice
before starting to program without being sure they know all the syntax.

In my case, for example, studying Latin during high school, there was
a guy sitting next to me who always knew which conjugations
the Latin words were in, and as a result he managed to get high grades
with exact but uninteresting translations. My way of translating Latin
was a bit different: instead of translating word for word and looking
up each form of each separate word (is it a genitivus, ablativus
absolutus, imperativus, etcetera), I just read each word, and going from
the approximate meanings of all the words put into a sequence of sentences,
I ended up with a translation that was seventy percent correct and
that had a lot of internal consistency and elegance. It was usually
enough to get a good grade and also some appraisal: "se non è
vero, è ben trovato" or something like that.

What this all should lead to I am not really sure, but I *am* sure
that breaking out of formal mathematical, linguistic, and
programmatic rules is the only way to arrive at designs that have great
internal consistency and that can accommodate new data and
procedures.

It is sometimes impossible for a language designer to exactly pinpoint
the reasons for a certain decision, while at the same time being sure
that it is the right one.

The ability to maintain internal consistency and the tendency of other
people to fill in the gaps so that the final product seems coherent is
IMO the main reason for this strange time-travel-like ability of
making the right decisions even before all the facts are available.

Well, maybe I have made the same mistake as you, by providing arguments
contrary to my intention of advocating the emancipation of the
average Python user to the level of language designer.

However, if I have done so, rest assured that my intuition "knows",
even before all the facts are in, that this is the way to go, and that
the rewards are infinitely more appealing than the risk of breaking up
the Python community is threatening.

One way or the other this is the way programming will be in the
future, and the only question is: will Python -- and the Python
community -- be up to the task of freeing the programmer's expressiveness
while also providing a home and starting point to come back
to, or will it be left behind, as has been the fate of so many other
valiant efforts?

Anton


Jul 18 '05 #156
Anton Vredegoor wrote:

The ability to maintain internal consistency and the tendency of other
people to fill in the gaps so that the final product seems coherent is
IMO the main reason for this strange time-travel-like ability of
making the right decisions even before all the facts are available.


Wow :)

Jul 18 '05 #157
"Andrew Dalke" <ad****@mindspring.com> writes:
As a consultant, I don't have the luxury of staying inside a singular
code base. By your logic, I would need to learn each different
high level abstraction done at my clients' sites.
The alternative is to understand (and subsequently recognize) the
chunks of source code implementing a given pattern for which no
abstraction was provided (often implemented slightly differently in
different parts of the code, sometimes with bugs), each time that it
occurs.

I'd rather use multimethods that implement the visitor pattern.

I'd rather look at multimethods, than at code infested with
implementations of the visitor pattern.

(The above comments are _not_ about the visitor pattern per se.)
The inference is that programming language abstractions should not
be more attractive than sex.
Why ever not? Don't you want to put the joy back into programming :-)
Functions and modules and objects, based on experience, promote
code sharing. Macros, with their implicit encouragement of domain
specific dialect creation, do not.
I don't believe you can reasonably draw a rigid and well-defined
boundary between functions, modules and objects on one side, and
macros on the other. They all offer means of abstraction. All are open
to abuse. All can be put to good use.

In all four cases, I'd rather have the opportunity to create
abstractions, rather than not.

I find your suggestion that macros are in some way more "domain
specific" than modules, or objects or functions, bogus.
A language which allows very smart people the flexibility to
customize the language, means there will be many different flavors,
which don't all taste well together.

A few years ago I tested out a Lisp library. It didn't work
on the Lisp system I had handy, because the package system
was different. There was a comment in the code which said
"change this if you are using XYZ Lisp", which I did, but
that's a barrier to use if I ever saw one.
You are confusing the issues of

- extensibility,
- standard non conformance,
- not starting from a common base,
- languages defined by their (single) implementation.

A few days ago I tested out a C++ library. It didn't work on the C++
system I had handy because the STL implementation was
different/template support was different. etc. etc.

A few days ago I tested out a Python library. It didn't work on the
implementation I had handy because it was Jython.
4) a small change in a language to better fit my needs has
subtle and far-reaching consequences down the line. Instead,
when I do need a language variation, I write a new one
designed for that domain, and not tweak Python.
So, what you are saying is that faced with the alternatives of

a) Tweaking an existing, feature rich, mature, proven language, to
move it "closer" to your domain.

b) Implementing a new language from scratch, for use in a single
domain

you would choose the latter?

If so, you are choosing the path which pretty much guarantees that
your software will take much longer to write, and that it will be a
lot buggier.

It's an extreme form of Greenspunning.

How do you reconcile
when I do need a language variation, I write a new one designed for
that domain, and not tweak Python.
with
Functions and modules and objects, based on experience, promote
code sharing. Macros, with their implicit encouragement of domain
specific dialect creation, do not.


?

You criticize macros for not encouraging code sharing (they do, by
encouraging you to share the (vast) underlying language while reaching
out towards a specific domain), while your preferred solution seems to
be the ultimate code non-sharing, by throwing away the underlying
language, and re-doing it.
Jul 18 '05 #158
Jacek Generowicz wrote:

You criticize macros for not encouraging code sharing (they do, by
encouraging you to share the (vast) underlying language while reaching
out towards a specific domain), while your preferred solution seems to
be the ultimate code non-sharing, by throwing away the underlying
language, and re-doing it.


This criticism can't help looking frivolous, imho. You appear to be confusing
"language" with "speech". But I do believe there *must* exist a sane niche for
(perhaps mutated) macros (in some lisp-like sense).

Cheers, B.

Jul 18 '05 #159


Andrew Dalke wrote:
Kenny Tilton:
This macro:

(defmacro c? (&body code)
  `(let ((cache :unbound))
     (lambda (self)
       (declare (ignorable self))
       (if (eq cache :unbound)
           (setf cache (progn ,@code))
           cache))))

I have about no idea of what that means. Could you explain
without using syntax? My guess is that it caches function calls,
based only on the variable names. Why is a macro needed
for that?


C? does something similar to what you think, but with an order of
magnitude more power. Estimated. :) Here is how C? can be used:

(make-instance 'box
  :left (c? (+ 2 (right a)))
  :right (c? (+ 10 (left self))))

So I do not have to drop out of the work at hand to put /somewhere else/
a top-level function which also caches, and then come back to use it in
make-instance. That's a nuisance, and it scatters the semantics of this
particular box all over the source. Note, btw:

(defun test ()
  (let* ((a (make-instance 'box
              :left 0
              :right (c? (random 30))))
         (b (make-instance 'box
              :left (c? (+ 2 (right a)))
              :right (c? (+ 10 (left self))))))
    (print (list :a a (left a) (right a)))
    (print (list :b b (left b) (right b)))))

...that different instances of the same class can have different rules
for the same slot. Note also that other plumbing is necessary to make
slot access transparent:

(defun get-cell (self slotname) ;; this fn does not need duplicating
  (let ((sv (slot-value self slotname)))
    (typecase sv
      (function (funcall sv self))
      (otherwise sv))))

(defmethod right ((self box)) ;; this needs duplicating for each slot
  (get-cell self 'right))

But I just hide it all (and much more) in:

(defmodel box ()
  ((left :initarg :left :accessor left)
   (right :initarg :right :accessor right)))

...using another macro:

(defmacro defmodel (class superclasses (&rest slots))
  `(progn
     (defclass ,class ,superclasses
       ,slots)
     ,@(mapcar (lambda (slot)
                 (destructuring-bind
                     (slotname &key initarg accessor)
                     slot
                   (declare (ignore initarg))
                   `(defmethod ,accessor ((self ,class))
                      (get-cell self ',slotname))))
               slots)))
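The shape of what defmodel generates can be sketched in Python. This is a
rough, hypothetical analogue (far less general than the real Cells package;
all names are mine): a slot may hold either a plain value or a rule, and a
rule is run once on first access and its result cached, like the closure c?
builds.

```python
import random

_UNBOUND = object()

class Rule(object):
    """A deferred computation, evaluated once and cached (like c?)."""
    def __init__(self, code):
        self.code = code              # a callable taking the owning instance
        self.cache = _UNBOUND
    def value(self, owner):
        if self.cache is _UNBOUND:
            self.cache = self.code(owner)
        return self.cache

class Slot(object):
    """Plays the role of get-cell plus the per-slot defmethod."""
    def __init__(self, name):
        self.name = name
    def __get__(self, obj, cls):
        sv = obj.__dict__[self.name]
        return sv.value(obj) if isinstance(sv, Rule) else sv
    def __set__(self, obj, value):
        obj.__dict__[self.name] = value

class Box(object):                    # roughly what (defmodel box ...) expands into
    left = Slot('left')
    right = Slot('right')
    def __init__(self, left, right):
        self.left, self.right = left, right

a = Box(left=0, right=Rule(lambda self: random.randrange(30)))
b = Box(left=Rule(lambda self: 2 + a.right),
        right=Rule(lambda self: 10 + self.left))
print(a.left, a.right)                # a.right computed once, then cached
print(b.left, b.right)
```

As in Kenny's Lisp, different instances of the same class carry different
rules for the same slot, and the plumbing lives in one place.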


>>> import time
>>> def CachedCall(f):
...     cache = {}
...     def cached_call(self, *args):
...         if args in cache:
...             return cache[args]
...         x = f(self, *args)
...         cache[args] = x
...         return x
...     return cached_call
...
>>> class LongWait:
...     def compute(self, i):
...         time.sleep(i)
...         return i*2
...     compute = CachedCall(compute)
...
>>> t1=time.time();LongWait().compute(3);print time.time()-t1
6
3.01400005817
>>> t1=time.time();LongWait().compute(3);print time.time()-t1
6
0.00999999046326


Cool. But call CachedCall "memoize". :) Maybe the difference is that you
are caching a specific computation of 3, while my macro in a sense
caches the computation of arbitrary code by writing the necessary
plumbing at compile time, so I do not have to drop my train of thought
(and scatter my code all over the place).

That is where Lisp macros step up--they are just one way code is treated
as data, albeit at compile time instead of the usual runtime consideration.

--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #160
"Andrew Dalke" <ad****@mindspring.com> writes:
Ndiya kulala. -- I am going for the purpose of sleeping.

And here's an example of Swedish with a translation into
English, which lacks some of the genealogical terms

min mormor -- my maternal grandmother

I can combine those and say

Umormor uya kulala -- my maternal grandmother is going
for the purpose of sleeping.

See how much more precise that is because I can select
words from different Dialects of Speech?


You are absolutely right. "Umormor uya kulala" is less readable than
"My maternal grandmother is going for the purpose of sleeping", to
someone who is familiar with English, but unfamiliar with Xhosa and
Swedish.

Now explain the mor/far concept and the "going for a purpose"
concept to said English speaker, and present him with text in which
combinations of the concepts are used repeatedly.

_Now_ ask yourself which is more readable.

For this reason it is rarely a good idea to define a macro for a
single use. However, it becomes an excellent idea if the idea the
macro expresses must be expressed repeatedly. The same is true of
functions, classes, modules ...
Jul 18 '05 #161


Andrew Dalke wrote:
Olivier Drolet:
Macros don't cause Common Lisp to fork
anymore than function or class abstractions do.

Making new words don't cause Speech to fork any more than
making new sentences does.


Hunh? This one doesn't work, and this is the one you have to answer.

Forget argument by analogy: How is a macro different than an API or
class, which hide details and do wonderful things but still have to be
mastered. Here's an analogy <g>: I could learn Java syntax in a week,
but does that mean I can keep up with someone who has been using the
class libraries for years? Nope.

And Java doesn't even have macros.

In short, no one is denying that the ability to create new macros is
a powerful tool, just like no one denies that creating new words is
a powerful tool. But both require extra training and thought for
proper use, and while they are easy to write, it takes more effort
for others to understand you. If I stick to Python/English then
more people can understand me than if I mixed in a bit of Erlang/
Danish, *even* *if* the latter makes a more precise description
of the solution.


One of the guys working under me had no respect for readability; he just
got code to work, which was nice. I once had to work on his code. In
about half an hour, with the help of a few macros, a great honking mass
of text which completely obfuscated the action had been distilled to its
essence. One could actually read it.

Of course every once in a while you would notice something like "Ndiya",
but if you went to the macrolet at the top of the function you would
just think "oh, right" and get back to the code.

Maybe sometimes these macros could be functions, but then I'd just
call the function Ndiya.

So what is the difference?

btw, I am on your side in one regard: the LOOP macro in Lisp has
phenomenally un-Lispy syntax, so I have never used it. I am slowly
coming around to it being more useful than irritating, but I did not
like having a new /syntax/ invented. (LOOP /can/ be used with
conventional syntax, but I have seen such code only once and I think I
know why they broke the syntax. <g>). So I get that point, but normally
macros do not deviate from standard Lisp synatx.

Ndiya kulala.
--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #162
Borcis <bo****@users.ch> writes:
Jacek Generowicz wrote:
You criticize macros for not encouraging code sharing (they do, by
encouraging you to share the (vast) underlying language while reaching
out towards a specific domain), while your preferred solution seems to
be the ultimate code non-sharing, by throwing away the underlying
language, and re-doing it.
This criticism can't help looking frivolous,


Only in so far as the original thesis is frivolous.
You appear to be confusing "language" with "speech".


I'm not sure what you mean by this.

Are you saying that macros are "language" because you've heard the
buzz-phrase that "macros allow you to modify the language", while
functions, classes and modules are "speech", because no such
buzz-phrases about them abound ?

If so, then you are erecting artificial boundaries between different
abstraction mechanisms. (All IMHO, of course.)
Jul 18 '05 #163


Alex Martelli wrote:
Andrew Dalke wrote:

Alex Martelli:
... def __get__(self, obj, cls):
... self.obj = obj
... return self.cached_call
That's the part where I still lack understanding.

class Spam:
    def f(self):
        pass
    f = CachedCall(f)

That's an oldstyle class -- use a newstyle one for smoothest
and most reliable behavior of descriptors

obj = Spam()
obj.f()

Under old-style Python
obj.f is the same as getattr(obj, "f")

This equivalence holds today as well -- the getattr
builtin has identical semantics to direct member access.

which fails to find 'f' in the instance __dict__
so looks for 'f' in the class, and finds it
This is not a Python function, so it does not
get bound to self. It's simply returned.

obj.f() takes that object and calls it. In my original
code (not shown) I tried implementing a __call__
which did get called, but without the instance self.

Sure.

Under new-style Python
obj.f is the same as getattr(obj, "f")

Yes.

which fails to find 'f' in the instance __dict__ so
looks for 'f' in the class, and finds the CachedCall.

Sure.

Python checks if the object implements __get__,
in which case it's called a descriptor. If so, it's

Exactly.

called with the 'obj' as the first parameter. The
return value of this call is used as the value for
the attribute.

Is that right?

Yes! So what is it that you say you don't get?
should closely mimic your semantics, including ignoring
what I call obj and you call self in determining whether
a certain set of arguments is cached.


Why should obj make a difference? There's only
one CachedCall per method per .... Ahh, because it's
in the class def, not the instance. Adding support for
that using a weak dict is easy.

If obj is such that it can be used as a key into a dict
(weak or otherwise), sure. Many class instances of some
interest can't -- and if they can you may not like the
result. Consider e.g.

class Justanyclass:
    def __init__(self, x): self.x = x
    def compute(self, y): return self.x + y

pretty dangerous to cache THIS compute method -- because,
as a good instance method should!, it depends crucially
on the STATE of the specific instance you call it on.
Yeah, and my approach won't work with kwargs nor any
other unhashable element. Since I didn't know what the
Lisp code did nor how Lisp handles unhashable elements,
I decided just to implement the essential idea.

An automatically cachable method on general objects is
quite tricky.


Lisp hashtables can key off any Lisp datum, but...
I don't think the Lisp code did anything
to deal with that trickiness,


No, it did not. That snippet was from a toy (and incomplete)
implementation of my more elaborate Cells package. The toy was developed
over the keyboard during a talk I gave on Sunday. The next step would
have been to determine when the closure had best re-execute the code
body to see if the world had changed in interesting ways, but that is a
big step and requires dependency tracking between cells.

Once a Cell is told an input has changed, it re-runs its body to see if
it comes up with a different result, in which case it caches that and
tells other dependents to rethink their caches.

So what is shown in my example is fun but half-baked. I was just trying
to show how a macro could hide the plumbing of an interesting mechanism
so the reader can focus on the essence.


--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #164
Anton Vredegoor wrote:
...
tiny chance. Suppose you could make a bet for a dollar with an
expected reward of a thousand dollars? Statistically it doesn't matter
whether you get a .999 chance of getting a thousand dollars or a
.00999 chance of getting a million dollars.
This assertion is false and absurd. "Statistically", of course,
expected-value is NOT the ONLY thing about any experiment. And
obviously the utility of different sums need not be linear -- it
depends on the individual's target-function, typically influenced
by other non-random sources of income or wealth.

Case 1: with whatever sum you win you must buy food &c for a
month; if you have no money you die. The "million dollars chance"
sees you dead 99.001 times out of 100, which to most individuals
means huge negative utility; the "thousand dollars chance" gives
you a 99.9% chance of surviving. Rational individuals in this
situation would always choose the 1000-dollars chance unless the
utility to them of the unlikely million was incredibly huge (which
generally means there is some goal enormously dear to their heart
which they could only possibly achieve with that million).

Case 2: the sum you win is in addition to your steady income of
100,000 $/month. Then, it may well be that $1000 is peanuts of
no discernible use to you, while a cool million would let you
take 6 months' vacation with no lifestyle reduction and thus has
good utility to you. In this case a rational individual would
prefer the million-dollars chance.
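The two cases can be made concrete with a few lines of arithmetic (a sketch using the thread's own numbers; "utility" in case 1 is simply the probability of winning anything at all):

```python
p_safe, v_safe = 0.999, 1_000          # thousand-dollar bet
p_risky, v_risky = 0.00999, 1_000_000  # million-dollar bet

# The raw expected values are not even close to equal:
ev_safe = p_safe * v_safe    # about 999
ev_risky = p_risky * v_risky # about 9990

# Case 1: you need *some* money to survive the month, so utility is
# just the probability of winning at all -- take the safe bet:
assert p_safe > p_risky

# Case 2: $1000 is peanuts next to your income; only the big payout
# matters, so utility tracks the risky bet's expected value instead:
assert ev_risky > ev_safe
```

The choice flips depending on the utility function, which is exactly the point: expected value alone decides nothing.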

Therefore, the only thing pertinent to this question seems to be the
risk and gain assessments.
Your use of 'therefore' is inappropriate because it suggests the
following assertion (which _is_ mathematically speaking correct)
"follows" from the previous paragraph (which is bunkum). The
set of (probability, outcome) pairs DOES mathematically form "the
only thing pertinent" to a choice (together with a utility function
of course -- but you can finesse that by expressing outcome as
utility directly) -- the absurdity that multiplying probability
times outcome (giving an "expected value") is the ONLY relevant
consideration is not necessary to establish that.

Another relevant meme that is running around in this newsgroup is the
assumption that some people are naturally smarter than other people.
While I can certainly see the advantage for certain people for keeping
this illusion going (it's a great way to make money, the market
doesn't pay for what it gets but for what it thinks it gets) there is
not a lot of credibility in this argument.
*FOR A GIVEN TASK* there can be little doubt that different people
do show hugely different levels of ability. Mozart could write
far better music than I ever could -- I can write Python programs
far better than Silvio Berlusconi can. That does not translate into
"naturally smarter" because the "given tasks" are innumerable and
there's no way to measure them all into a single number: it's quite
possible that I'm far more effective than Mozart at the important
task of making and keeping true friends, and/or that Mr Berlusconi
is far more effective than me at the important tasks of embezzling
huge sums of money and avoiding going to jail in consequence (and
THAT is a great way to make money, if you have no scruples).

Note that for this purpose it does not matter whether the difference
in effectiveness at given tasks comes from nature or nurture, for
example -- just that it exists and that it's huge, and of that, only
a madman could doubt. If you have the choice whom to get music
from, whom to get Python programs from, whom to get as an accomplice
in a multi-billion scam, you should consider the potential candidates'
proven effectiveness at these widely different tasks.

In particular, effectiveness at design of programming languages can
be easily shown to vary all over the place by examining the results.

Of course there is a lot of variation between people in the way they
are educated and some of them have come to be experts at certain
fields. However no one is an island and one person's thinking process
is interconnected with a lot of other persons' thinking processes. The
Of course Mozart would have been a different person -- writing
different kinds of music, or perhaps doing some other job, maybe
mediocrely -- had he not been born when and where he was, the son
of a music teacher and semi-competent musician, and so on. And
yet huge numbers of other people were born in perfectly similar
circumstances... but only one of them wrote *HIS* "Requiem"...

there are those that first leap and then look. It's fascinating to see
"look before you leap" being deprecated in favor of "easier to ask
forgiveness than permission" by the same people that would think twice
to start programming before being sure to know all the syntax.


Since I'm the person who intensely used those two monickers to
describe different kinds of error-handling strategies, let me note
that they're NOT intended to generalize. When I court a girl I
make EXTREMELY sure that she's interested in my advances before I
push those advances beyond certain thresholds -- in other words in
such contexts I *DEFINITELY* "look before I leap" rather than choosing
to make inappropriate and unwelcome advances and then have to "ask
forgiveness" if/when rebuffed (and I despise the men who chose the
latter strategy -- a prime cause of "date rape", IMHO).

And there's nothing "fascinating" in this contrast. The amount of
damage you can inflict by putting your hands or mouth where they
SHOULDN'T be just doesn't compare to the (zero) amount of "damage"
which is produced by e.g. an attempted access to x.y raising an
AttributeError which you catch with a try/except.

Alex

Jul 18 '05 #165


Anton Vredegoor wrote:
IMO there is a strong tendency towards unification and standardization
among the readers of this newsgroup and the need to conform and the
rewards this brings are well understood.


Your comment reminds me of a brouhaha over the legendary IF* macro
(search comp.lang.lisp via Google). A fellow cooked up (IF* [THEN] ...
ELSE or ELSE-IF ... END-IF), and then used it in a big package his
employer released (so you had to go find the macro!). He took a little
heat for that, the gist being, "if you want to use Basic, use Basic."

Unrelated to macros, on Google you'll also see yours truly getting
eviscerated for using camelCase. "Dude, we use hyphens".

So, yeah, yer technically opening up the floodgates, but the social
pressure is pretty effective at keeping Lisp Lispy and would be at
keeping Python...Pythonic?
--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #166
Alex Martelli:
That's an oldstyle class -- use a newstyle one for smoothest
and most reliable behavior of descriptors
Oops! Yeah, forgot that too. I do consider it a (necessary)
wart that different classes have different behaviours.

Is that right?


Yes! So what is it that you say you don't get?


Before this I didn't realize the process of __getattr__
had changed to allow the __get__ to work. I thought
properties were done at assignment time rather than
lookup time, and that the hooks were located in
something done with __getattribute__

After stepping through it, with Raymond's descriptor
next to me, I think I now understand it.

If obj is such that it can be used as a key into a dict
(weak or otherwise), sure. Many class instances of some
interest can't -- and if they can you may not like the
result. Consider e.g.

class Justanyclass:
def __init__(self, x): self.x = x
def compute(self, y): return self.x + y

pretty dangerous to cache THIS compute method -- because,
as a good instance method should!, it depends crucially
on the STATE of the specific instance you call it on.
But if someone were to use a method call cache on it
then I would have expected that person to know if its
use was relevant.
Anyway, I just wanted to
show how the descriptor concept lets you use a class,
rather than a function, when you want to -- indeed any
function now has a __get__ method, replacing (while
keeping the semantics of) the old black magic.


Yep. Using a class is to be preferred over my
def-with-nested-scope trick.
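Put together, the class-based version might look like the following minimal sketch (the `CachedCall` and `Demo` names here are illustrative; it caches per argument tuple and, as discussed above, the cache on the class attribute is shared across instances):

```python
class CachedCall(object):
    """Non-data descriptor: caches results keyed on the positional,
    hashable argument tuple. Since the cache lives on the class
    attribute, it is shared by all instances."""
    def __init__(self, func):
        self.func = func
        self.cache = {}
    def __get__(self, obj, objtype=None):
        if obj is None:          # accessed on the class, not an instance
            return self.func
        def bound(*args):        # note: obj is NOT part of the cache key
            if args not in self.cache:
                self.cache[args] = self.func(obj, *args)
            return self.cache[args]
        return bound

class Demo(object):
    calls = 0
    def _f(self, x):
        Demo.calls += 1
        return x * 2
    f = CachedCall(_f)

d = Demo()
assert d.f(3) == 6
assert d.f(3) == 6
assert Demo.calls == 1   # second call was served from the cache
```

Attribute lookup finds `f` in the class, sees it implements `__get__`, and calls it with the instance -- the protocol described earlier in the thread.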

Andrew
da***@dalkescientific.com
Jul 18 '05 #167
Alex Martelli <al***@aleax.it> wrote in message news:<p0********************@news1.tin.it>...
John J. Lee wrote:
Nick Vargish <na*******@bandersnatch.org> writes:

I do wonder if the tight constraint on C++ of being C+extra bits was
ever really justified.


I think it was: it allowed C++ to enter into MANY places that just
wouldn't have given it a thought otherwise, and to popularize OO
in this -- albeit indirect -- way.
Alex


You're right. The characterization of C++ as a "better C" got it into
a lot of places. It also, unfortunately, resulted in a huge amount of
C++ code full of C idioms and procedural thinking.

So management thinks they're doing object-oriented programming because
they are using an object-oriented language. But the problems of C
become even worse when you do C++ wrong. The result: people end up
thinking this whole 'object-oriented' thing is a bunch of hooey.

Don't get me wrong: you can do great things with C++ if you're an
expert. Problem is, if you're not, you can do tremendous damage.
Jul 18 '05 #168
Kenny Tilton
C? does something similar to what you think, but with an order of
magnitude more power. Estimated. :) Here is how C? can be used:

[.. some lisp example ...]
You have got to stop assuming that a description in Lisp is
intuitive to me. I don't know anywhere near enough of that
language to know what's normal vs. what's novel.
So I do not have to drop out of the work at hand to put /somewhere else/
a top-level function which also caches,
Didn't you have to "drop out of the work at hand" to make the macro?

I tried to understand the lisp but got lost with all the 'defun' vs
'function' vs 'funcall' vs 'defmodel' vs 'defclass' vs 'destructuring-bind'
I know Lisp is a very nuanced language. I just don't understand
all the subtleties. (And since Python works for what I do, I
don't really see the need to understand those nuances.)
Cool. But call cachedCall "memoize". :) Maybe the difference is that you
are cacheing a specific computation of 3, while my macro in a sense
caches the computation of arbitrary code by writing the necessary
plumbing at compile time, so I do not have to drop my train of thought
(and scatter my code all over the place).
Sure, I'll call it memoize, but I don't see why that's to be preferred.
The code caches the result of calling a given function, which could
compute 3 or could compute bessel functions or could compute
anything else. I don't see how that's any different than what your
code does.
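What both versions compute reduces to the usual memoize wrapper, sketched below (positional, hashable arguments only; `slow_add` is an invented example):

```python
def memoize(func):
    """Cache func's results keyed on its argument tuple."""
    cache = {}
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    return wrapper

calls = []
def slow_add(a, b):
    calls.append((a, b))   # record real invocations
    return a + b

fast_add = memoize(slow_add)
assert fast_add(2, 3) == 5
assert fast_add(2, 3) == 5
assert len(calls) == 1     # the second call never reached slow_add
```

The wrapped function could compute 3, Bessel functions, or anything else -- the caching machinery is the same.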

And I still don't see how your macro solution affects the train
of thought any less than my class-based one.
That is where Lisp macros step up--they are just one way code is treated
as data, albeit at compile time instead of the usual runtime

consideration.

The distinction between run-time and compile time use of code is
rarely important to me. Suppose the hardware was 'infinitely' fast
(that is, fast enough that whatever you coded could be done
within your deadline). On that machine, there's little need for
the extra efficiencies of code transformation at compile time. But
there still is a need for human readability, maintainability, and code
sharing.

And for most tasks these days, computers are fast enough.

Andrew
da***@dalkescientific.com
Jul 18 '05 #169


Andrew Dalke wrote:
Kenny Tilton
Lisp hashtables can key off any Lisp datum, but...

Bear with my non-existent Lisp knowledge

Suppose the code is modified. How does the hash table
get modified to reflect the change? Written in Python,
if I have

a = (1, 2, (3, [4, 5]))


Lisp is sick. From the hyperspec on make-hash-table, the test for
lookups can be "eq, eql, equal, or equalp. The default is eql." EQUAL
would work for this case. EQL just looks at object identity.

I can't hash it because someone could come by later
and do

a[2][1].append(6)

so the hash computation and test for equality
will give different results.
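The point shows directly at the interpreter: the nested list makes the whole tuple unhashable, and mutating the list changes equality out from under any would-be cache key.

```python
a = (1, 2, (3, [4, 5]))

try:
    hash(a)
except TypeError:
    print("unhashable")   # the nested list poisons the whole tuple

# Equality is a moving target once the inner list mutates:
b = (1, 2, (3, [4, 5]))
assert a == b
a[2][1].append(6)
assert a != b
```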

The next step would
have been to determine when the closure had best re-execute the code
body to see if the world had changed in interesting ways, but that is a
big step and requires dependency tracking between cells.

Ahhh, so the Python code was comparable in power without
using macros?


No, it was completely different, as per my earlier post. What you did is
what is called MEMOIZE (not part of CL, but I saw some Perl refs pop up
when I googled that looking for Paul Graham's memoize code from On
Lisp). My code just calculates once ever! That is why it needs me to
give another talk in which I add dependency tracking and state change
propagation. And even then it does not memoize, tho you have me thinking
and I could certainly make that an option for Cells that need it. Kind
of rare, but it would be a shiny new bell/whistle for the package.

No, to match the power of my code you need to do:

(let ((b (make-instance 'box
:left 10
:right (c? (* 2 (left self)))))
(print (list (left b) (right b)))

and see (10 20).

You need to then change the source above to say (c? (* 3 (left self)))
and see (10 30). It should be supported by no more than

(defmodel box ()
(left...)
(right...))

Macros are not functionality, they are about hiding the wiring/plumbing
behind neat new hacks, in a way functions cannot because they jump in at
compile time to expand user-supplied code into the necessary
implementing constructs.

Could the same code be written in Lisp using an approach
like I did for Python?
Download Paul Graham's On Lisp, he has a macro in there that hides all
the plumbing you used for cacheing. :) Look for "Memoize".
How would a non-macro solution look
like?
(let ((cache :unbound))
(lambda (self)
(if (eq cache :unbound)
(setf cache (progn (+ 10 (left self))))
cache)))
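That Lisp let-over-lambda has a close Python analogue using a closure over a one-slot list (a sketch; `make_cell` and the `Box` example are invented names):

```python
_UNBOUND = object()   # sentinel, standing in for Lisp's :unbound

def make_cell(rule):
    """Return a function that runs rule(self) on the first call,
    caches the result, and returns the cached value ever after --
    mirroring (let ((cache :unbound)) (lambda (self) ...))."""
    state = [_UNBOUND]
    def cell(self):
        if state[0] is _UNBOUND:
            state[0] = rule(self)
        return state[0]
    return cell

class Box:
    left = 10

runs = []
right = make_cell(lambda self: runs.append(1) or (self.left + 10))
b = Box()
assert right(b) == 20
assert right(b) == 20
assert len(runs) == 1   # the rule body ran exactly once
```

As in the Lisp version, this computes once ever; it is not a memoizer keyed on arguments.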
What's the advantage of the macro one over the non-macro
one? Just performance?


Again, arranging it so necessary wiring is not cut and pasted all over,
cluttering up the code to no good end, and forcing all the code to be
revisited when the implementation changes. ie, They are just like
functions, except they operate at compile time on source code. The
bestest I could do without macros would be:

(make-instance 'box
:left (c? (lambda (self)
(+ 2 (right a)))
....

Where C? becomes a function which returns an instance which has a slot
for the cache. But then, without macros, I have to hand-code the setters
and getters on every such slot. My defmodel macro writes those accessors
silently.

You know, with all due respect, the question really is not if macros are
useful/powerful. They are, and I think even you conceded that. Forgive
me if I am mistaken. At their most powerful they expand into multiple
top-level definitions and even stash info globally to assist
development, such as being able to inspect the source of a closure at
debug-time. They run at compile time and get a look at the source, and
can do interesting things functions cannot.

The anti-macronistas are better off with the argument, hey, if you want
Lisp, use Lisp. Let's keep it simple here. The question for you all is
how far you want to take Python.

Since there already is a Lisp -- stable, compiled, ANSI-standardized,
generational GC, etc, etc -- it would be easier to generate FFI bindings
for needed C libs than to play catch up against a forty year lead.
--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #170
Kenny Tilton:
Forget argument by analogy: How is a macro different than an API or
class, which hide details and do wonderful things but still have to be
mastered. Here's an analogy <g>: I could learn Java syntax in a week,
but does that mean I can keep up with someone who has been using the
class libraries for years? Nope.

And Java doesn't even have macros.


So adding macros to Java would make it easier to master?

Andrew
da***@dalkescientific.com
Jul 18 '05 #171
Olivier Drolet:
Really cute intuition pump you've got there, Alex! :-)
Err, it was me, no?
Macros don't require that you change variable names, just the syntax.
Macros in Python would then not be equivalent to new words, but new
constructs. And yes, you need to know their meaning, but is this
effort greater than managing without them? Have you or any opponents
of macros on this news group never seen a context in which the
advantages of macros outweigh their overhead?


Sure. Having macros would have let people experiment with
generators without tweaking the core language. Would have let
people implement their own metaclass mechanisms instead of
waiting until they were added to the C code.

The complaint has never been that there is no context for which
macros cannot provide the better solution.

I think this track of the discussion will go no further. Let's
assume that I am convinced that not only are macros the best
thing since chocolate but that it should be added to Python.

Following Aahz's suggestion -- what would a Python-with-macros
look like?

Andrew
da***@dalkescientific.com
Jul 18 '05 #172
Alex Martelli <al*****@yahoo.com> wrote in message news:<bh*********@enews3.newsguy.com>...
As for me, I have no special issue with "having to specify self" for
functions I intend to use as bound or unbound methods; otherwise I
would no doubt have to specify what functions are meant as methods
in other ways, such as e.g.

def [method] test(what, ever):
...

and since that is actually more verbose than the current:

def test(self, what, ever):
...

I see absolutely no good reason to have special ad hoc rules, make
"self" a reserved word, etc, etc. Python's choices are very simple
and work well together (for the common case of defining methods
that DO have a 'self' -- classmethod and staticmethod are currently
a bit unwieldy syntactically, but I do hope that some variation on
the often-proposed "def with modifiers" syntax, such as


It's especially nice that if you don't want to use 'self', you don't
have to. Comfortable with 'this' from Java and friends?

def test(this, what, ever):
this.goKaboom(what*ever-what)

will work just fine. 'self' is just a ferociously common idiom. But
it's only an idiom. I like that. So even if we used [method] as a
function decorator, I'd still prefer to see 'self' as part of the
signature. I just like knowing where my names come from.

I do like the concept of function/method decorators though. I'll have
to revisit that PEP.

--
J.Shell; http://toulouse.amber.org/
Jul 18 '05 #173
Alex Martelli <al***@aleax.it> wrote:
Anton Vredegoor wrote:
...
tiny chance. Suppose you could make a bet for a dollar with an
expected reward of a thousand dollars? Statistically it doesn't matter
whether you get a .999 chance of getting a thousand dollars or a
.00999 chance of getting a million dollars.


This assertion is false and absurd. "Statistically", of course,
expected-value is NOT the ONLY thing about any experiment. And
obviously the utility of different sums need not be linear -- it
depends on the individual's target-function, typically influenced
by other non-random sources of income or wealth.


Non linear evaluation functions? Other random sources? Seems you're
trying to trick me. I did write statistically, which implies a large
number of observations. Of course people seldom get to experiment with
those kinds of money, but a simple experiment in Python using a random
number generator should suffice to prove the concept.

[snip tricky example cases]
Another relevant meme that is running around in this newsgroup is the
assumption that some people are naturally smarter than other people.
While I can certainly see the advantage for certain people for keeping
this illusion going (it's a great way to make money, the market
doesn't pay for what it gets but for what it thinks it gets) there is
not a lot of credibility in this argument.


*FOR A GIVEN TASK* there can be little doubt that different people
do show hugely different levels of ability. Mozart could write
far better music than I ever could -- I can write Python programs
far better than Silvio Berlusconi can. That does not translate into
"naturally smarter" because the "given tasks" are innumerable and
there's no way to measure them all into a single number: it's quite
possible that I'm far more effective than Mozart at the important
task of making and keeping true friends, and/or that Mr Berlusconi
is far more effective than me at the important tasks of embezzling
huge sums of money and avoiding going to jail in consequence (and
THAT is a great way to make money, if you have no scruples).


And you're eliminating mr. Berlusconis friends out of the equation?
Seems like trick play again to me. Why are there so few famous classic
female philosophers or musicians? Surely you're not going to tell me
that's just because only the very gifted succeed in becoming famous?
there are those that first leap and then look. It's fascinating to see
"look before you leap" being deprecated in favor of "easier to ask
forgiveness than permission" by the same people that would think twice
to start programming before being sure to know all the syntax.


Since I'm the person who intensely used those two monickers to
describe different kinds of error-handling strategies, let me note
that they're NOT intended to generalize. When I court a girl I
make EXTREMELY sure that she's interested in my advances before I
push those advances beyond certain thresholds -- in other words in
such contexts I *DEFINITELY* "look before I leap" rather than choosing
to make inappropriate and unwelcome advances and then have to "ask
forgiveness" if/when rebuffed (and I despise the men who chose the
latter strategy -- a prime cause of "date rape", IMHO).

And there's nothing "fascinating" in this contrast. The amount of
damage you can inflict by putting your hands or mouth where they
SHOULDN'T be just doesn't compare to the (zero) amount of "damage"
which is produced by e.g. an attempted access to x.y raising an
AttributeError which you catch with a try/except.


Somehow one has to establish that a certain protocol is supported.
However trying to establish a protocol implies supporting the protocol
oneself. Perhaps not initiating protocols that one doesn't want to see
supported is the best way to go here.

Anton

Jul 18 '05 #174
Roy Smith <ro*@panix.com> writes:
One of the few things I like about C++ is that between const, templates,
and inline, the need for the macro preprocessor has been almost
eliminated.
Har! If anything it has been increased! Boost, a haven for template
experts, has a whole library which formalizes a programming system for
the preprocessor (http://www.boost.org/libs/preprocessor) just so we
can eliminate the nasty boilerplate that arises in our template code.
Still, you see a lot of code which goes out of its way to
do fancy things with macros, almost always with bad effect.


I guess it's a question of how badly you hate maintaining 25 different
copies of similar code. And, BTW, I tried to "just use Python to
generate C++" first and using the preprocessor turns out to be
significantly better.

BTW, the C++ preprocessor is a fairly weak macro system. A higher
level metaprogramming facility that knows more about the underlying
language could be a lot cleaner, clearer, safer, and more expressive.

--
Dave Abrahams
Boost Consulting
www.boost-consulting.com
Jul 18 '05 #175


Andrew Dalke wrote:
Maybe. But look for previous posts of mine on c.l.py to see that
my previous attempts at learning Lisp have met dead ends. I know
how it's supposed to work, but haven't been able to convert what
I see on a page into something in my head.
It's funny you say that. Someone is dropping by shortly to help me get
up to speed on Linux, and I am helping him with Lisp. But the last email
from him said he had been reading Winston&Horn since my talk on Sunday
(which apparently got him off the same "don't-get-it" square you're on)
and he now says he has no idea why he did not get it before.

I wonder if the (funny (syntax)) makes people think there is more there
than there is. Even in C I code functionally:

this( that( x), then (y))

so I do not think it is the functional thang.

BTW, I'm afraid I'm about at the end of my limits for this thread.
I'll only be able to do small followups.


OK. I might be tossing off a fun demo of my C? jobbies for my talk at
the upcoming Lisp conference in New York:

http://www.international-lisp-conference.org/

Maybe I'll preview it here. That might help us understand each other
better.

--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #176
al************@comcast.net (A. Lloyd Flanagan) writes:

[...]
Don't get me wrong: you can do great things with C++ if you're an
expert. Problem is, if you're not, you can do tremendous damage.


While I take Alex's point about the practicalities of getting people
to use a language (independent of it's actual appropriateness for the
job at hand), I just suspect there must be a better language out there
in the multiverse, which did the job of integrating nicely with C,
allowing incremental improvement of C code *without* actually trying
to *be* C (+more stuff on top). Of course, it's far too late now.
John
Jul 18 '05 #177
Andrew Dalke wrote:

[snip]

The complaint about macros has been their tendency to
increase a single person's abilities at the cost of overall
loss in group understanding. I've heard references to
projects where that didn't occur, but am not swayed
by it because those seem staffed by people with
extraordinarily good programming skills almost never
found among the chemists and biologists I work with.


I just took a quick look at the "Revised5 Report on the
Algorithmic Language Scheme". Macros in Scheme5 are called
"hygienic macros", so apparently there are some dirty macros that
we should worry about. My understanding is that macros came to
Scheme slowly, with resistence, with a good deal of thought, and
with restrictions.

"More recently, Scheme became the first programming language
to support hygienic macros, which permit the syntax of a
block-structured language to be extended in a consistent and
reliable manner."

See:

http://www.schemers.org/Documents/Standards/R5RS/HTML/

http://www.schemers.org/Documents/St...5rs-Z-H-3.html

http://www.schemers.org/Documents/St...html#%_sec_4.3

I have the same worry that some others on this thread have
expressed, that a macro capability in Python would enable others
to write code that I would not be able to read or to figure out.

[snip]

Dave

--
Dave Kuhlman
http://www.rexx.com/~dkuhlman
dk******@rexx.com
Jul 18 '05 #178


Dave Kuhlman wrote:
Andrew Dalke wrote:

[snip]
The complaint about macros has been their tendency to
increase a single person's abilities at the cost of overall
loss in group understanding. I've heard references to
projects where that didn't occur, but am not swayed
by it because those seem staffed by people with
extraordinarily good programming skills almost never
found amoung the chemists and biologists I work with.

I just took a quick look at the "Revised5 Report on the
Algorithmic Language Scheme". Macros in Scheme5 are called
"hygienic macros", so apparently there are some dirty macros that
we should worry about.


(defmacro whoops (place &body code)
`(let ((x (random 3)))
(setf ,place (progn ,@code))))

....wreaks havoc if the code refers to an X it had bound to something
else and expected it to be used. otoh:

(defmacro c? (&body code)
`(lambda (self) ,@code))

....is cool because I can do the anaphoric thing and provide the user
with a uniform referent to the instance owning the slot, ala SmallTalk
or C++ with "this".
My understanding is that macros came to
Scheme slowly, with resistence, with a good deal of thought, and
with restrictions.

"More recently, Scheme became the first programming language
to support hygienic macros, which permit the syntax of a
block-structured language to be extended in a consistent and
reliable manner."

See:

http://www.schemers.org/Documents/Standards/R5RS/HTML/

http://www.schemers.org/Documents/St...5rs-Z-H-3.html

http://www.schemers.org/Documents/St...html#%_sec_4.3

I have the same worry that some others on this thread have
expressed, that a macro capability in Python would enable others
to write code that I would not be able to read or to figure out.


Do what I do. Don't look at anyone else's code (if you have to work on
it, rewrite it anyway) and never show your code to anyone.

Naw, c'mon, everyone seems to be conceding macros are powerful. But you
are going to be held back by fear and worry? Well, I am a lispnik, we
always opt for power and damn the torpedos, as when we go for the
productivity win of untyped variables and give up on bugs strong static
typing is supposed to find.

--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #179
"Andrew Dalke" <ad****@mindspring.com> wrote in message news:<Uw*****************@newsread4.news.pas.earth link.net>...
Olivier Drolet:
Really cute intuition pump you've got there, Alex! :-)
Err, it was me, no?


Sorry for the confusion. As a neophyte news poster, I got mixed up
with names. Alex Martelli's arguments really got my attention and his
name stuck. Especially his argument regarding linguistic divergence.
Sigh.

(...)

Following Aahz's suggestion -- what would a Python-with-macros
look like?


I presume Aahz' suggestion is "readability". That's indeed tricky.
Macros can only work consistently in Common Lisp because of
parentheses, but I don't know ennough about the reliability of
Python's delimitors. Dylan macros, for example, are said to differ in
power and flexibility from those of CL, so I'm led to understand. It
must be said that parens in Common Lisp are also essential in ensuring
the ability of a program to alter itself or another program. Whence the
notion of Lisp as a "programmable programming language". (This isn't
to say that this is impossible in other programming languages, only
perhaps a bit trickier.)

Common Lisp macros can often significantly improve code readability by
merely reducing the amount of code. This is seen as a desirable
tradeoff, especially in large projects. If code readability is
paramount within the Python community, does the latter generally shy
away from projects in which code complexity is likely to increase
dramatically, i.e. where readability is likely to suffer?

I ask this only because of the fierce opposition the idea of macros
has been receiving here. Linguistic divergence seems to be seen as
leading inexorably toward a degradation in code readability.

At the end of the day, I understand and respect a community's choice.
I'm in agreement with the view that languages are molded by their
community, and inversely so. The relation between the language, its
practitioners and the problem spaces being addressed is a dynamic
one. In the end, whatever works for the Python community is fine,
even rejecting macros for the sake of readability.
Olivier
Jul 18 '05 #180

"Dave Kuhlman" <dk******@rexx.com> wrote in message
news:bi************@ID-139865.news.uni-berlin.de...
See:

http://www.schemers.org/Documents/Standards/R5RS/HTML/

http://www.schemers.org/Documents/St...5rs-Z-H-3.html
http://www.schemers.org/Documents/St...html#%_sec_4.3
I have the same worry that some others on this thread have
expressed, that a macro capability in Python would enable others
to write code that I would not be able to read or to figure out.


According to the second referenced page ("Background"), not a
completely baseless worry, although the Scheme divergence may have had
nothing to do with macros:

"The first description of Scheme was written in 1975...
Three distinct projects began in 1981 and 1982 to use variants of
Scheme for courses at MIT, Yale, and Indiana University [21, 17,
10]...
As Scheme became more widespread, local dialects began to diverge
until students and researchers occasionally found it difficult to
understand code written at other sites. Fifteen representatives of the
major implementations of Scheme therefore met in October 1984 to work
toward a better and more widely accepted standard for Scheme.
"

Reading the third referenced page on Macros, I notice that the amount
of syntax definition for the macro sublanguage is as large as a
substantial portion (one-third?) of that for core Python (if condensed
to the same density). So, just by definitional bulk, having it in the
language would not be a free ride.

Terry J. Reedy

Jul 18 '05 #181
Doug Tolton wrote:
I just don't find that argument compelling. By that logic we should
write the most restrictive language possible on the most restrictive
platform possible (ie VB on Windows) because allowing choice is
clearly a bad thing.

Don't introduce a feature because it would be so cool that everyone
would use it? That's just plain weird.


The problem is that a macro system that is too powerful can be harmful.

Let's say you write a useful module. Python 3.6 just added a very powerful
macro system, and you use it to, say, write functions with lazy evaluation,
make strings mutable, and write your own flavor of the for-loop. Now I cannot
read your code anymore. A simple function call, or a loop, does not mean what
it used to mean.

One of Python's strengths is that you can create powerful abstractions with
functions and classes. But no matter what you do with these, they will always
be functions and classes that adhere to some common language rules. There is
no way to go "over the top" and change the language proper. Right now I can
read everybody's Python code (unless deliberately obfuscated); this would
change if there were macros that were so powerful that they would change
language constructs, or allow new ones. Maybe they would make your problems
easier to solve, but Python itself would lose in the long run.

My $0.02,

--
Hans (ha**@zephyrfalcon.org)
http://zephyrfalcon.org/

Jul 18 '05 #182
Alexander Schmolck <a.********@gmx.net> wrote previously:
|Anyway, I still don't see a compelling reason why class statements
|couldn't/shouldn't be mutating, rather than rebinding. Is there one?

I don't think there is any reason why the statement 'class' COULD NOT do
what you describe. But doing so seems much less Pythonic to me.

In Python, there are a bunch of statements or patterns that ALWAYS and
ONLY binds names. Having something that sometimes binds names, and
other times mutates objects... and you cannot tell the difference
without finding something that may be far away in the code.... well,
that's not how Pythonistas like to think about code.
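The distinction is easy to demonstrate; a minimal sketch (names made up for illustration) of how a second 'class' statement rebinds the name rather than mutating the object the name was bound to:

```python
class Foo:
    x = 1

old = Foo        # keep a second name bound to the first class object

class Foo:       # this statement binds 'Foo' to a brand-new class object
    y = 2

assert old is not Foo           # the first object was not mutated...
assert old.x == 1               # ...it still has its original attribute
assert not hasattr(Foo, "x")    # the new object starts from scratch
assert hasattr(Foo, "y")
```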

In the first appendix to my book (<http://gnosis.cx/TPiP> for the free
version) I make quite a point of explaining what Python does without the
sort of "convenient fiction" that most intros/summaries use. That is,
usually the difference between binding a name and assigning a value is
sort of brushed over, I guess so previous C, Pascal, Perl, Basic, etc.
programmers won't get freaked out. I know Alexander already knows
this... but heck, why miss some self-aggrandizement?

Yours, David...

X-Shameless-Plug: Buy Text Processing in Python: http://tinyurl.com/jskh
--
mertz@ _/_/_/_/_/_/_/ THIS MESSAGE WAS BROUGHT TO YOU BY:_/_/_/_/ v i
gnosis _/_/ Postmodern Enterprises _/_/ s r
..cx _/_/ MAKERS OF CHAOS.... _/_/ i u
_/_/_/_/_/ LOOK FOR IT IN A NEIGHBORHOOD NEAR YOU_/_/_/_/_/ g s
Jul 18 '05 #183
tr*****@mac.com (Olivier Drolet) wrote:
Common Lisp macros can often significantly improve code readability by
merely reducing the amount of code.


If what you're talking about is basically refactoring, then it seems like
you could get the same code reduction by defining new functions/methods.
What does a macro give you that a function doesn't?

In C, the answer was "faster code", which I claim is simply a non-issue
for Python (we're not after speed in the same way C guys are).

In C++ the answer seems to be "generic programming", in the sense that
templates let you factor out the data type from the algorithm. Again, a
non-issue in a dynamic language like Python, where type information is
carried in the object, not the container.
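To illustrate that point (a toy example of my own, not from any particular toolkit): one plain Python function covers what C++ would need a template for, because each object carries its own type:

```python
def total(items, start):
    # Works for numbers, strings, lists... anything supporting '+',
    # with no template instantiation: the objects carry their types.
    result = start
    for item in items:
        result = result + item
    return result

assert total([1, 2, 3], 0) == 6
assert total(["a", "b"], "") == "ab"
assert total([[1], [2]], []) == [1, 2]
```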

So, is there something else that macros buy you that I'm not seeing?
People keep talking about how lisp macros are nothing like C/C++ macros.
OK, I'm willing to be educated. How are they different? Can somebody
give an example? Keep in mind that the last time I did any serious lisp
was about 20 years ago.
Jul 18 '05 #184
Juha Autero <Ju*********@iki.fi> wrote:
Alex Martelli <al***@aleax.it> writes:
The counter-arguments you present in the following do not affect Mertz's
argument in the least, and thus cannot indicate why you don't find it
completely compelling. To rephrase David's argument: simplicity suggests
that a 'class' statement should always have the same fundamental semantics.
Since it SOMETIMES needs to bind a name, therefore, letting it ALWAYS
bind a name -- rather than sometimes yes, sometimes no, depending on
context -- is the only Pythonic approach.


I think the problem here is same as with understanding Python
variables and assignment. People think names as objects themself
rather than bindings to an object. (Maybe I should have said "things"
to avoid confusion. Though in Python, everything you can bind a name to
is a Python object.) They think that after

class foo: pass

you get

+-----+
| foo |
+-----+

but in Python in reality you get
+-----+
foo --> | |
+-----+

So, in Python all objects are basically anonymous. They just have
names bound to them. And since everything is an object, this goes for
classes and functions, too. I'm not sure about modules though.


Well, classes (and modules) are not really anonymous. A class knows
its name (as does a module).
>>> class foo: pass
...
>>> print foo
__main__.foo
>>> bar = foo
>>> print bar
__main__.foo


so, a more accurate picture would be
+-----+
foo --> | foo |
+-----+
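A quick check confirms the picture: the class object carries its own __name__, independent of how many names are bound to it:

```python
class foo:
    pass

bar = foo                      # a second binding to the same object
assert bar is foo
assert foo.__name__ == "foo"   # the object itself remembers 'foo'
assert bar.__name__ == "foo"   # ...whichever binding we reach it by
```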
Jul 18 '05 #185
Kenny Tilton:
as when we go for the
productivity win of untyped variables and give up on bugs strong static
typing is supposed to find.


You do realize that strong typing and static typing are different
things?

What does (the lisp equivalent of) 2.5 + "a" do?

In Python, a strongly typed language, it raises an exception. I
consider that a good thing.

But Python is not statically typed.
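The distinction is easy to check at the interpreter: strong typing rejects the mixed operation, but only at run time, not compile time:

```python
try:
    2.5 + "a"                  # no silent coercion between these types
except TypeError:
    outcome = "TypeError"      # Python refuses, at run time

assert outcome == "TypeError"
```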

Andrew
da***@dalkescientific.com
Jul 18 '05 #186
Kenny Tilton:
Lisp is sick. From the hyperspec on make-hash-table, the test for
lookups can be "eq, eql, equal, or equalp. The default is eql." EQUAL
would work for this case. EQL just looks at object identity.
Given the mutability in the structure, how is the hash key generated?
No, to match the power of my code you need to do:

(let ((b (make-instance 'box
:left 10
:right (c? (* 2 (left self))))))
(print (list (left b) (right b))))
Again, I don't know enough Lisp to follow what you are saying.
Download Paul Graham's On Lisp, he has a macro in there that hides all
the plumbing you used for cacheing. :) Look for "Memoize".
As I pointed elsewhere, I tried a couple times to learn Lisp,
if only to figure out how to tweak Emacs. I never succeeded.
Perhaps for the same reason I never liked HP's stack oriented
calculators? And yes, Dylan's another language on my 'should
learn more about' because of its in-fix approach to Lisp-style
programming.
The bestest I could do without macros would be:

(make-instance 'box
:left (c? (lambda (self)
(+ 2 (right a)))
....
Which implies Python's object model and Lisp's are different,
because my code requires but one change per method, just
like your macro solution. While macros would be better
for this example, for Lisp, it isn't a good example of why a
macro system would improve Python's expressiveness.
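For what it's worth, the caching Kenny points to needs no macros on the Python side either; a minimal sketch as a higher-order function (written with the decorator syntax of later Python versions):

```python
def memoize(fn):
    """Wrap fn so repeat calls with the same arguments hit a cache."""
    cache = {}
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

calls = []

@memoize
def slow_double(x):
    calls.append(x)            # record each real invocation
    return x * 2

assert slow_double(21) == 42
assert slow_double(21) == 42   # second call is served from the cache
assert calls == [21]           # the wrapped function ran only once
```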
The anti-macronistas are better off with the argument, hey, if you want
Lisp, use Lisp. Let's keep it simple here. the question for you all is
how far you want to take Python.


But that statement expresses a certain arrogance which grates
against at least my ears. The point I've made over and over is
that languages which optimize for a single person do not
necessarily optimize for a group of people, especially one
which is scattered around the world and over years. Given
that latter definition, Python is extraordinarily advanced, even
further than Lisp is.

For observational evidence of this, I suggest my own
subfields, computational biology and computational chemistry.
In the first there are bioperl, biopython, biojava, and bioruby,
all with active participants and a yearly conference organized
by open-bio.org. But there is only a rudimentary biolisp project
with minimal code available and just about no community
involvement. In the latter, Python takes the lead by far over
any language other than C/C++/Fortran with commercial support
for a couple toolkits and several more free ones beyond that.
There's even a workshop in a couple weeks on the representation
of biomolecules for Python. There are also some Java and C++
toolkits for chemical informatics. And again, there is no Lisp
involvement.

I ask you why. And I assert that it's because Lisp as a
language does not encourage the sort of code sharing that
the languages I mentioned above do. So while it is very
expressive for a single person, a single person can only
do so much.

Andrew
da***@dalkescientific.com
Jul 18 '05 #187
On Thu, 2003-08-21 at 12:19, Aahz wrote:
In article <ql****************@newsread4.news.pas.earthlink.net>,
Andrew Dalke <ad****@mindspring.com> wrote:
Can we have a deprecation warning for that? I've never
seen it in any code I've reviewed.


We will, probably 2.4 or 2.5. (Whenever 3.0 starts getting off the
ground.)


Hmm... I still use <> exclusively for my code, and I wouldn't really
like it getting deprecated. At least for me, != is more difficult to see
when browsing source than <> is, as != has a striking similarity to ==,
at least at the first glance...

I know <> has been deprecated for long, but I'd much rather have both
syntaxes allowed, also for the future...

But anyway, I guess it will still be a long while before it's actually
deprecated, so I won't bother to protest more now... :)

Heiko.
Jul 18 '05 #188
"Andrew Dalke" <ad****@mindspring.com> writes:
Kenny Tilton:
as when we go for the
productivity win of untyped variables and give up on bugs strong static
typing is supposed to find.
You do realize that strong typing and static typing are different
things?


Note that Kenny said "untyped _variables_" not "untyped objects" or
"untyped language".
What does (the lisp equivalent of) 2.5 + "a" do?
Common Lisp complains that "a" is not a number.
In Python, a strongly typed language, it raises an exception.


Common Lisp is strongly and dynamically typed, just like Python
.... although CL does allow type declarations, for optimization
purposes.
Jul 18 '05 #189
"Andrew Dalke" <ad****@mindspring.com> writes:
As I pointed elsewhere, I tried a couple times to learn Lisp,
if only to figure out how to tweak Emacs. I never succeeded.
Perhaps for the same reason I never liked HP's stack oriented
calculators?
Lisp is simple.

(<operator> <item-1> <item-2> ...)

Where's the problem?

Granted, you need an editor that helps you to match the parens (which
is nothing particularly esoteric), but other than that, it is just a
normal programming language. You've got functions, variables, loops,
etc., even goto (and macros, and code generation/manipulation
facilities you have not seen nor will see anywhere else, but you do
not need to use them for simple stuff). What was it that you couldn't
understand?
For observational evidence of this, I suggest my own
subfields, computational biology and computational chemistry.
In the first there are bioperl, biopython, biojava, and bioruby,
all with active participants and a yearly conference organized
by open-bio.org. But there is only a rudimentary biolisp project
with minimal code available and just about no community
involvement. In the latter, Python takes the lead by far over
any language other than C/C++/Fortran with commercial support
for a couple toolkits and several more free ones beyond that.
There's even a workshop in a couple weeks on the representation
of biomolecules for Python. There are also some Java and C++
toolkits for chemical informatics. And again, there is no Lisp
involvement.

I ask you why.
You shouldn't confuse success with quality. For experimental evidence
look at music charts. On the other hand, if people feel more
comfortable with Python, then so be it.

Lisp suffers also from historical problems. It was too big and too
slow for mid-80's up to mid-90's PCs, and there were far too many
incompatible dialects, which led to it falling into disgrace. But better
compilers, Moore's law, more memory, and an ANSI standard have improved
things up to a point where I think there is no reason not to use it.
And I assert that it's because Lisp as a language does not encourage
the sort of code sharing that the languages I mentioned above do.
This is ridiculous. You don't know Lisp so you do not have an idea
(hint: what you say is wrong), and thus you shouldn't be saying this.
So while it is very expressive for a single person, a single person
can only do so much.


People regularly work in teams on lisp projects. Is that just an
illusion of mine?

Jul 18 '05 #190
You wrote lots. Forgive me if I don't address everything. I wish I had
the time to address all your points more carefully.

"Andrew Dalke" <ad****@mindspring.com> writes:
However, at present, the only example I've seen for when to use a
macro came from a method cache implementation that I could implement
in Python using the normal class behaviour,
Or using decorators, or lexical closures. Similarly in Lisp, there is
more than one way to do it, and many would not choose the macro option
when implementing a memoizer.
so I don't have a good idea of when macros would be appropriate
*for* *Python*.
(Looks like you've found a need for them yourself - see later on.)

Well, frankly, in the context of Python, I find the whole discussion a
bit abstract. Lisp macros fundamentally rely on the fact that Lisp
programs are represented as lists, and that Lisp includes excellent
support for manipulating such lists. The consequence of this (source
code being a form of data) is that it is extremely easy to manipulate
Lisp source code within Lisp. You can already achieve similar things
in Python by typing your source code as a string, and performing
string manipulations on it, and then calling eval ... but it is many
orders of magnitude more painful to do it this way.

Alternatively, get your hands on the parse-tree (I believe there's a
module to help), and mess around with that. That should be much easier
that playing with strings, but still much more of a pain than Lisp
macros.
When this topic has come up before, others mentioned how
macros would theoretically be able to, say, modify list.sort
to return the sorted list after it has been modified in-place.
You don't need macros for that. With the advent of new-style classes,
you subclass list, override the sort function, and replace
__builtins__.list (maybe there's more to it, but as this is not
something I ever intend to do, forgive me for not checking the
details.)
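A sketch of that approach (spelled with the modern `builtins` module; in the 2.x line it was `__builtin__`), just to show it needs no macro system, not that doing it is a good idea:

```python
import builtins

class SortingList(list):
    def sort(self, *args, **kwargs):
        super().sort(*args, **kwargs)
        return self                     # return the list, not None

original_list = builtins.list
builtins.list = SortingList             # rebind the built-in name
try:
    # every bare 'list(...)' call now yields the modified flavor
    assert list([3, 1, 2]).sort() == [1, 2, 3]
finally:
    builtins.list = original_list       # undo the global change
assert [3, 1, 2].sort() is None         # list literals were never affected
```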
Given the not infrequent request for the feature, I know that
if it was allowed, then some of my clients would have done
that, making it harder for me to know if what I'm looking at
is core Python behaviour or modified.
That's not the point of macros. The point is not to modify existing
behaviour behind one's back. The point is to add new behaviour
.... much like it is with functions, classes etc.
What you say is true, but most of the code I look at is
based on fundamental Python types, from which I can
be assured of fixed behaviour, or classes and functions,
where I can be assured that they are free to do their
own thing. The patterns of behaviour are fixed and
the opportunities for change well defined.

Macros, as I understand them, blur those lines.
I don't think this has anything to do with macros. This is a
consequence of a language allowing to re-bind built-ins. I remind you
that Python allows this, today.
I will grant that Lisp or Scheme is the end-all and be-all of
languages.
:-) Well, at least that's clear :-) :-)

[NB, I love Python, and wouldn't want to go without it, for a plethora
of reasons.]
Where's the language between those and Python?
I am not sure that there is a need for one; Python and Lisp are
already very close, as compared to other languages. But maybe some
intermediate language would serve a good purpose.
Is it possible to have a language which is more flexible than
Python but which doesn't encourage the various dialectization
historically evident in the Lisp/Scheme community?
Hmmm. I think this "dialectization of the Lisp/Scheme community" is a
bit like the "dialectization of the Algol Family community" or the
"dialectization of the functional community" or the "dialectization of
the scripting (whatever that means) community"

The fundamental feature of Lisps is that they represent their source
code in a format which they themselves can manipulate easily. (Often
people try to characterize Lisps by the 4 or 5 fundamental operators
that you need to make all the rest, but I don't find this an
interesting perspective.) What's wrong with there being different
languages with this characteristic? What's wrong with there being
different functional languages? What's wrong with there being
different scripting (whatever that means) languages?

Maybe you refer to the fact that, should you wish to make a completely
new Lisp-like language, then starting with an already existing lisp,
and writing your first implementation in that (with the help of
macros), is usually by far the best way of going about it.

(This is exactly how Scheme was created, IIRC)

But your new language is exactly that. A new language. Faithful users
of the language in which you wrote that first implementation, will not
suddenly find that the language they know and love has been broken.
Could most macros also be written without macros, using
classes or lambdas?
Heh. You can write a lot of macros which don't need to be macros (so
don't). But there are some things for which macros are absolutely
necessary.

(Actually, in Lisp you could get by with functions and "'" (the quote)
.... but you'd still be writing macros, without official language
support for them.)

I guess that it boils down to delaying evaluation until you have had
the opportunity to manipulate your source.
How often are the benefits of macros
that much greater than classes&functions to merit their
inclusion.

Does it take skill to know when to use one over the other?
Often, it does. Some are no-brainers (Control structures, for example).
Do people use macros too often?
Some probably do. This is probably true of classes too.
When do they hinder misunderstanding?
When they are badly designed. This is true of classes too.
Are they more prone to misuse than classes&functions?
I guess it is generally true, that the more powerful and versatile the
tool, the more prone it is to misuse.
The complaint about macros has been their tendency to increase a
single person's abilities at the cost of overall loss in group
understanding.
No, no, NO, Noooooooooo ! :-)

At the top of your reply, you agreed about my point that abstracting a
frequently repeated pattern is preferable re-implementing it all over
your code. _This_ is what macros are about.

One can write useless and obfuscating functions and classes, just like
one can write useless and obfuscating macros.
I've heard references to projects where that didn't occur, but am
not swayed by it because those seem staffed by people with
extraordinarily good programming skills almost never found among
the chemists and biologists I work with.
I do not advocate Lisp as a language for people who are not prepared
to invest serious time to understanding it, which probably includes
most people whose primary activity is not programming.

That is why I think Python is _extremely_ useful and necessary. It
provides a significant portion of the power of Lisp, for a small
initial investment.

I do not expect CERN physicists to use Lisp, I do expect them to use
Python.
The point I'm trying to make is that different, very smart people
like "Lisp", but insist on variations.
Very smart and not-so-smart people like "scripting languages", but
insist on variations. There's Perl, Python, Ruby ...
There is clisp
Clisp is just one implementation of an ANSI standardized Lisp (Common Lisp).
and elisp
Elisp is the scripting language of Emacs.

Now, Common Lisp is an all-purpose stand-alone language designed for
constructing complicated systems; Elisp is designed for configuring
and extending Emacs.

Complaining that this is "insisting on variations of lisp", and that
it is somehow a BAD THING, is a bit like complaining about the
co-existence of Python and Occam, as "insisting on variations of
languages with significant indentation".
As I understand it, macros can be used to make one lisp variation
act like another.
This is true to some extent. But just because it can be done, doesn't
mean that very many people actually ever want to do it. Yes, sometimes
CLers want to fake up a Scheme-like continuation, and the language
allows them to do it. Great. But that (pretending to be another
already existing language) is not the point of macros.
A few days ago I tested out a C++ library. It didn't work on the C++
system I had handy because the STL implementation was
different/template support was different. etc. etc.


Did you really, or are you making that up for the
sake of rhetoric?


Sorry, I should have made clear that I made it up for the sake of
rhetoric. However, the only thing that is untrue is the "A few days
ago" bit. It has happened repeatedly in the past.
If it takes more than four decades for different Lisp
implementations to agree on how to import a module, then I think
there's a problem.
Again, you are confusing "different implementations" with "different
languages".

Implementations of ANSI Common Lisp agree on, well, anything defined
within the standard.

Implementations of different languages clearly do not agree. This is
true of those in the Lisp family, just as it is for members of any
other family of Languages.
I needed to evalute a user-defined expression where the variable
names are computed based on calling an associated function. The
functions may take a long time to compute and most names are not
used in an expression, so I want to compute the names only when
used.
You want lazy evaluation ?
Did I start from scratch? No! I used Python to build the parse
tree
In Lisp, you start off with the parse tree. That's the great thing
about it.
then tweaked a few nodes of that tree to change the name lookup into
the right form, then generated the function from that parse tree.
You've just Greenspunned Lisp macros.

Manipulating the parse-tree is exactly what Lisp macros are about.
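In today's Python the standard `ast` module makes this kind of parse-tree tweaking reasonably direct; a sketch (not Andrew's actual code) that rewrites each name lookup into a computed call before compiling, so names are resolved lazily at evaluation time:

```python
import ast

class RewriteLookup(ast.NodeTransformer):
    def visit_Name(self, node):
        if isinstance(node.ctx, ast.Load):
            # replace each bare name with lookup("name"); the values
            # are computed only when the expression actually runs
            call = ast.Call(func=ast.Name(id="lookup", ctx=ast.Load()),
                            args=[ast.Constant(node.id)], keywords=[])
            return ast.copy_location(call, node)
        return node

tree = ast.parse("x + y", mode="eval")
tree = ast.fix_missing_locations(RewriteLookup().visit(tree))
code = compile(tree, "<expr>", "eval")

values = {"x": 40, "y": 2}
assert eval(code, {"lookup": values.__getitem__}) == 42
```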
The syntax was the same as Python's, but the behaviour different.
In Lisp you would typically give a name to this behaviour, and then
you would be able to use it alongside the original language.

For example, if the user-defined expression is

(+ (foo 2) (bar a b c d))

The tweaked version would be

(lazy-eval (+ (foo 2) (bar a b c d)))

Writing a macro to make Lisp a uniformly lazy language (as you seem to
have been suggesting one might proceed, way up-post), is definitely
not the way to do it.
Though back at the Python level, it's a "call this function to get
the needed result",
In Lisp it's "call this macro to get the needed result".
and has precisely the same nature as any other Python function would
have.
And has percisely the same nature as any other Lisp macro would have.
And some languages are not-at-all close to Python. Eg, I wanted
to implement a domain-specific language called MCL
What, "Macintosh Common Lisp" ? :-)
using my PyDaylight package. I ended up writing a parser for MCL
and converting the result into Python code, then exec'ing the Python
code.
In CL this is done with reader macros; a means of altering the way the
parse tree is constructed from the source data.
Similarly, SMILES is a language for describing molecules. Why in the
world would I want to extend Lisp/Perl/Python/whatever to support that
language directly?
a) Because it's easier to build it on top of Lisp than from scratch.

b) Because you never know what future requirements you might have.
It's an extreme form of Greenspunning.


??? Apparently not Alan Greenspan.


Greenspun's Tenth Rule of Programming:

"Any sufficiently complicated C or Fortran program contains an
ad hoc, informally-specified, bug-ridden, slow implementation of half
of Common Lisp."

(Of course, it's a general statement about developing in low-level
languages as compared to developing in higher-level ones.)
Please tell me how you would implement this language

load "abc.pdb" into a
select "resname LYS" from a into b
save b as "lysine.pdb"

as a macro in Lisp. I'll assume 'load-pdb' loads a PDB file into
a object which holds a set of atoms, and that object has the
method 'select-atoms' which creates a new (sub)set and also
has the method 'save-as' for saving those atoms in the right format.

And the above is all that the user can type.

How in the world do macros make handling that language
any easier than the standard parser-based non-macro solution?


I don't think I understand the true meaning of your question.
Anyway, your parse tree example shows that you DO understand and use macros
.... you just don't know that that's the name of what you are doing :-)
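For the record, the parser-based solution Andrew asks about can be a page of Python; a sketch assuming the `load_pdb` function and the `select_atoms`/`save_as` methods named in his post:

```python
import shlex

def run_script(text, load_pdb):
    """Interpret the three-command mini-language line by line."""
    env = {}
    for line in text.strip().splitlines():
        words = shlex.split(line)          # honors the quoted strings
        if words[0] == "load":             # load "file" into name
            env[words[3]] = load_pdb(words[1])
        elif words[0] == "select":         # select "expr" from a into b
            env[words[5]] = env[words[3]].select_atoms(words[1])
        elif words[0] == "save":           # save name as "file"
            env[words[1]].save_as(words[3])
        else:
            raise SyntaxError("unknown command: " + line)
    return env
```

With a stub object providing `select_atoms` and `save_as`, the three-line script from the post runs unchanged.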
Jul 18 '05 #191
Hans Nowak <ha**@zephyrfalcon.org> writes:
The problem is that a macro system that is too powerful can be harmful.
The problem is that a permissive language can be harmful.
Let's say you write a useful module. Python 3.6 just added a very
powerful macro system, and you use it to, say, write functions with
lazy evaluation, make strings mutable, and write your own flavor of
the for-loop. Now I cannot read your code anymore. A simple function
call, or a loop, does not mean what it used to mean.


Let's say you rebind all the attributes of __builtins__. Now I cannot
read your code anymore (well, I can, but it doesn't do what it looks
like it will do).

If you deliberately want to break things, you don't need macros.

os.system("rm -rf *")

Just because a stupid or malicious programmer could do "bad things" is
not a reason to reduce a language's power. (You end up with Java.)
Jul 18 '05 #192
On Wed, 2003-08-20 at 14:15, Alex Martelli wrote:
Designing a good language is all about designing the right high level
abstractions. Even a medium skilled designer should be able to design
a language that maps better to their specific domain than a general


I entirely, utterly, totally and completely disagree with this point.

This is like saying that even a medium skilled musician should be
able to write music that plays better to their specific audience than
great music written by a genius who's never personally met any of
the people in the audience: it's just completely false. I want to
use a language designed by a genius, and I want to listen to music
written by Bach, Haendel, Mozart, and the like.


I was with you until this point. Your preference for music written by
geniuses only says that you are part of that audience for that type of
music; it says nothing about whether Mozart would play well to the
audience in a mosh pit. I don't think it invalidates your point about
programming languages, it's just a bad (in fact, incorrect) example.

Regards,

--
Cliff Wells, Software Engineer
Logiplex Corporation (www.logiplex.net)
(503) 978-6726 (800) 735-0555
Jul 18 '05 #193


Andrew Dalke wrote:
Kenny Tilton:
The anti-macronistas are better off with the argument, hey, if you want
Lisp, use Lisp. Let's keep it simple here. the question for you all is
how far you want to take Python.

But that statement expresses a certain arrogance which grates
against at least my ears.


Really? I thought that was non-controversial or I would not have said
it. Earlier in this thread I thought I read a Pythonista saying, without
contradiction, that Python does not try to be everything. Lisp macros
try to be everything. ie, If Lisp does not have something I want, such
as C-style enums because I am literally translating a C RoboCup client
to Lisp, noproblemo, I just cobble one together, and I can arrange it so
the final syntax:

(enum CMD_KICK CMD_DASH (CMD_TEAR_OFF_SHIRT 42)...)

is close enough to C syntax that the C can be edited into Lisp by
changing/moving braces to parens, deleting all the commas, and
hand-hacking the rare symbol=number usage.

The point I've made over and over is that languages which optimize for a single person do not
necessarily optimize for a group of people, especially one
which is scattered around the world and over years. Given
that latter definition, Python is extraordinarily advanced, even
further than Lisp is.

For observational evidence of this, I suggest my own
subfields, computational biology and computational chemisty.
In the first there are bioperl, biopython, biojava, and bioruby,
all with active participants and a yearly confererence organized
by open-bio.org.
You do not like? http://www.biolisp.org/

Give it time. It's a chicken-egg thing. In fact, my RoboCup project is a
conscious attempt to be the egg for a renaissance in RoboLisp. The
RoboCup server was originally a Lisp project, but teams are now almost
universally developed in Java or C++. The funny (and great for me!)
thing is that all those clients have to parse:

"(see 457 ((f l t 10) 12.3 23 2.3 5)..."

with parsers, where I just say: (read-from-string <msg>).

And I have already heard from one C or Java (they did not say) team that
is interested in my code base simply because it is Lisp.

We lispniks do look forward to the day when there is a larger Lisp
community, and we take great consolation from Python's success (and
Perl's and to a lesser degree Ruby's). That tells us there is great
unhappiness with Java and C++. It also shows that popularity and
dominance in computer languages may not be the advantage it seems to be
for OSes.

Lisp has been dead for twenty years, but some of us won't use anything
else (no macros!) and we even see a trickle of newbies on c.l.l., maybe
one a day. That is a mighty small number compared to Python or Ruby, but
we used to see one a year. I got so curious I started a survey on cliki
(turns out Paul Graham gets a lot of credit):

http://www.cliki.net/The%20RtLS%20by%20Road

I was trying to find out how people ended up trying Lisp in spite of its
tiny community and death.
I ask you why.


Historical, for one. Lisp has always needed a meg or two of RAM. Even
when PCs had 8k, 64k, 128k... so C and Pascal became the languages of
the masses. Remember Pascal? <g> C++ was C trying to hop on the OO
bandwagon (don't get me wrong, I like OO) and Java hopped on the
internet bandwagon (and stayed close to C syntax to win those folks
over). Python wins because of its interactive quality and seamless access
to C, and by adopting many cool ideas from more advanced languages--kind
of a best of both worlds.

Peter Norvig, a lisp biggy, has famously found Python to be equivalent
to Lisp (except for runtime speed, IIRC). Think of Lisp as compiled
Python. With macros and multi-methods etc etc. But you still have to
roll FFI bindings to access C. I did that for my OpenGL project, using
macros which took slightly edited C headers and transformed them into
the FFI declarations.

The other thing Lisp did not have was cheap graphical workstations with
syntax-aware editors and optimizing compilers. So that got it off on the
wrong foot. But I look at it this way. Lisp already was a fad language
(when folks were all excited about AI) and has already died. But it is
still the language of choice for some very talented programmers who know
all about the other languages out there, and it is picking up a trickle
of interest. And when I look at languages like Perl and Python, I see
them adopting new features highly reminiscent of Lisp.

I ask you why? :)
--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #194
JCM
Heiko Wundram <he*****@ceosg.de> wrote:
Hmm... I still use <> exclusively for my code, and I wouldn't really
like it getting deprecated. At least for me, != is more difficult to see
when browsing source than <> is, as != has a striking similarity to ==,
at least at the first glance... I know <> has been deprecated for long, but I'd much rather have both
syntaxes allowed, also for the future...


How about we keep them both, but make <> mean greater-than-or-less-than,
which is a different comparison than != on partially ordered sets.

1 > 2 -- false
1 < 2 -- true
1 != 2 -- true
1 <> 2 -- true

1 > 'x' -- false
1 < 'x' -- false
1 != 'x' -- true
1 <> 'x' -- false

Ok, I'm joking. Mostly.
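The joke has a real kernel, though: Python's sets genuinely are partially ordered (by subset), so the distinction is observable — `strictly_ordered` below is my own name for the hypothetical `<>`:

```python
def strictly_ordered(a, b):
    """The proposed '<>': true only when a and b are comparable
    under the partial order and unequal."""
    return a < b or a > b

# Sets are ordered by subset: {1} < {1, 2} holds, but {1, 2} and
# {2, 3} are incomparable -- unequal, yet neither '<' nor '>'.
x, y = {1, 2}, {2, 3}
```

With these, `x != y` is True while `strictly_ordered(x, y)` is False — precisely the split the post describes for `1` and `'x'`.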
Jul 18 '05 #195
Hans Nowak <ha**@zephyrfalcon.org> writes:
Jacek Generowicz wrote:

[re-binding __builtins__ considered harmful]

I see your point. The obvious answer would be "so don't do that,
then". Of course, the same answer would apply to writing abusive
macros.
Exactly.
Python usually strikes a good balance between being permissive and
restrictive.


In your opinion. The opinions of others, as to where a "good balance"
lies, will be different. And that's just fine. That's why we have
different languages.

I happen to agree that Python strikes a good balance. I also think
that Common Lisp strikes a good balance. The position of "good
balance" is a function of the language's audience.
Just because a stupid or malicious programmer could do "bad things" is
not a reason to reduce a language's power. (You end up with Java.)


You are right, but one could wonder if the drawbacks don't outweigh
the benefits. Python is already powerful as it is (compared to
languages other than Lisp ;-). I'm not sure if powerful macros would
do much good.


Well, with an interactive language[*] you can easily get two types of
programmers: the providers, and the users. A good macro facility
allows the provider types to create very useful abstractions for the
user types. The user types can use those abstractions without having
the faintest clue that they are built using macros, and can happily
program in the language without ever writing one of their own. (A bit
like metaclasses.)

Just because there are macros or metaclasses in a language, does not
mean that everyone needs to know about them, while everyone can
benefit from them.
[*] I guess that this is not exclusive to interactive languages, but I
suspect it's more marked.
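A Python sketch of that provider/user split, using a metaclass (the `Registry` machinery is my invention for illustration, in modern class syntax — 2003 Python spelled it `__metaclass__`):

```python
class Registry(type):
    """Provider-side machinery: every concrete subclass of Plugin
    registers itself automatically at class-definition time."""
    plugins = {}

    def __init__(cls, name, bases, namespace):
        super().__init__(name, bases, namespace)
        if bases:  # skip the abstract base itself
            Registry.plugins[name] = cls

class Plugin(metaclass=Registry):
    pass

# User-side code: a plain class definition, no metaclass in sight.
class Reverser(Plugin):
    def run(self, text):
        return text[::-1]
```

The user wrote an ordinary class, yet `Registry.plugins['Reverser']` now exists — the same "invisible abstraction" argument the post makes for macros.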

But, wrt macros in Python, I don't really see what Python macros would
look like. Lisp code is represented in a fundamental, hierarchical,
easily transformable built-in data type. This ain't true for Python.

Sometimes I think "if Python had Lisp-like macros, what I am trying to
achieve right now would be so much easier". This doesn't mean that I
would necessarily advocate the inclusion of macros in Python. What I
object to, is the suggestion (which has been made repeatedly around
the thread) that macros are evil and should be avoided, and that
macros are responsible for the fragmentation/death of Lisp.

Macros are great. If a good, Pythonic macro system could be invented,
I am sure that it could be put to great use, and that it would not
engender the death of Python. I am not sure that such a system can be
invented. I do not suggest that such a system _should_ be added to
Python.
Jul 18 '05 #196
Kenny Tilton:
as when we go for the
productivity win of untyped variables and give up on bugs strong static typing is supposed to find.

Jacek Generowicz: Note that Kenny said "untyped _variables_" not "untyped objects" or
"untyped language".


Ahh, I hadn't caught that.

I did know Lisp had strong dynamic typing with optional static typing,
which was why I was surprised he mentioned it. I still don't understand
why he said it given that Python similarly meets the quoted statement.

Andrew
da***@dalkescientific.com
Jul 18 '05 #197


Andrew Dalke wrote:
Kenny Tilton:
as when we go for the productivity win of untyped variables and give
up on bugs strong static typing is supposed to find.


Jacek Generowicz:
Note that Kenny said "untyped _variables_" not "untyped objects" or
"untyped language".

Ahh, I hadn't caught that.

I did know Lisp had strong dynamic typing with optional static typing,
which was why I was surprised he mentioned it. I still don't understand
why he said it given that Python similarly meets the quoted statement.


?? I was just giving another example of how lisp errs on the side of
letting us shoot ourselves in the foot. I was not saying anything about
Python.

--

kenny tilton
clinisys, inc
http://www.tilton-technology.com/
---------------------------------------------------------------
"Career highlights? I had two. I got an intentional walk from
Sandy Koufax and I got out of a rundown against the Mets."
-- Bob Uecker

Jul 18 '05 #198
Mario S. Mommer:
Lisp is simple.

(<operator> <item-1> <item-2> ...)

Where's the problem?
Quantum mechanics is simple. It's just H | psi> = E | psi>

What's H mean? And psi? And bra-ket notation?

The reason my original forays into Lisp around 1990 failed was
because the documentation I had for learning it didn't talk about
anything I found interesting, like doing file I/O and graphics. It
went on-and-on about data structures and manipulating them.

My later attempts were by trying to customize my Emacs LISP
setup, and getting various errors because I didn't know the
quoting rules well enough.

Since then it's been limited to incidental learning, like seeing examples
and trying to figure them out. Eg, consider Gabriel's "Worse is
Better" essay. The section "Good Lisp Programming is Hard"
http://www.ai.mit.edu/docs/articles/...on3.2.2.4.html
gives this example

(defun make-matrix (n m)
(let ((matrix ()))
(dotimes (i n matrix)
(push (make-list m) matrix))))

(defun add-matrix (m1 m2)
(let ((l1 (length m1))
(l2 (length m2)))
(let ((matrix (make-matrix l1 l2)))
(dotimes (i l1 matrix)
(dotimes (j l2)
(setf (nth i (nth j matrix))
(+ (nth i (nth j m1))
(nth i (nth j m2)))))))))

and says the above is "absolutely beautiful, but it adds
matrices slowly. Therefore it is excellent prototype code and
lousy production code."
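For non-Lispers, here is roughly what that code does, translated into Python by me (the translation is approximate, not Gabriel's): build an n×m matrix as nested lists and add two matrices elementwise. Python lists are arrays with O(1) indexing, whereas Gabriel's version reaches each cell with `nth`, which walks a linked list in O(n) — that per-access walk is the "lousy production code" part.

```python
def make_matrix(n, m):
    """n rows of m cells, initialised to None (like Lisp's make-list)."""
    return [[None] * m for _ in range(n)]

def add_matrix(m1, m2):
    """Elementwise sum of two equally-sized matrices."""
    rows, cols = len(m1), len(m1[0])
    result = make_matrix(rows, cols)
    for i in range(rows):
        for j in range(cols):
            result[i][j] = m1[i][j] + m2[i][j]
    return result
```

For example, `add_matrix([[1, 2], [3, 4]], [[10, 20], [30, 40]])` gives `[[11, 22], [33, 44]]`.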

I've done a lot of matrix math, but even now cannot simply
look at the above code to figure out what it does, much less
does wrong. But I can figure them out from, say, Java,
in which I have no programming experience at all.

I've also looked at Lisp code when evaluating a package
written in Lisp. I was able to figure that library better than
the above, partially because I had already looked at 15 other
packages which did the same task, so knew the structure
of the solution. It turned out that the Lisp code was no more
powerful or flexible than the ones written in the other languages.
I didn't see good reasons to use Lisp for the problems in my
domain.
You shouldn't confuse success with quality. For experimental evidence
look at music charts. On the other hand, if people feel more
comfortable with Python, then so be it.


Quips are easy: "You shouldn't confuse power with quality."
And I assert that it's because Lisp as a language does not encourage
the sort of code sharing that the languages I mentioned above do.


This is ridiculous. You don't know Lisp so you do not have an idea
(hint: what you say is wrong), and thus you shouldn't be saying this.


Not being able to program in Lisp doesn't mean I don't know anything
about it. I've read the histories, the articles like Gabriel's, listened
to others as they talk about their experiences with Lisp, and decisions
to use an alternate language. I understand the decision to emphasize
its parse tree based approach over a more syntax oriented one. And
as you say, the semantics in Lisp are for the most part shared with
other languages.
So while it is very expressive for a single person, a single person
can only do so much.


People regularly work in teams on lisp projects. Is that just an
illusion of mine?


No. My take on things is that the people who do use Lisp these days
are self-selected for those who either 1) do things alone, ie, can and
will build everything from scratch, or 2) work hard at making sure
their code can be used by others. By "work hard" I mean smart
people willing to focus on the community and learn about the 7
or so different ways of saying 'equals' and the importance of closures
and all the other things needed to become good Lisp programmers.

The people I know focus mostly on developing new computational
techniques and only want to implement that part of the code, pulling
in code from elsewhere if possible. Hence, not #1 nor #2. (There
are some #1 projects, and the #2 people are almost invariably from
CS, learning Lisp first and biology second.)

In other words, the overlap between the types of people who are
good at programming in Lisp and those who decide upon a career
in computational life sciences is low.

Another possibility I'm considering now is that the additional
flexibility that Lisp has, eg, for making frameworks, isn't needed
for the programming problems faced in this field, which for the
most part are either data munging or compute-bound algorithmics.
Why then choose a more flexible language when a lesser one is
easier to learn and use? But a real Lisper would, I think, argue
that it means those tools are available for the few times it is needed.

Andrew
da***@dalkescientific.com
Jul 18 '05 #199
Alexander Schmolck <a.********@gmx.net> wrote in message
To recap: usually, if I change a class I'd like all pre-existing
instances to become updated ...... AFAIK doing this in a
general and painfree fashion is pretty much impossible in python


If I understand correctly what you want, then it is possible and
relatively painfree in Python. Please let me know if the following
snippet is ok.

#
# example of replacing a class in Python
#

import weakref

class InstanceAware:
    dict = weakref.WeakValueDictionary()
    def __init__(self):
        self.__class__.dict[id(self)] = self
    def __str__(self):
        return ('Hello from %s class instance with id=%s' %
                (self.__class__.__name__, id(self)))

def replace_class(OldClass, NewClass):
    olddict = OldClass.dict
    newdict = NewClass.dict
    for instance_id in olddict.keys():
        instance = olddict[instance_id]
        newdict[instance_id] = instance
        instance.__class__ = NewClass
    del OldClass

class A(InstanceAware):
    dict = weakref.WeakValueDictionary()

class B(InstanceAware):
    dict = weakref.WeakValueDictionary()

x = A()
y = A()
z = A()

print '--- Class A has %d instances' % len(A.dict)
print x
print y
print z

del z
replace_class(A, B)

print '--- Class B has %d instances' % len(B.dict)
print x
print y
output:

--- Class A has 3 instances
Hello from A class instance with id=12072832
Hello from A class instance with id=12177240
Hello from A class instance with id=15770208
--- Class B has 2 instances
Hello from B class instance with id=12072832
Hello from B class instance with id=12177240

regards,

Hung Jung
Jul 18 '05 #200
