Bytes IT Community

A critic of Guido's blog on Python's lambda

Python, Lambda, and Guido van Rossum

Xah Lee, 2006-05-05

In this post, i'd like to deconstruct one of Guido's recent blog posts
about lambda in Python.

In Guido's blog post, written on 2006-02-10 at
http://www.artima.com/weblogs/viewpo...?thread=147358

there is, first of all, the title “Language Design Is Not Just Solving
Puzzles”. At the outset, and in between the lines, we are told that
“I'm the supreme intellect, and I created Python”.

This seems impressive, except that the tech geekers, due to their
ignorance of sociology as well as their lack of the analytic abilities
of the mathematician, do not know that creating a language is an act
that requires little qualification. However, creating a language that is
used by a lot of people takes considerable skill, and a big part of that
skill is salesmanship. Guido seems to have done it well and seems to
continue selling it well, where he can put up a title of belittlement
and get away with it too.

Gaudy title aside, let's look at the content of his say. If you peruse
the 700 words, you'll find that it amounts to this: Guido does not like
the suggested lambda fix due to its multi-line nature, and says that he
doesn't think there could possibly be any proposal he'll like. The
reason? Not much! Zen is bantered about, the mathematician's impractical
ways are waved at, undefinable qualities are given, the human right
brain is mentioned for support (neuroscience!), Rube Goldberg
contrivance phraseology is thrown in, and the coolness of Google Inc is
recalled for the tech geekers (in juxtaposition with a big notice that
Guido works there).

If you are serious, doesn't this writing sound bigger than its content?
Look at the gorgeous ending: “This is also the reason why Python will
never have continuations, and even why I'm uninterested in optimizing
tail recursion. But that's for another installment.”. This benevolent
geeker is gonna give us another INSTALLMENT!

There is a computer language leader by the name of Larry Wall, who said
that “The three chief virtues of a programmer are: Laziness,
Impatience and Hubris” among quite a lot of other ingenious
outpourings. It seems to me, the more i learn about Python and its
leader, the more similarities i see.

So Guido, i understand that selling oneself is an inherent and necessary
part of being a human animal. But i think the lesser beings should be
educated enough to know that fact, so that when minions follow a
leader, they have a clear understanding of why and what.

----

Regarding the lambda in Python situation... conceivably you are right
that Python's lambda is perhaps best left as it is, crippled, or even
eliminated. However, this is what i want: I want Python literature,
and also Wikipedia, to cease and desist stating that Python supports
functional programming. (this is not necessarily bad publicity) And I
want the Perl literature to cease and desist saying it supports OOP.
But that's for another installment.

----
This post is archived at:
http://xahlee.org/UnixResource_dir/w...bda_guido.html

* * Xah
* * xa*@xahlee.org
http://xahlee.org/

May 6 '06 #1
267 Replies


"Xah Lee" <xa*@xahlee.org> writes:

> Python, Lambda, and Guido van Rossum

Which one is the "critic"? Or is your subject field an indication that
you continue not to learn from responses to your previous posts?

> is first of all, the title “Language Design Is Not Just Solving
> Puzzles”. In the outset, and in between the lines, we are told that
> “I'm the supreme intellect, and I created Python”.

Would that all of the ramblings in your posts were in between the lines.

--
\ "He may look like an idiot and talk like an idiot but don't let |
`\ that fool you. He really is an idiot." -- Groucho Marx |
_o__) |
Ben Finney

May 6 '06 #2



Xah Lee wrote:

> [...]
> The reason? Not much! Zen is bantered about, mathematician's
> impractical ways is waved, undefinable qualities are given, human's
> right brain is mentioned for support (neuroscience!), Rube Goldberg
> contrivance phraseology is thrown,

I think this is what you missed in your deconstruction. The upshot of
what he wrote is that it would be really hard to make semantically
meaningful indentation work with lambda. Guido did not mean it, but the
Rube Goldberg slam is actually against indentation as syntax. "Yes,
print statements in a while loop would be helpful, but..." it would be
so hard, let's go shopping. I.e., GvR and Python have hit a ceiling.

That's OK, it was never meant to be anything more than a scripting
language anyway.

But the key in the whole thread is simply that indentation will not
scale. Nor will Python.
> and coolness of Google Inc is reminded for the
> tech geekers (in juxtaposition of a big notice that Guido works
> there.).
> [...]
> So Guido, i understand that selling oneself is a inherent and necessary
> part of being a human animal. But i think the lesser beings should be
> educated enough to know that fact. So that when minions follow a
> leader, they have a clear understanding of why and what.


Oh, my, you are preaching to the herd (?!) of lemmings?! Please tell me
you are aware that lemmings do not have ears. You should just do Lisp
all day and add to the open source libraries to speed Lisp's ascendance.
The lemmings will be liberated the day Wired puts John McCarthy on the
cover, and not a day sooner anyway.

kenny (wondering what to call a flock (?!) of lemmings)

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #3

Ken Tilton <ke*******@gmail.com> wrote:
...

> But the key in the whole thread is simply that indentation will not
> scale. Nor will Python.


Absolutely. That's why firms who are interested in building *seriously*
large scale systems, like my employer (and supplier of your free mail
account), would never, EVER use Python, nor employ in prominent
positions such people as the language's inventor and BDFL, the author of
the most used checking tool for it, and the author of the best-selling
reference book about that language; and, for that matter, a Director of
Search Quality who, while personally a world-renowned expert of AI and
LISP, is on record as supporting Python very strongly, and publicly
stating its importance to said employer.

Obviously will not scale. Never.

Well... hardly ever!
Alex
May 6 '06 #4

I V
On Fri, 05 May 2006 17:26:26 -0700, Xah Lee wrote:

> Regarding the lambda in Python situation... conceivably you are right
> that Python lambda is perhaps at best left as it is crippled, or even
> eliminated. However, this is what i want: I want Python literatures,
> and also in Wikipedia, to cease and desist stating that Python supports
> functional programing. (this is not necessarily a bad publicity) And, I

What does lambda have to do with supporting or not supporting functional
programming?

May 6 '06 #5

Ken Tilton wrote:

> [...] The upshot of what [Guido] wrote is that it would be really hard
> to make semantically meaningful indentation work with lambda.

Haskell manages it.

--
David Hopwood <da******************@blueyonder.co.uk>
May 6 '06 #6



Alex Martelli wrote:

> Ken Tilton <ke*******@gmail.com> wrote:
> > But the key in the whole thread is simply that indentation will not
> > scale. Nor will Python.
>
> Absolutely. That's why firms who are interested in building *seriously*
> large scale systems, like my employer (and supplier of your free mail
> account), would never, EVER use Python [...]
>
> Obviously will not scale. Never.
>
> Well... hardly ever!


You are talking about being incredibly popular. I was talking about
language expressivity. COBOL in its day was incredibly popular and
certainly the language of choice (hell, the only language) for the
biggest corporations you can imagine. But it did not scale as a
language. I hope there are no doubts on that score (and I actually am a
huge fan of COBOL).

The problem for Python is its success. Meant to be a KISS scripting
language, it has caught on so well that people are asking it to be a
full-blown, OO, GC, reflexive, yada, yada, yada language. Tough to do
when all you wanted to be when you grew up was a scripting language.

kenny (who is old enough to have seen many a language come and go)

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #7



David Hopwood wrote:

> Ken Tilton wrote:
> > [...] The upshot of what [Guido] wrote is that it would be really
> > hard to make semantically meaningful indentation work with lambda.
>
> Haskell manages it.


To be honest, I was having a hard time imagining precisely how
indentation broke down because of lambda. Does text just sail out to the
right too fast?
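It is less that text sails out to the right, and more that a statement
suite has no clean way to begin and end in the middle of an expression.
A sketch (hypothetical code, not from the thread) of the problem and the
usual named-function workaround:

```python
# A suite-based lambda would have to embed indented statements inside
# an expression. Something like
#
#     sorted(pairs, key=lambda p:
#                       v = p[1] * 2    # hypothetical, NOT valid Python
#                       return v)
#
# leaves no natural place for the suite to end before the closing
# parenthesis. The accepted workaround is a named function:
pairs = [("a", 3), ("b", 1)]

def key_func(p):
    # multi-line body that a one-expression lambda cannot hold
    v = p[1] * 2
    return v

print(sorted(pairs, key=key_func))
```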

kenny

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #8

On Fri, May 05, 2006 at 05:26:26PM -0700, Xah Lee wrote:

> Python, Lambda, and Guido van Rossum

[snip]

Foxtrot Oscar Alpha Delta

Others have said banning this troll would be wrong or undemocratic,
but let's be sane: he has wasted hundreds of hours of other people's
time and hurt newbies especially, as they don't know well enough to
ignore him.

Killfile him at the source. Heck, filter every post with a
[his domain omitted] domain in the body and make him spend an extra
$10 to buy another domain before crossposting garbage. No one else
in the world would link to him, so it is a safe bet.

-Jack

May 6 '06 #9

On Fri, 05 May 2006 21:16:50 -0400, Ken Tilton wrote:

> The upshot of
> what he wrote is that it would be really hard to make semantically
> meaningful indentation work with lambda.

Pretty much correct. The complete thought was that it would be painful
all out of proportion to the benefit.

See, you don't need multi-line lambda, because you can do this:

def make_adder(x):
    def adder_func(y):
        sum = x + y
        return sum
    return adder_func

add5 = make_adder(5)
add7 = make_adder(7)

print add5(1)   # prints 6
print add5(10)  # prints 15
print add7(1)   # prints 8
Note that make_adder() doesn't use lambda, and yet it makes a custom
function with more than one line. Indented, even.

You could also do this:

lst = []  # create empty list
def f(x):
    return x + 5
lst.append(f)
del(f)  # now that the function ref is in the list, clean up temp name

print lst[0](1)  # prints 6
Is this as convenient as the lambda case?

lst.append(lambda x: x + 7)
print lst[1](1)  # prints 8
No; lambda is a bit more convenient. But this doesn't seem like a very
big issue worth a flame war. If GvR says multi-line lambda would make
the lexer more complicated and he doesn't think it's worth all the effort,
I don't see any need to argue about it.
> But the key in the whole thread is simply that indentation will not
> scale. Nor will Python.


This is a curious statement, given that Python is famous for scaling well.

I won't say more, since Alex Martelli already pointed out that Google is
doing big things with Python and it seems to scale well for them.
--
Steve R. Hastings "Vita est"
st***@hastings.org http://www.blarg.net/~steveha

May 6 '06 #10


"I V" <wr******@gmail.com> wrote in message
news:pa***************************@gmail.com...

> On Fri, 05 May 2006 17:26:26 -0700, Xah Lee wrote:
> > Regarding the lambda in Python situation... conceivably you are right
> > that Python lambda is perhaps at best left as it is crippled, or even
> > eliminated. [...]
>
> What does lambda have to do with supporting or not supporting
> functional programming?


What does any of this have to do with Java?

--
Rhino
May 6 '06 #11



Steve R. Hastings wrote:

> On Fri, 05 May 2006 21:16:50 -0400, Ken Tilton wrote:
> > The upshot of
> > what he wrote is that it would be really hard to make semantically
> > meaningful indentation work with lambda.
>
> Pretty much correct. The complete thought was that it would be painful
> all out of proportion to the benefit.
> [...]
> No; lambda is a bit more convenient. But this doesn't seem like a very
> big issue worth a flame war.


<g> Hopefully it can be a big issue and still not justify a flame war.

Mileages will always vary, but one reason for lambda is precisely not to
have to stop, go make a new function for this one very specific use,
come back and use it as the one lambda statement, or in C have an
address to pass. but, hey, what are editors for? :)

the bigger issue is the ability of a lambda to close over arbitrary
lexically visible variables. this is something the separate function
cannot see, so one has to have a function parameter for everything.

but is such lexical scoping even on the table when Python's lambda comes
up for periodic review?
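For what it's worth, a minimal sketch (example code, not from the
thread) suggests the capture question and the multi-line question are
separable: a named inner function in Python closes over lexically
visible variables just as a lambda does, with no extra parameters
needed.

```python
# A named inner function captures enclosing variables lexically.
def make_scaler(factor):
    offset = 1
    def scale(x):
        # 'factor' and 'offset' are free variables, seen without
        # being passed in as function parameters
        return factor * x + offset
    return scale

scale2 = make_scaler(2)
print(scale2(10))  # prints 21
```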
> If GvR says multi-line lambda would make
> the lexer more complicated and he doesn't think it's worth all the
> effort, I don't see any need to argue about it.


Oh, no, this is just front porch rocking chair BS. But as an
enthusiastic developer I am sensitive to how design choices express
themselves in ways unanticipated. Did the neat idea of
indentation-sensitivity doom pythonistas to a life without the sour
grapes of lambda?

If so, Xah's critique missed that issue and was unfair to GvR in
ascribing his resistance to multi-statement lambda to mere BDFLism.

kenny

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #12

Reported for excessive crossposting.

--
John Bokma Freelance software developer
&
Experienced Perl programmer: http://castleamber.com/
May 6 '06 #13

"Rhino" <no***********************@nospam.com> wrote:

> What does any of this have to do with Java?


Xah Lee is well known for abusing Usenet for quite some time, report
his posts as excessive crossposts to:

abuse at bcglobal dot net
abuse at dreamhost dot com

IIRC this is his third ISP account in 2 weeks, so it *does* work.

Moreover, his current hosting provider, dreamhost, might drop him soon.

--
John Bokma Freelance software developer
&
Experienced Perl programmer: http://castleamber.com/
May 6 '06 #14

Ken Tilton <ke*******@gmail.com> writes:

> kenny (wondering what to call a flock (?!) of lemmings)


Couldn't find it here:-

http://ojohaven.com/collectives/

So I would propose a "leap" of lemmings :-)
WAY OT! Sorry.

atb



Glyn
May 6 '06 #15

Ken Tilton <ke*******@gmail.com> wrote:
...
> > Absolutely. That's why firms who are interested in building
> > *seriously* large scale systems, like my employer (and supplier of
> > your free mail account) ... Obviously will not scale. Never.
> >
> > Well... hardly ever!
>
> You are talking about being incredibly popular. I was talking about


Who, me? I'm talking about the deliberate, eyes-wide-open choice by
*ONE* firm -- one which happens to more or less *redefine* what "large
scale" computation *means*, along many axes. That's got nothing to do
with Python being "incredibly popular": it has everything to do with
scalability -- the choice was made in the late '90s (and, incidentally,
by people quite familiar with lisp... no less than the reddit.com guys,
you know, the ones who recently chose to rewrite their site from Lisp to
Python...?), based on scalability issues, definitely not "popularity"
(Python in the late '90s was a very obscure, little-known language).
> kenny (who is old enough to have seen many a language come and go)


See your "many a language" and raise you one penny -- besides sundry
Basic dialects, machine languages, and microcode[s], I started out with
Fortran IV and APL, and I have professionally programmed in Pascal (many
dialects), Rexx, Forth, PL/I, Cobol, Lisp before there was a "Common"
one, Prolog, Scheme, Icon, Tcl, Awk, EDL, and several proprietary 3rd
and 4th generation languages -- as well of course as C and its
descendants such as C++ and Java, and Perl. Many other languages I've
studied and played with, I've never programmed _professionally_ (i.e.,
been paid for programs in those languages), but I've written enough
"toy" programs to get some feeling for (Ruby, SML, O'CAML, Haskell,
Snobol, FP/1, Applescript, C#, Javascript, Erlang, Mozart, ...).

Out of all languages I know, I've deliberately chosen to specialize in
Python, *because it scales better* (yes, functional programming is
_conceptually_ perfect, but one can never find sufficiently large teams
of people with the right highly-abstract mathematical mindset and at the
same time with sufficiently down-to-earth pragmaticity -- so, for _real
world_ uses, Python scales better). When I was unable to convince top
management, at the firm at which I was the top programmer, that the firm
should move to Python (beyond the pilot projects which I led and gave
such stellar results), I quit, and for years I made a great living as a
freelance consultant (mostly in Python -- once in a while, a touch of
Pyrex, C or C++ as a vigorish;-).

That's how come I ended up working at the firm supplying your free mail
(as Uber Tech Lead) -- they reached across an ocean to lure me to move
from my native Italy to California, and my proven excellence in Python
was their prime motive. The terms of their offer were just too
incredible to pass by... so, I rapidly got my O1 visa ("alien of
exceptional skills"), and here I am, happily ubertechleading... and
enjoying Python and its incredibly good scalability every single day!
Alex
May 6 '06 #16

Steve R. Hastings <st***@hastings.org> wrote:
...
> > But the key in the whole thread is simply that indentation will not
> > scale. Nor will Python.
>
> This is a curious statement, given that Python is famous for scaling
> well.

I think "ridiculous" is a better characterization than "curious", even
if you're seriously into understatement.

> I won't say more, since Alex Martelli already pointed out that Google
> is doing big things with Python and it seems to scale well for them.


And of course we're not the only ones. In fact, I believe that we're
not even among the firms which have reported their experiences in the
official "Python Success Stories" -- IBM, Industrial Light and Magic,
NASA, etc, etc, are there, but we aren't. I guess we just prefer to play
our cards closer to our chest -- after all, if our competitors choose to
use inferior languages, it's hardly to our advantage to change that;-).
Alex
May 6 '06 #17

Ken Tilton wrote:

> Oh, my, you are preaching to the herd (?!) of lemmings?! Please tell me
> you are aware that lemmings do not have ears. You should just do Lisp
> all day and add to the open source libraries to speed Lisp's
> ascendance. The lemmings will be liberated the day Wired puts John
> McCarthy on the cover, and not a day sooner anyway.


And then the 12th vanished Lisper returns and Lispers are not
suppressed anymore and won't be loosers forever. The world will be
united in the name of Lisp and Lispers will be leaders and honorables.
People stop worrying about Lispers as psychpaths and do not consider
them as zealots, equipped with the character of suicide bombers. No,
Lisp means peace and paradise.

May 6 '06 #18

"Kay Schluehr" <ka**********@gmx.net> writes:

> And then the 12th vanished Lisper returns and Lispers are not
> suppressed anymore and won't be loosers forever. The world will be

The mark of a true loser is the inability to spell 'loser.' Zing!

> them as zealots, equipped with the character of suicide bombers. No,


A very reasonable comparison. Yes, the more I think about it, we Lisp
programmers are a lot like suicide bombers.

Doofus.

--
This is a song that took me ten years to live and two years to write.
- Bob Dylan
May 6 '06 #19

Ken Tilton <ke*******@gmail.com> writes:

> <g> Hopefully it can be a big issue and still not justify a flame war.
> [...]
> the bigger issue is the ability of a lambda to close over arbitrary
> lexically visible variables. this is something the separate function
> cannot see, so one has to have a function parameter for everything.
>
> but is such lexical scoping even on the table when Python's lambda
> comes up for periodic review?


This is second-hand, as I don't actually follow Python closely, but
from what I've heard, they now have reasonable scoping rules (or maybe
they're about to, I'm not sure). And you can use def as a
Scheme-style inner define, so it's essentially a LABELS that gets the
indentation wrong. This means they have proper closures, just not
anonymous ones. And an egregiously misnamed lambda that should be
fixed or thrown out.

If Python gets proper macros it won't matter one bit that they only
have named closures, since you can macro that away in a blink of an
eye.
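To make the "named closures" point concrete, here is a minimal sketch
(example code, not from the thread): a named inner def serves as a
multi-line stand-in for lambda, and it closes over its lexical
environment just as an anonymous function would.

```python
# An inner 'def' as a multi-line "lambda": the body below is more than
# one statement, which Python's single-expression lambda cannot hold.
def compose(f, g):
    def h(x):
        y = g(x)      # multi-line body, closing over f and g
        return f(y)
    return h

inc_then_double = compose(lambda y: 2 * y, lambda x: x + 1)
print(inc_then_double(3))  # prints 8
```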
May 6 '06 #20

al*****@yahoo.com (Alex Martelli) writes:

> Who, me? I'm talking about the deliberate, eyes-wide-open choice by
> *ONE* firm -- one which happens to more or less *redefine* what "large
> scale" computation *means*, along many axes.
> [...]
> That's how come I ended up working at the firm supplying your free mail
> (as Uber Tech Lead) ... and here I am, happily ubertechleading... and
> enjoying Python and its incredibly good scalability every single day!


How do you define scalability?

--
This is a song that took me ten years to live and two years to write.
- Bob Dylan
May 6 '06 #21

Bill Atkins wrote:

> "Kay Schluehr" <ka**********@gmx.net> writes:
> > And then the 12th vanished Lisper returns and Lispers are not
> > suppressed anymore and won't be loosers forever. The world will be
>
> The mark of a true loser is the inability to spell 'loser.' Zing!

There is not much lost.

> > them as zealots, equipped with the character of suicide bombers. No,
>
> A very reasonable comparison. Yes, the more I think about it, we Lisp
> programmers are a lot like suicide bombers.

Allah Inschallah

May 6 '06 #22

Bill Atkins wrote:
<cut>

> How do you define scalability?

http://www.google.com/search?hl=en&q...=Google+Search

;-)

--
mph
May 6 '06 #23

"Martin P. Hellwig" <mh******@xs4all.nl> writes:

> Bill Atkins wrote:
> <cut>
> > How do you define scalability?
>
> http://www.google.com/search?hl=en&q...=Google+Search
>
> ;-)

OK, my real question is: what features of Python make it "scalable"?

--
This is a song that took me ten years to live and two years to write.
- Bob Dylan
May 6 '06 #24

Bill Atkins wrote:

> OK, my real question is: what features of Python make it "scalable"?


Let me guess: Python makes it easier to scale the application on
the "features" axis, and the approach to large-scale computation
taken by google makes Python's poor raw performance not so big
an issue, so it doesn't prevent the application from scaling
on the "load" and "amount of data" axes. I also guess that python
is often used to control simple, fast C/C++ programs, or even
to generate such programs.

Best regards
Tomasz
May 6 '06 #25

Bill Atkins wrote:

> "Martin P. Hellwig" <mh******@xs4all.nl> writes:
> [...]
>
> OK, my real question is: what features of Python make it "scalable"?

Well, I'm no expert, but I guess the ease of creating network services
and clients makes it quite scalable. For example, I'm creating an
xmlrpc server that returns a randomized cardlist, but because of
fail-over I needed some form of scalability. My solution was to first
randomize the deck, then marshal it and dump the file on a ZFS
partition, giving back the client a ticket number; the client can then
connect with the ticket number to receive the cardlist (read the file -
unmarshal it).

While this is overkill for 1 server, I needed multiple because of
fail-over and load-balancing; in this case I have 3 'crypto' boxes
(with hardware crypto engines, using OpenBSD) doing only the
randomizing, and 4 Solaris machines doing the zfs and distribution of
the list.

By using xmlrpc and DNS round-robin, I can just add boxes and it scales
without any problem. The ZFS boxes are the front-end, listening to the
name 'shuffle', and are connecting over a private network to my crypto
boxes listening to the name 'crypto'.

So as long as I make DNS aliases (I have a little script that heartbeats
the boxes and removes the alias when one does not respond within 10
seconds) and install the right scripts on the box, I can scale till I'm
round the earth. Of course when the machine count gets over a certain
degree I have to add some management functionality.

Now I don't say that I handle this situation well or that it's the right
solution, but it worked for me, and it was easy and fun to do with
python. I guess that any language in this sense should be 'scalable',
and perhaps other languages have even better built-in networking
libraries, but I'm not a professional programmer, and until I learn
other languages (and am comfortable enough to use them) I'll keep on
using python for my projects.

For me python is easy, scalable, fun and by this the 'best', but that is
personal, and I simply don't know whether my opinion will change in the
future or not.

--
mph
May 6 '06 #26

Also addressing the Python and scaling question is the
kamaelia.sourceforge.net project, whose objective is to solve the
problems of putting the BBC's vast archives on the web, and which uses
Python.
-- Pad.

May 6 '06 #27

"Martin P. Hellwig" <mh******@xs4all.nl> writes:
and clients make it quite scalable. For example, I'm creating a
xmlrpcserver that returns a randomized cardlist, but because of
fail-over I needed some form of scalability; my solution was to first
randomize the deck, then marshal it and dump the file on a ZFS
partition, giving back the client a ticket number; the client can then
connect with the ticket number to receive the cardlist (read the file
- unmarshal it).
This is a weird approach. Why not let the "ticket" be the (maybe
encrypted) PRNG seed that generates the permutation?
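The seed-as-ticket alternative can be sketched like this (names invented for illustration): store nothing at all; the ticket is a random seed, and any server regenerates the same permutation deterministically.

```python
# Sketch of the seed-as-ticket idea: the ticket *is* the (secret) seed,
# and any server can deterministically regenerate the deck from it.
import random
import secrets

def make_ticket():
    return secrets.randbits(128)           # unguessable 128-bit seed

def deck_for(ticket):
    rng = random.Random(ticket)            # deterministic PRNG from the seed
    deck = list(range(52))
    rng.shuffle(deck)
    return deck                            # same ticket -> same deck, any box
```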
While this is overkill for 1 server, I needed multiple because of
fail-over and load-balancing, in this case I have 3 'crypto' boxes
(with hardware crypto engines using OpenBSD) doing only the
randomizing and 4 solaris machines doing the zfs and distribution of
the list.


I don't know what good that hardware crypto is doing you, if you're
then writing out the shuffled deck to disk in the clear.
May 6 '06 #28

Paul Rubin <http://ph****@NOSPAM.invalid> writes:
I don't know what good that hardware crypto is doing you, if you're
then writing out the shuffled deck to disk in the clear.


Ehhh, I guess you want the crypto hardware to generate physical
randomness for each shuffle. I'm skeptical of the value of this since
a cryptographic PRNG seeded with good entropy is supposed to be
computationally indistinguishable from physical randomness, and if
it's not, we're all in big trouble; further, that hardware engine is
almost certainly doing some cryptographic whitening, which is a
problem if you don't think that cryptography works.

Anyway, if it's just a 52-card deck you're shuffling, there's only
about 226 bits of entropy per shuffle, or 52*6 = 312 bits if you write
out the permutation straightforwardly as a vector. You could use that
as the ticket but if you're generating it that way you may need to
save the shuffle for later auditing.

For practical security purposes I'd be happier generating the shuffles
entirely inside the crypto module (HSM) by cryptographic means, with
the "ticket" just being a label for a shuffle. E.g. let

K1, K2 = secret keys

T(n) = ticket #n = AES(K1, n) to prevent clients from guessing
ticket numbers

shuffle(n) = HMAC-SHA-384(K2, n) truncated to 312 bits, treated as
permutation on 52 cards

You could put some of the card dealing logic into the HSM to get the
cards dealt out only as the game as played, to decrease the likelihood
of any cards getting exposed prematurely.
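A hedged sketch of the numbers and the keyed-shuffle half of the scheme above, in stock Python. The AES ticket encryption would need a third-party crypto library, so only the HMAC-SHA-384 part is shown; `K2` is placeholder key material. Note that turning the truncated 312 bits into a *valid* permutation needs a step the post leaves implicit; here the digest simply seeds a deterministic Fisher-Yates shuffle.

```python
# Keyed, deterministic shuffles labeled by a ticket number n, plus a
# sanity check on the "about 226 bits of entropy per shuffle" figure.
import hashlib
import hmac
import math
import random

# log2(52!) is roughly 225.58, i.e. "about 226 bits" per shuffle
ENTROPY_BITS = math.lgamma(53) / math.log(2)

K2 = b"placeholder-secret-key"             # would live inside the HSM

def shuffle_for(n):
    """Derive shuffle #n from K2; the ticket is just the label n."""
    digest = hmac.new(K2, str(n).encode(), hashlib.sha384).digest()
    seed = int.from_bytes(digest[:39], "big")   # truncate to 312 bits
    rng = random.Random(seed)
    deck = list(range(52))
    rng.shuffle(deck)                           # deterministic given n and K2
    return deck
```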
May 6 '06 #29

Paul Rubin wrote:
"Martin P. Hellwig" <mh******@xs4all.nl> writes:
and clients make it quite scalable. For example, I'm creating a
xmlrpcserver that returns a randomized cardlist, but because of
fail-over I needed some form of scalability; my solution was to first
randomize the deck, then marshal it and dump the file on a ZFS
partition, giving back the client a ticket number; the client can then
connect with the ticket number to receive the cardlist (read the file
- unmarshal it).
This is a weird approach. Why not let the "ticket" be the (maybe
encrypted) PRNG seed that generates the permutation?


Because the server that handles the generate request doesn't need to be
the same as the one that handles the request to give the client that
deck. Even more, the server that handles the request calls the crypto
servers to actually do the shuffling. So when the server fails before it
has given the client the ticket, it is possible that a deck was already
created but never used; no biggie there.
But once the ticket is given to the client, then any other server can
serve back that ticket to give the shuffled deck, unless the ZFS dies of
course, but then again that's why I use ZFS, so I can mirror them on 4
different machines in 2 different locations.
While this is overkill for 1 server, I needed multiple because of
fail-over and load-balancing, in this case I have 3 'crypto' boxes
(with hardware crypto engines using OpenBSD) doing only the
randomizing and 4 solaris machines doing the zfs and distribution of
the list.


I don't know what good that hardware crypto is doing you, if you're
then writing out the shuffled deck to disk in the clear.


It's not about access security it's more about the best possible
randomness to shuffle the deck.

--
mph
May 6 '06 #30



Thomas F. Burdick wrote:
Ken Tilton <ke*******@gmail.com> writes:

<g> Hopefully it can be a big issue and still not justify a flame war.

Mileages will always vary, but one reason for lambda is precisely not
to have to stop, go make a new function for this one very specific
use, come back and use it as the one lambda statement, or in C have an
address to pass. but, hey, what are editors for? :)

the bigger issue is the ability of a lambda to close over arbitrary
lexically visible variables. this is something the separate function
cannot see, so one has to have a function parameter for everything.

but is such lexical scoping even on the table when Python's lambda
comes up for periodic review?

This is second-hand, as I don't actually follow Python closely, but
from what I've heard, they now have reasonable scoping rules (or maybe
they're about to, I'm not sure). And you can use def as a
Scheme-style inner define, so it's essentially a LABELS that gets the
indentation wrong.
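The "def as inner define" point can be illustrated with a small sketch; note that the `nonlocal` rebinding shown here is Python 3 (it postdates this thread, when closures were read-only):

```python
# An inner 'def' closes over enclosing lexical variables much like a
# Scheme inner define, so the named form covers what lambda covers.
def make_counter(start):
    count = start
    def step(inc=1):
        nonlocal count       # rebinding the closed-over variable (Python 3)
        count += inc
        return count
    return step              # a named closure, returned as a lambda would be
```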


Cool. And I know how much you like labels/flet. :)
This means they have proper closures, just not
anonymous ones. And an egregiously misnamed lambda that should be
fixed or thrown out.

If Python gets proper macros it won't matter one bit that they only
have named closures, since you can macro that away in a blink of an
eye.


Ah, well, there we go again. Without sexpr notation, the lexer/parser
again will be "hard", and "hardly worth it": we get even more sour
grapes, this time about macros not being such a big deal.

One of the hardest things for a technologist to do is admit that a neat
idea has to be abandoned. Initial success creates a giddy
over-commitment to the design choice. After then all difficulties get
brushed aside or kludged.

This would not be a problem for Python if it had stayed a scripting
language... well, maybe "no macros!" and "no real lambda!" and "no
continuations!" are GvR's way of keeping Python just a scripting language.

:)

kenny

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #31



Martin P. Hellwig wrote:
Bill Atkins wrote:
<cut>

How do you define scalability?
http://www.google.com/search?hl=en&q...=Google+Search


Damn! Google can do that?! Omigod!!! Not joking, I never knew that,
always used dictionary.com. Thx! I meant:
The ability to add power and capability to an existing system without significant expense or overhead.
www.yipes.com/care/cc_glossary.shtml


The number of definitions explains why most respondents should save
their breath. Natural language is naturally ambiguous. Meanwhile Usenet
is the perfect place to grab one meaning out of a dozen and argue over
the implications of that one meaning which of course is never the one
originally intended, as any reasonable, good faith reader would admit.

kenny
--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #32



Kay Schluehr wrote:
Ken Tilton wrote:

Oh, my, you are preaching to the herd (?!) of lemmings?! Please tell me
you are aware that lemmings do not have ears. You should just do Lisp
all day and add to the open source libraries to speed Lisp's ascendance.
The lemmings will be liberated the day Wired puts John McCarthy on the
cover, and not a day sooner anyway.

And then the 12th vanished Lisper returns and Lispers are not
suppressed anymore and won't be losers forever. The world will be
united in the name of Lisp and Lispers will be leaders and honorables.
People stop worrying about Lispers as psychopaths and do not consider
them as zealots, equipped with the character of suicide bombers. No,
Lisp means peace and paradise.


"The Twelfth Vanished Lisper"? I love it. Must start a secret society....

:)

kenny

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #33

"Martin P. Hellwig" <mh******@xs4all.nl> writes:
This is a weird approach. Why not let the "ticket" be the (maybe
encrypted) PRNG seed that generates the permutation?
Because the server that handles the generate request doesn't need to
be the same as the one that handles the request to give the client
that deck.


Wait a sec, are you giving the entire shuffled deck to the client?
Can you describe the application? I was imagining an online card game
where clients are playing against each other. Letting any client see
the full shuffle is disastrous.
But once the ticket is given to the client, then any other server can
serve back that ticket to give the shuffled deck, unless the ZFS dies
of course, but then again that's why I use ZFS, so I can mirror them on
4 different machines in 2 different locations.

I don't know what good that hardware crypto is doing you, if you're
then writing out the shuffled deck to disk in the clear.


It's not about access security it's more about the best possible
randomness to shuffle the deck.


Depending on just what the server is for, access security may be a far
more important issue. If I'm playing cards online with someone, I'd
be WAY more concerned about the idea of my opponent being able to see
my cards by breaking into the server, than his being able to
cryptanalyze a well-designed PRNG based solely on its previous
outputs.
May 6 '06 #34

Paul Rubin wrote:
"Martin P. Hellwig" <mh******@xs4all.nl> writes:
This is a weird approach. Why not let the "ticket" be the (maybe
encrypted) PRNG seed that generates the permutation?

Because the server that handles the generate request doesn't need to
be the same as the one that handles the request to give the client
that deck.


Wait a sec, are you giving the entire shuffled deck to the client?
Can you describe the application? I was imagining an online card game
where clients are playing against each other. Letting any client see
the full shuffle is disastrous.


Nope, I have a front-end service that does the client bit; it's about
this (in this context; there are more services of course):

crypto - ZFS - table servers - mirror dispatching - client xmlrpc access
- client ( last one has not been written yet )

<cut>
Depending on just what the server is for, access security may be a far
more important issue. If I'm playing cards online with someone, I'd
be WAY more concerned about the idea of my opponent being able to see
my cards by breaking into the server, than his being able to
cryptanalyze a well-designed PRNG based solely on its previous
outputs.


Only the client xmlrpc access is (should be) accessible from the
outside, and since this server is user-session based clients only see
their own cards. However, this project is still in its early
development; I'm doing initial alpha tests (and stress testing) now,
and after this I'm going to let some audit bureaus check for security
(probably Madison-Ghurka, but I haven't asked them yet).

--
mph
May 6 '06 #35

[ I pruned the cross-posting down to a reasonable level ]

Ken Tilton <ke*******@gmail.com> writes:
Thomas F. Burdick wrote:
This is second-hand, as I don't actually follow Python closely, but
from what I've heard, they now have reasonable scoping rules (or maybe
they're about to, I'm not sure). And you can use def as a
Scheme-style inner define, so it's essentially a LABELS that gets the
indentation wrong.


Cool. And I know how much you like labels/flet. :)


Well, I love LABELS but I hate inner defines with an equal passion --
so for me it's a wash :-)

As much as I like nice low-level, close-to-the-machine mechanisms such
as labels and lambda, sometimes you just want the high-level
expressiveness of tagbody/go, which Python doesn't have ... which in
my opinion is quite a crime against readability and the ability to
transcribe Knuth algorithms, which any engineer should find offensive
to their sensibilities.
This means they have proper closures, just not
anonymous ones. And an egregiously misnamed lambda that should be
fixed or thrown out.
If Python gets proper macros it won't matter one bit that they only
have named closures, since you can macro that away in a blink of an
eye.


Ah, well, there we go again. Without sexpr notation, the lexer/parser
again will be "hard", and "hardly worth it": we get even more sour
grapes, this time about macros not being such a big deal.

One of the hardest things for a technologist to do is admit that a
neat idea has to be abandoned. Initial success creates a giddy
over-commitment to the design choice. After then all difficulties get
brushed aside or kludged.


Y'never know, they could always Greenspun their way to almost-sexps.
What with the way that selective pressure works, it's gonna be that or
die, so it is a possibility.
May 6 '06 #36

Ken Tilton <ke*******@gmail.com> wrote:
Martin P. Hellwig wrote:
Bill Atkins wrote:
<cut>

How do you define scalability?
http://www.google.com/search?hl=en&q...=Google+Search


Damn! Google can do that?! Omigod!!! Not joking, I never knew that,

You're welcome; we do have several little useful tricks like that.

always used dictionary.com. Thx! I meant:
The ability to add power and capability to an existing system without
significant expense or overhead. www.yipes.com/care/cc_glossary.shtml

Excellent -- just the definition of "scalability" that Google and its
competitor live and die by ((OK, OK, I'm _not_ implying that such issues
as usability &c don't matter, by no means -- but, I live mostly in the
world of infrastructure, where scalability and reliability reign)).

The number of definitions explains why most respondents should save
their breath. Natural language is naturally ambiguous. Meanwhile Usenet
is the perfect place to grab one meaning out of a dozen and argue over
the implications of that one meaning which of course is never the one
originally intended, as any reasonable, good faith reader would admit.


However, you and I are/were discussing exactly the same nuance of
meaning, either by a funny quirk of fate or because it's the one that
really matters in large-scale programming (and more generally,
large-scale systems). E.g., if your existing system can gracefully
handle your current traffic of, say, a billion queries of complexity X,
you want to be able to rapidly add a feature that will increase the
average query's complexity to (X+dX) and attract 20% more users, so
you'll need to handle 1.2 billion queries just as gracefully: i.e., you
need to be able to add power and capability to your existing system,
rapidly and reliably, just as that definition says.

When this is the challenge, your choice of programming language is not
the first order of business, of course -- your hardware and network
architecture loom large, and so does the structuring of your
applications and infrastructure software across machines and networks.
Still, language does matter, at a "tertiary" level if you will. Among
the potential advantages of Lisp is the fact that you could use Lisp
across almost all semantic levels ("almost" because I don't think "Lisp
machines" are a realistic option nowadays, so lower levels of the stack
would remain in C and machine language -- but those levels may probably
be best handled by a specialized squad of kernel-level and device-driver
programmers, anyway); among the potential advantages of Python, the fact
that (while not as suited as Lisp to lower-level coding, partly because
of a lack of good solid compilers to make machine language out of it),
it brings a powerful drive to uniformity, rather than a drive towards a
host of "domain-specific" Little Languages as is encouraged by Lisp's
admirably-powerful macro system.

One key axis of scalability here is, how rapidly can you grow the teams
of people that develop and maintain your software base? To meet all the
challenges and grasp all the opportunities of an exploding market,
Google has had to almost-double its size, in terms of number of
engineers, every year for the last few years -- I believe that doing so
while keeping stellar quality and productivity is an unprecedented feat,
and while (again!) the choice of language(s) is not a primary factor
(most kudos must go to our management and its approaches and methods, of
course, and in particular to the strong corporate identity and culture
they managed to develop and maintain), it still does matter. The
uniformity of coding style and practices in our codebase is strong.

We don't demand Python knowledge from all the engineers we hire: for any
"engineering superstar" worth the adjective, Python is really easy and
fast to pick up and start using productively -- I've seen it happen
thousands of times, both in Google and in my previous career, and not
just for engineers with a strong software background, but also for those
whose specialties are hardware design, network operations, etc, etc. The
language's simplicity and versatility allow this. Python "fits people's
brains" to an unsurpassed extent -- in a way that, alas, languages
requiring major "paradigm shifts" (such as pure FP languages, or Common
Lisp, or even, say, Smalltalk, or Prolog...) just don't -- they really
require a certain kind of mathematical mindset or predisposition which
just isn't as widespread as you might hope. Myself, I do have more or
less that kind of mindset, please note: while my Lisp and scheme are
nowadays very rusty, proficiency with them was part of what landed me my
first job, over a quarter century ago (microchip designers with a good
grasp of lisp-ish languages being pretty rare, and TI being rather
hungry for them at the time) -- but I must acknowledge I'm an exception.

Of course, the choice of Python does mean that, when we really truly
need a "domain specific little language", we have to implement it as a
language in its own right, rather than piggybacking it on top of a
general-purpose language as Lisp would no doubt afford; see
<http://labs.google.com/papers/sawzall.html> for such a DSLL developed
at Google. However, I think this tradeoff is worthwhile, and, in
particular, does not impede scaling.
Alex
May 6 '06 #37

"John Bokma" <jo**@castleamber.com> wrote in message
news:Xn************************@130.133.1.4...
Reported for excessive crossposting.


Did u report yourself?

--
LTP

:)
May 6 '06 #38



Alex Martelli wrote:
Ken Tilton <ke*******@gmail.com> wrote:

Martin P. Hellwig wrote:
Bill Atkins wrote:
<cut>

How do you define scalability?
http://www.google.com/search?hl=en&q...=Google+Search

Damn! Google can do that?! Omigod!!! Not joking, I never knew that,

You're welcome; we do have several little useful tricks like that.

always used dictionary.com. Thx! I meant:

The ability to add power and capability to an existing system without
significant expense or overhead. www.yipes.com/care/cc_glossary.shtml

Excellent -- just the definition of "scalability" that Google and its
competitor live and die by ((OK, OK, I'm _not_ implying that such issues
as usability &c don't matter, by no means -- but, I live mostly in the
world of infrastructure, where scalability and reliability reign)).
The number of definitions explains why most respondents should save
their breath. Natural language is naturally ambiguous. Meanwhile Usenet
is the perfect place to grab one meaning out of a dozen and argue over
the implications of that one meaning which of course is never the one
originally intended, as any reasonable, good faith reader would admit.

However, you and I are/were discussing exactly the same nuance of
meaning, either by a funny quirk of fate or because it's the one that
really matters in large-scale programming (and more generally,
large-scale systems). E.g., if your existing system can gracefully
handle your current traffic of, say, a billion queries of complexity X,
you want to be able to rapidly add a feature that will increase the
average query's complexity to (X+dX) and attract 20% more users, so
you'll need to handle 1.2 billion queries just as gracefully: i.e., you
need to be able to add power and capability to your existing system,
rapidly and reliably, just as that definition says.

When this is the challenge, your choice of programming language is not
the first order of business, of course ...


Looks like dictionaries are no match for the ambiguity of natural
language. :) Let me try again: it is Python itself that cannot scale, as
in gain "new power and capability", and at least in the case of lambda
it seems to be because of indentation-sensitivity.

Is that not what GvR said?

By contrast, in On Lisp we see Graham toss off Prolog in Chapter 22 and
an object system from scratch in Chapter 25. Lite versions, to be sure,
but you get the idea.

My sig has a link to a hack I developed after doing Lisp for less than a
month, and without lambda (and to a lesser degree macros) it would be
half the tool it is. It adds a declarative paradigm to the CL object
system, and is built on nothing but ansi standard Lisp. Yet it provides
new power and capability. And that by an application programmer just
working on a nasty problem, never mind the language developer.

I just find it interesting that sexpr notation (which McCarthy still
wants to toss!) is such a huge win, and that indentation seems to be so
limiting.
-- your hardware and network architecture loom large, and so does the structuring of your
applications and infrastructure software across machines and networks.
Still, language does matter, at a "tertiary" level if you will. Among
the potential advantages of Lisp is the fact that you could use Lisp
across almost all semantic levels ("almost" because I don't think "Lisp
machines" are a realistic option nowadays, so lower levels of the stack
would remain in C and machine language -- but those levels may probably
be best handled by a specialized squad of kernel-level and device-driver
programmers, anyway); among the potential advantages of Python, the fact
that (while not as suited as Lisp to lower-level coding, partly because
of a lack of good solid compilers to make machine language out of it),
it brings a powerful drive to uniformity, rather than a drive towards a
host of "domain-specific" Little Languages as is encouraged by Lisp's
admirably-powerful macro system.

One key axis of scalability here is, how rapidly can you grow the teams
of people that develop and maintain your software base?
I am with Brooks on the Man-Month myth, so I am more interested in /not/
growing my team. If Lisp is <pick a number, any number> times more
expressive than Python, you need exponentially fewer people.

In some parallel universe Norvig had the cojones to dictate Lisp to
Google and they listened, and in that universe... I don't know, maybe
GMail lets me click on the sender column to sort my mail? :)
To meet all the
challenges and grasp all the opportunities of an exploding market,
Google has had to almost-double its size, in terms of number of
engineers, every year for the last few years -- I believe that doing so
while keeping stellar quality and productivity is an unprecedented feat,
and while (again!) the choice of language(s) is not a primary factor
(most kudos must go to our management and its approaches and methods, of
course, and in particular to the strong corporate identity and culture
they managed to develop and maintain), it still does matter. The
uniformity of coding style and practices in our codebase is strong.
Well, you said it for me. Google hires the best and pays a lot. Hey, I
wrote great code in Cobol. So as much as you want to brag on yourself
and Google <g>, your success does not address:

Indentation-sensitivity: Is it holding Python back?

We don't demand Python knowledge from all the engineers we hire: for any
"engineering superstar" worth the adjective, Python is really easy and
fast to pick up and start using productively -- I've seen it happen
thousands of times, both in Google and in my previous career, and not
just for engineers with a strong software background, but also for those
whose specialties are hardware design, network operations, etc, etc. The
language's simplicity and versatility allow this. Python "fits people's
brains" to an unsurpassed extent -- in a way that, alas, languages
requiring major "paradigm shifts" (such as pure FP languages, or Common
Lisp, or even, say, Smalltalk, or Prolog...) just don't -- they really
require a certain kind of mathematical mindset or predisposition which
just isn't as widespread as you might hope.
Talk about Lisp myths. The better the language, the easier the language.
And the best programmers on a team get to develop tools and macrology
that empower the lesser lights, so (a) they have fun work that keeps
them entertained while (b) the drones who just want to get through the
day are insanely productive, too.

Another myth (or is this the same?) is this "pure FP" thing. Newbies can
and usually do code as imperatively as they wanna be. Until someone else
sees their code, tidies it up, and the light bulb goes on. But CL does
not force a sharp transition on anyone.

Myself, I do have more or
less that kind of mindset, please note: while my Lisp and scheme are
nowadays very rusty, proficiency with them was part of what landed me my
first job, over a quarter century ago (microchip designers with a good
grasp of lisp-ish languages being pretty rare, and TI being rather
hungry for them at the time) -- but I must acknowledge I'm an exception.

Of course, the choice of Python does mean that, when we really truly
need a "domain specific little language", we have to implement it as a
language in its own right, rather than piggybacking it on top of a
general-purpose language as Lisp would no doubt afford; see
<http://labs.google.com/papers/sawzall.html> for such a DSLL developed
at Google.


No lambdas? Static typing?! eewwwewww. :) Loved the movie, tho.

Come on, try just one meaty Common Lisp project at Google. Have someone
port Cells to Python. I got halfway done but decided I would rather be
doing Lisp. uh-oh. Does Python have anything like special variables? :)
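For what it's worth, the closest stock-Python analogue to CL special variables is probably a dynamically scoped binding restored on block exit; a hedged sketch (all names invented for illustration):

```python
# CL-style "special variables" in plain Python: a global binding that can
# be dynamically rebound for the extent of a with-block.
import contextlib

_specials = {"*base*": 10}               # global (outermost) binding

@contextlib.contextmanager
def let(name, value):
    """Rebind a special for the dynamic extent of a with-block."""
    saved = _specials[name]
    _specials[name] = value
    try:
        yield
    finally:
        _specials[name] = saved          # unwind-protect style restore

def render(n):
    # Callees see the *dynamic* binding, not a lexical one.
    return format(n, "x") if _specials["*base*"] == 16 else str(n)
```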

Kenny

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #39



Ken Tilton wrote:

Come on, try just one meaty Common Lisp project at Google. Have someone
port Cells to Python. I got halfway done but decided I would rather be
doing Lisp. uh-oh. Does Python have anything like special variables? :)


Omigod. I scare myself sometimes. This would be a great Summer of Code
project. Port Cells (see sig) to Python. Trust me, this is Silver Bullet
stuff. (Brooks was wrong on that.)

If a strong Pythonista wants to submit a proposal, er, move fast. I am
mentoring through LispNYC: http://www.lispnyc.org/soc.clp

Gotta be all over Python metaclasses, and everything else pure Python.
PyGtk would be a good idea for the demo, which will involve a GUI mini
app. Just gotta be able to /read/ Common Lisp.

kenny

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 6 '06 #40

Ken Tilton <ke*******@gmail.com> wrote:
...
Looks like dictionaries are no match for the ambiguity of natural
language. :) Let me try again: it is Python itself that cannot scale, as
in gain "new power and capability", and at least in the case of lambda
it seems to be because of indentation-sensitivity.
In my opinion (and that of several others), the best way for Python to
grow in this regard would be to _lose_ lambda altogether, since named
functions are preferable (and it's an acknowledged Python design
principle that there should ideally be just one obvious way to perform a
task); GvR used to hold the same opinion, but changed his mind recently,
alas, so we'll keep the wart.
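The "named functions are preferable" point in miniature, as a sketch: anywhere a lambda fits, an equivalent named def fits too, and gains a name for tracebacks plus a docstring slot.

```python
# Two spellings of the same key function; the named one is what the
# "lose lambda" camp prefers.
def neg(x):
    """Named equivalent of 'lambda x: -x'."""
    return -x

nums = [3, 1, 2]
by_lambda = sorted(nums, key=lambda x: -x)
by_def = sorted(nums, key=neg)           # identical result
```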

But, quite apart from the whole issue of whether it's desirable to
languages to change massively ("add new power and capability" meaning
new enriched features in the language itself), your whole argument is
bogus: it's obvious that _any_ fundamental design choice in an artefact
will influence the feasibility and desirability of future design choices
in future releases of that same, identical artefact. At a syntax-sugar
level, for example, Lisp's choice to use parentheses as delimiter means
it's undesirable, even unfeasible, to use the single character '(' as an
ordinary identifier in a future release of the language. Considering
this to mean that Lisp "cannot scale" is just as ridiculous as
considering that Python "cannot scale" by not having an elegant way to
make lambdas heavier and richer -- totally laughable and idiotic. ``An
unneeded feature "cannot" be added (elegantly) in future releases of the
language'' is just as trivial and acceptable for the unneeded feature
``allow ( as an ordinary single-character identifier'' as for the
unneeded feature ``allow unnamed functions with all the flexibility of
named ones''.
By contrast, in On Lisp we see Graham toss off Prolog in Chapter 22 and


Oh, is that the same Graham who writes:

"""
A friend of mine who knows nearly all the widely used languages uses
Python for most of his projects. He says the main reason is that he
likes the way source code looks. That may seem a frivolous reason to
choose one language over another. But it is not so frivolous as it
sounds: when you program, you spend more time reading code than writing
it. You push blobs of source code around the way a sculptor does blobs
of clay. So a language that makes source code ugly is maddening to an
exacting programmer, as clay full of lumps would be to a sculptor.
"""
...? [[ I suspect that friend is in fact a common friend of mine and
Graham's, the guy you also mention later in your post, and who
introduced Graham and me when G recently came talk at Google (we had
"brushed" before, speaking in the same sessions at conferences and the
like, but had never "met", as in, got introduced and _talked_...;-). ]]

But, no matter, let's get back to Graham's point: significant
indentation is a large part of what gives Python its own special beauty,
uncluttered by unneeded punctuation. And while you, I, Graham, and that
common friend of ours, might likely agree that Lisp, while entirely
different, has its own eerie beauty, most people's aesthetics are poles
apart from that (why else would major pure-FP languages such as *ML and
Haskell entirely reject Lisp's surface syntax, willingly dropping the
ease of macros, to introduce infix operator syntax etc...? obviously,
their designers' aesthetics weigh parenthesized prefix syntax negatively,
despite said designers' undeniable depth, skill and excellence).
Alex
May 6 '06 #41

al*****@yahoo.com (Alex Martelli) writes:
Ken Tilton <ke*******@gmail.com> wrote:
...
Looks like dictionaries are no match for the ambiguity of natural
language. :) Let me try again: it is Python itself that cannot scale, as
in gain "new power and capability", and at least in the case of lambda
it seems to be because of indentation-sensitivity.


In my opinion (and that of several others), the best way for Python to
grow in this regard would be to _lose_ lambda altogether, since named
functions are preferable (and it's an acknowledged Python design
principle that there should ideally be just one obvious way to perform a
task); GvR used to hold the same opinion, but changed his mind recently,
alas, so we'll keep the wart.

But, quite apart from the whole issue of whether it's desirable to
languages to change massively ("add new power and capability" meaning
new enriched features in the language itself), your whole argument is
bogus: it's obvious that _any_ fundamental design choice in an artefact
will influence the feasibility and desirability of future design choices
in future releases of that same, identical artefact. At a syntax-sugar
level, for example, Lisp's choice to use parentheses as delimiter means
it's undesirable, even unfeasible, to use the single character '(' as an
ordinary identifier in a future release of the language. Considering
this to mean that Lisp "cannot scale" is just as ridiculous as
considering that Python "cannot scale" by not having an elegant way to
make lambdas heavier and richer -- totally laughable and idiotic. ``An
unneeded feature "cannot" be added (elegantly) in future releases of the
language'' is just as trivial and acceptable for the unneeded feature
``allow ( as an ordinary single-character identifier'' as for the
unneeded feature ``allow unnamed functions with all the flexibility of
named ones''.


Not so infeasible:

(let ((|bizarrely(named()symbol| 3))
  (+ |bizarrely(named()symbol| 4))

;; => 7

And in any case, enforced indentation is a policy with vastly more
serious consequences than the naming of identifiers.
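A Python-side sketch of the same boundary (my illustration, not part of the
thread): '(' can never be a Python identifier, yet it is a perfectly good key
in a namespace-style dictionary.

```python
# '(' is rejected as an identifier at compile time...
try:
    compile("( = 3", "<sketch>", "exec")
except SyntaxError:
    print("'(' cannot be an identifier")

# ...but any string, '(' included, can key a namespace-like dict:
ns = {'(': 3}
print(ns['('] + 4)  # 7
```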
By contrast, in On Lisp we see Graham toss off Prolog in Chapter 22 and


Oh, is that the same Graham who writes:

"""
A friend of mine who knows nearly all the widely used languages uses
Python for most of his projects. He says the main reason is that he
likes the way source code looks. That may seem a frivolous reason to
choose one language over another. But it is not so frivolous as it
sounds: when you program, you spend more time reading code than writing
it. You push blobs of source code around the way a sculptor does blobs
of clay. So a language that makes source code ugly is maddening to an
exacting programmer, as clay full of lumps would be to a sculptor.
"""
...? [[ I suspect that friend is in fact a common friend of mine and
Graham's, the guy you also mention later in your post, and who
introduced Graham and me when G recently came to talk at Google (we had
"brushed" before, speaking in the same sessions at conferences and the
like, but had never "met", as in, got introduced and _talked_...;-). ]]

But, no matter, let's get back to Graham's point: significant
indentation is a large part of what gives Python its own special beauty,
uncluttered by unneeded punctuation. And while you, I, Graham, and that
common friend of ours, might likely agree that Lisp, while entirely
different, has its own eerie beauty, most people's aesthetics are poles
apart from that (why else would major pure-FP languages such as *ML and
Haskell entirely reject Lisp's surface syntax, willingly dropping the
ease of macros, to introduce infix operator syntax etc...? obviously,
their designers' aesthetics weigh parenthesized prefix syntax negatively,
despite said designers' undeniable depth, skill and excellence).
Alex


--
This is a song that took me ten years to live and two years to write.
- Bob Dylan
May 6 '06 #42



Alex Martelli wrote:
Ken Tilton <ke*******@gmail.com> wrote:
...
Looks like dictionaries are no match for the ambiguity of natural
language. :) Let me try again: it is Python itself that cannot scale, as
in gain "new power and capability", and at least in the case of lambda
it seems to be because of indentation-sensitivity.

In my opinion (and that of several others), the best way for Python to
grow in this regard would be to _lose_ lambda altogether, since named
functions are preferable (and it's an acknowledged Python design
principle that there should ideally be just one obvious way to perform a
task); GvR used to hold the same opinion, but changed his mind recently,
alas, so we'll keep the wart.


Yes, I am enjoying watching lambda teetering on the brink. So it has
been re-upped for another tour? Go, lambda! Go, lambda!

But, quite apart from the whole issue of whether it's desirable for
languages to change massively ("add new power and capability" meaning
new enriched features in the language itself), your whole argument is
bogus: it's obvious that _any_ fundamental design choice in an artefact
will influence the feasibility and desirability of future design choices
in future releases of that same, identical artefact.
True but circular, because my very point is that () was a great design
choice in that it made macros possible and they made CL almost
infinitely extensible, while indentation-sensitivity was a mistaken
design choice because it makes for very clean code (I agree
wholeheartedly) but placed a ceiling on its expressiveness.

As for:
At a syntax-sugar
level, for example, Lisp's choice to use parentheses as delimiter means
it's undesirable, even unfeasible, to use the single character '(' as an
ordinary identifier in a future release of the language.
(defun |(| (aside) (format nil "Parenthetically speaking...~a." aside))
=> |(|
(|(| "your Lisp /is/ rusty.")
=> "Parenthetically speaking...your Lisp /is/ rusty.."

:) No, seriously, is that all you can come up with?
Considering
this to mean that Lisp "cannot scale" is just as ridiculous as
considering that Python "cannot scale" by not having an elegant way to
make lambdas heavier and richer -- totally laughable and idiotic.
Harsh. :) I demand satisfaction. See end of article.
``An
unneeded feature "cannot" be added (elegantly) in future releases of the
language'' is just as trivial and acceptable for the unneeded feature
``allow ( as an ordinary single-character identifier'' as for the
unneeded feature ``allow unnamed functions with all the flexibility of
named ones''.

By contrast, in On Lisp we see Graham toss off Prolog in Chapter 22 and

Oh, is that the same Graham who writes:


So we are going to skip the point I was making about Common Lisp being
so insanely extensible? By /application/ programmers? Hell, for all we
know CL does have a BDFL, we just do not need their cooperation.

"""
A friend of mine who knows nearly all the widely used languages uses
Python for most of his projects. He says the main reason is that he
likes the way source code looks.


No argument. The little Python I wrote while porting Cells to Python was
strikingly attractive. But it was a deal with the devil, unless Python
is content to be just a scripting language. (And it should be.)

OK, I propose a duel. We'll co-mentor this:

http://www.lispnyc.org/wiki.clp?page=PyCells

In the end Python will have a Silver Bullet, and only the syntax will
differ, because Python has a weak lambda, statements do not always
return values, it does not have macros, and I do not know if it has
special variables.
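Kenny's "weak lambda" point can be sketched directly (my example, not his):
CPython limits a lambda body to a single expression, so putting a statement in
it is a compile-time error.

```python
# A lambda body must be a single expression; statements are rejected.
inc = lambda x: x + 1                 # expression: fine
print(inc(41))                        # 42

try:
    compile("lambda x: (y = x)", "<sketch>", "eval")  # assignment statement
except SyntaxError:
    print("statements are not allowed inside lambda")
```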

Then we can just eyeball the code and see if the difference is
interesting. These abstract discussions do tend to loop.

kenny

--
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
Attorney for Mary Winkler, confessed killer of her
minister husband, when asked if the couple had
marital problems.
May 7 '06 #43

Bill Atkins <NO**********@rpi.edu> wrote:
...
``allow ( as an ordinary single-character identifier'' as for the
unneeded feature ``allow unnamed functions with all the flexibility of
named ones''.
Not so infeasible:

(let ((|bizarrely(named()symbol| 3))
  (+ |bizarrely(named()symbol| 4))

;; => 7


Read again what I wrote: I very specifically said "ordinary
*single-character* identifier" (as opposed to "one of many characters
inside a multi-character identifier"). Why do you think I said
otherwise, when you just quoted what I had written? (Even just a
_leading_ ( at the start of an identifier may be problematic -- and just
as trivial as having to give names to functions, of course, see below).

And in any case, enforced indentation is a policy with vastly more
serious consequences than the naming of identifiers.


So far, what was being discussed here isn't -- having to use an
identifier for an object, rather than keeping it anonymous -- trivial.
Python practically enforces names for several kinds of objects, such as
classes and modules as well as functions ("practically" because you CAN
call new.function(...), type(...), etc, where the name is still there
but might e.g. be empty -- not a very practical alternative, though) --
so what? Can you have an unnamed macro in Lisp? Is being "forced" to
name it a "serious consequence"? Pah.
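Alex's new.function/type escape hatch can be sketched in today's Python
(types.FunctionType being the Python 3 spelling of the old new.function):

```python
import types

# A class whose name is the empty string, built with type() instead of class:
Anon = type('', (object,), {'greet': lambda self: 'hi'})
print(Anon().greet())                 # hi
print(repr(Anon.__name__))            # ''

# A function object built straight from a code object, bypassing def:
f = types.FunctionType((lambda x: x * 2).__code__, globals())
print(f(21))                          # 42
```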

Anyway, I repeat: *any* design choice (in a language, or for that matter
any other artefact) has consequences. As Paul Graham quotes and
supports his unnamed friend as saying, Python lets you easily write code
that *looks* good, and, as Graham argues, that's an important issue --
and, please note, a crucial consequence of using significant
indentation. Alien whitespace eating nanoviruses are no more of a worry
than alien parentheses eating nanoviruses, after all.
Alex
May 7 '06 #44

al*****@yahoo.com (Alex Martelli) writes:
Bill Atkins <NO**********@rpi.edu> wrote:
...
> ``allow ( as an ordinary single-character identifier'' as for the
> unneeded feature ``allow unnamed functions with all the flexibility of
> named ones''.
Not so infeasible:

(let ((|bizarrely(named()symbol| 3))
  (+ |bizarrely(named()symbol| 4))

;; => 7


Read again what I wrote: I very specifically said "ordinary
*single-character* identifier" (as opposed to "one of many characters
inside a multi-character identifier"). Why do you think I said
otherwise, when you just quoted what I had written? (Even just a
_leading_ ( at the start of an identifier may be problematic -- and just
as trivial as having to give names to functions, of course, see below).


Well, the same technique can obviously be used for:

(let ((|(| 3))
  (+ |(| 4)))
;; => 7

The length of the identifier is irrelevant...
And in any case, enforced indentation is a policy with vastly more
serious consequences than the naming of identifiers.


So far, what was being discussed here isn't -- having to use an
identifier for an object, rather than keeping it anonymous -- trivial.
Python practically enforces names for several kinds of objects, such as
classes and modules as well as functions ("practically" because you CAN
call new.function(...), type(...), etc, where the name is still there
but might e.g. be empty -- not a very practical alternative, though) --
so what? Can you have an unnamed macro in Lisp? Is being "forced" to
name it a "serious consequence"? Pah.


Common Lisp does not support unnamed macros (how would these be
useful?), but nothing stops me from adding these. What use case do
you envision for anonymous macros?
Anyway, I repeat: *any* design choice (in a language, or for that matter
any other artefact) has consequences. As Paul Graham quotes and
supports his unnamed friend as saying, Python lets you easily write code
that *looks* good, and, as Graham argues, that's an important issue --
and, please note, a crucial consequence of using significant
indentation. Alien whitespace eating nanoviruses are no more of a worry
than alien parentheses eating nanoviruses, after all.
It *is* an important issue, but it's also a subjective issue. I find
Lisp to be far prettier than any syntax-based language, so it's far
from an objective truth that Python code often looks good - or even at
all.

Plus, I can easily write code that looks good without using a language
that enforces indentation rules. Lisp's regular syntax lets Emacs do
it for me with a simple C-M-a C-M-q. What could be easier?


Alex


--
This is a song that took me ten years to live and two years to write.
- Bob Dylan
May 7 '06 #45

Ken Tilton <ke*******@gmail.com> wrote:
...
True but circular, because my very point is that () was a great design
choice in that it made macros possible and they made CL almost
infinitely extensible, while indentation-sensitivity was a mistaken
design choice because it makes for very clean code (I agree
wholeheartedly) but placed a ceiling on its expressiveness.
Having to give functions a name places no "ceiling on expressiveness",
any more than, say, having to give _macros_ a name.

As for:
At a syntax-sugar
level, for example, Lisp's choice to use parentheses as delimiter means
it's undesirable, even unfeasible, to use the single character '(' as an
ordinary identifier in a future release of the language.
(defun |(| (aside) (format nil "Parenthetically speaking...~a." aside))
=> |(|
(|(| "your Lisp /is/ rusty.")
=> "Parenthetically speaking...your Lisp /is/ rusty.."

:) No, seriously, is that all you can come up with?


Interestingly, the SECOND lisper to prove himself unable to read the
very text he's quoting. Reread carefully, *USE THE ***SINGLE***
CHARACTER* ... *AS AN ORDINARY IDENTIFIER*. What makes you read a
``PART OF'' that I had never written? You've shown how to use the
characters as *PART* of an identifier [[and I believe it couldn't be the
very start]], and you appear to believe that this somehow refutes my
assertion?

Are you ready to admit you were utterly wrong, and (while it is indeed
true that my Lisp is rusty) there is nothing in this exchange to show
it, as opposed to showing rustiness in your ability to understand
English? Or shall we move from polite although total dissent right on
to flamewars and namecalling?

The point is, OF COURSE any design choice places limitations on future
design choices; but some limitations are even DESIRABLE (a language
where *every* single isolated character could mean anything whatsoever
would not be "expressive", but rather totally unreadable) or at least
utterly trivial (syntax-sugar level issues most typically are). Wilfully
distorting some such limitation as meaning that one language "cannot scale"
(when EVERY language inevitably has SOME such limitations) is not even
funny, and clearly characterizes a polemist who is intent on proving a
preconceived thesis, as opposed to an investigator with any real
interest in ascertaining the truth of the matter.
Oh, is that the same Graham who writes:


So we are going to skip the point I was making about Common Lisp being
so insanely extensible? By /application/ programmers? Hell, for all we
know CL does have a BDFL, we just do not need their cooperation.


Yes, we are, because the debate about why it's better for Python (as a
language used in real-world production systems, *SCALABLE* to extremely
large-scale ones) to *NOT* be insanely extensible and mutable is a
separate one -- Python's uniformity of style allows SCALABILITY of
teams, and teams-of-teams, which is as crucial in the real world as
obviously not understood by you (the law you misquoted was about adding
personnel to a LATE project making it later -- nothing to do with how
desirable it can be to add personnel to a large and growing collection
of projects, scaling and growing in an agile, iterative way to meet
equally growing needs and market opportunities).

This specific debate grew from your misuse of "scalable" to mean or
imply "a bazillion feechurz can [[and, implicitly, should]] be added to
a language, and therefore anything that stands in the way of feechuritis
is somehow holding the language back". That's bad enough, even though
in its contextual misuse of "scalable" it breaks new ground, and I don't
want to waste even more time re-treading *old* ground as to whether the
"*insane* extensibility" afforded by macros is a good or a bad thing in
a language to be used for real-world software production (as opposed to
prototyping and research).

"""
A friend of mine who knows nearly all the widely used languages uses
Python for most of his projects. He says the main reason is that he
likes the way source code looks.


No argument. The little Python I wrote while porting Cells to Python was
strikingly attractive. But it was a deal with the devil, unless Python
is content to be just a scripting language. (And it should be.)


It's hard to attribute feelings to a programming language, but, if you
really must, I'd say Python aspires to be *useful* -- if all you need is
"just a scripting language", it will be content to be one for you, and
if you need SCALE, well then, PYTHON IS SCALABLE, and will remain a
*SIMPLE, CLEAN, LITTLE AND POWERFUL LANGUAGE* (letting nobody do
anything INSANE to it;-) while scaling up to whatever size of project(s)
you need (including systems so large that they redefine the very concept
of "large scale" -- believe me, once in a while at a conference I make
the mistake of going to some talk about "large scale" this or that, and
invariably stagger out once again with the realization that what's
"large scale" to the world tends to be a neat toy-sized throwaway little
experiment to my current employer).

OK, I propose a duel. We'll co-mentor this:

http://www.lispnyc.org/wiki.clp?page=PyCells

In the end Python will have a Silver Bullet, and only the syntax will
differ, because Python has a weak lambda, statements do not always
return values, it does not have macros, and I do not know if it has
special variables.

Then we can just eyeball the code and see if the difference is
interesting. These abstract discussions do tend to loop.


As a SummerOfCode mentor, I'm spoken for, and can't undertake to mentor
other projects. I do agree that these discussions can be sterile, and
I'll be glad to see what comes of your "pycells" project, but until then
there's little we can do except agree to disagree (save that I'd like
you to acknowledge my point above, regarding what exactly I had said and
the fact that your alleged counterexample doesn't address at all what I
had said -- but, I'll live even without such an acknowledgment).
Alex
May 7 '06 #46

Bill Atkins <NO**********@rpi.edu> wrote:
...

Read again what I wrote: I very specifically said "ordinary
*single-character* identifier" (as opposed to "one of many characters
inside a multi-character identifier"). Why do you think I said
otherwise, when you just quoted what I had written? (Even just a
_leading_ ( at the start of an identifier may be problematic -- and just
as trivial as having to give names to functions, of course, see below).
Well, the same technique can obviously be used for:

(let ((|(| 3))
  (+ |(| 4)))
;; => 7

The length of the identifier is irrelevant...


But it cannot be a SINGLE CHARACTER, *just* the open parenthesis.

Wow, it's incredible to me that you STILL can't read, parse and
understand what I have so clearly expressed and repeated!

Common Lisp does not support unnamed macros (how would these be
useful?), but nothing stops me from adding these. What use case do
you envision for anonymous macros?
None, just like there is none for anonymous functions -- there is
nothing useful I can do with anonymous functions that I cannot do with
named ones.

Anyway, I repeat: *any* design choice (in a language, or for that matter
any other artefact) has consequences. As Paul Graham quotes and
supports his unnamed friend as saying, Python lets you easily write code
that *looks* good, and, as Graham argues, that's an important issue --
and, please note, a crucial consequence of using significant
indentation. Alien whitespace eating nanoviruses are no more of a worry
than alien parentheses eating nanoviruses, after all.


It *is* an important issue, but it's also a subjective issue. I find
Lisp to be far prettier than any syntax-based language, so it's far
from an objective truth that Python code often looks good - or even at
all.


The undeniable truth, the objective fact, is that *to most programmers*
(including ones deeply enamored of Lisp, such as Graham, Tilton, Norvig,
...) Python code looks good; the Lisp code that looks good to YOU (and,
no doubt them), and palatable to me (I have spoken of "eerie beauty"),
just doesn't to most prospective readers. If you program on your own,
or just with a few people who share your tastes, then only your taste
matters; if you want to operate in the real world, maybe, as I've
already pointed out, to build up a successful firm faster than had ever
previously happened, this *DOESN'T SCALE*. Essentially the same issue
I'm explaining on the parallel subthread with Tilton, except that he
fully agrees with my aesthetic sense (quoting Tilton, "No argument. The
little Python I wrote while porting Cells to Python was strikingly
attractive") so this facet of the jewel needed no further belaboring
there.

Plus, I can easily write code that looks good without using a language
that enforces indentation rules. Lisp's regular syntax lets Emacs do
it for me with a simple C-M-a C-M-q. What could be easier?


If you need to edit and reformat other people's code with Emacs to find
it "looks good", you've made my point: code exists to be read, far more
than it's written, and Python's design choice to keep punctuation scarce
and unobtrusive obviates the need to edit and reformat code that way.
Alex

May 7 '06 #47

al*****@yahoo.com (Alex Martelli) writes:
Ken Tilton <ke*******@gmail.com> wrote:
...
True but circular, because my very point is that () was a great design
choice in that it made macros possible and they made CL almost
infinitely extensible, while indentation-sensitivity was a mistaken
design choice because it makes for very clean code (I agree
wholeheartedly) but placed a ceiling on its expressiveness.
Having to give functions a name places no "ceiling on expressiveness",
any more than, say, having to give _macros_ a name.

As for:
> At a syntax-sugar
> level, for example, Lisp's choice to use parentheses as delimiter means
> it's undesirable, even unfeasible, to use the single character '(' as an
> ordinary identifier in a future release of the language.


(defun |(| (aside) (format nil "Parenthetically speaking...~a." aside))
=> |(|
(|(| "your Lisp /is/ rusty.")
=> "Parenthetically speaking...your Lisp /is/ rusty.."

:) No, seriously, is that all you can come up with?


Interestingly, the SECOND lisper to prove himself unable to read the
very text he's quoting. Reread carefully, *USE THE ***SINGLE***
CHARACTER* ... *AS AN ORDINARY IDENTIFIER*. What makes you read a
``PART OF'' that I had never written? You've shown how to use the
characters as *PART* of an identifier [[and I believe it couldn't be the
very start]], and you appear to believe that this somehow refutes my
assertion?


Now I see what the problem is here - you just don't know what you're
talking about. The identifier in Ken's and my samples *is* a single
character identifier. The vertical bars tell the Lisp reader that
what's between them is exempt from other reading rules.

(symbol-name '|(| ) => "("

(length (symbol-name '|(| )) => 1
Are you ready to admit you were utterly wrong, and (while it is indeed
true that my Lisp is rusty) there is nothing in this exchange to show
it, as opposed to showing rustiness in your ability to understand
English? Or shall we move from polite although total dissent right on
to flamewars and namecalling?
Believe it or not, _you_ got it wrong.
The point is, OF COURSE any design choice places limitations on future
design choices; but some limitations are even DESIRABLE (a language
where *every* single isolated character could mean anything whatsoever
would not be "expressive", but rather totally unreadable) or at least
utterly trivial (syntax-sugar level issues most typically are). Wilfully
distorting some such limitation as meaning that one language "cannot scale"
(when EVERY language inevitably has SOME such limitations) is not even
funny, and clearly characterizes a polemist who is intent on proving a
preconceived thesis, as opposed to an investigator with any real
interest in ascertaining the truth of the matter.
Having to name a variable "paren" instead of "(" is not a serious
restriction. I can't think of a single situation where being able to
do so would be useful.

That said, raw, out-of-the-box Common Lisp can accommodate you if you
both a) need variables named "(" and b) are unwilling to use the bar
syntax. Simply redefine the parenthesis characters in the readtable
(a matter of four function calls) and get this abomination:

{let {{( 3}}
{+ ( 5}}

Lisp places no restrictions on you, even when your goal is as silly as
this one.
> Oh, is that the same Graham who writes:


So we are going to skip the point I was making about Common Lisp being
so insanely extensible? By /application/ programmers? Hell, for all we
know CL does have a BDFL, we just do not need their cooperation.


Yes, we are, because the debate about why it's better for Python (as a
language used in real-world production systems, *SCALABLE* to extremely
large-scale ones) to *NOT* be insanely extensible and mutable is a
separate one -- Python's uniformity of style allows SCALABILITY of
teams, and teams-of-teams, which is as crucial in the real world as
obviously not understood by you (the law you misquoted was about adding
personnel to a LATE project making it later -- nothing to do with how
desirable it can be to add personnel to a large and growing collection
of projects, scaling and growing in an agile, iterative way to meet
equally growing needs and market opportunities).


Buh? The project doesn't have to be late for Brooks's law to hold;
adding programmers, so goes Brooks's reasoning, will always increase the
time required to complete the project because of various communication
issues.
This specific debate grew from your misuse of "scalable" to mean or
imply "a bazillion feechurz can [[and, implicitly, should]] be added to
a language, and therefore anything that stands in the way of feechuritis
is somehow holding the language back". That's bad enough, even though
in its contextual misuse of "scalable" it breaks new ground, and I don't
want to waste even more time re-treading *old* ground as to whether the
"*insane* extensibility" afforded by macros is a good or a bad thing in
a language to be used for real-world software production (as opposed to
prototyping and research).

> """
> A friend of mine who knows nearly all the widely used languages uses
> Python for most of his projects. He says the main reason is that he
> likes the way source code looks.
No argument. The little Python I wrote while porting Cells to Python was
strikingly attractive. But it was a deal with the devil, unless Python
is content to be just a scripting language. (And it should be.)


It's hard to attribute feelings to a programming language, but, if you
really must, I'd say Python aspires to be *useful* -- if all you need is
"just a scripting language", it will be content to be one for you, and
if you need SCALE, well then, PYTHON IS SCALABLE, and will remain a
*SIMPLE, CLEAN, LITTLE AND POWERFUL LANGUAGE* (letting nobody do
anything INSANE to it;-) while scaling up to whatever size of project(s)
you need (including systems so large that they redefine the very concept
of "large scale" -- believe me, once in a while at a conference I make
the mistake of going to some talk about "large scale" this or that, and
invariably stagger out once again with the realization that what's
"large scale" to the world tends to be a neat toy-sized throwaway little
experiment to my current employer).


You haven't given much justification for the claim that Python is a
particularly "scalable" language. Sure, Google uses it, Graham gave
it some props somewhere in the middle of his notoriously pro-Lisp
writings, and even Norvig has said good things about it.

Fair enough. But what does Python offer above any garbage-collected
language that makes it so scalable?
OK, I propose a duel. We'll co-mentor this:

http://www.lispnyc.org/wiki.clp?page=PyCells

In the end Python will have a Silver Bullet, and only the syntax will
differ, because Python has a weak lambda, statements do not always
return values, it does not have macros, and I do not know if it has
special variables.

Then we can just eyeball the code and see if the difference is
interesting. These abstract discussions do tend to loop.


As a SummerOfCode mentor, I'm spoken for, and can't undertake to mentor
other projects. I do agree that these discussions can be sterile, and
I'll be glad to see what comes of your "pycells" project, but until then
there's little we can do except agree to disagree (save that I'd like
you to acknowledge my point above, regarding what exactly I had said and
the fact that your alleged counterexample doesn't address at all what I
had said -- but, I'll live even without such an acknowledgment).
Alex


--
This is a song that took me ten years to live and two years to write.
- Bob Dylan
May 7 '06 #48

al*****@yahoo.com (Alex Martelli) writes:
(|(| "your Lisp /is/ rusty.")
Interestingly, the SECOND lisper to prove himself unable to read the
very text he's quoting. Reread carefully, *USE THE ***SINGLE***
CHARACTER* ... *AS AN ORDINARY IDENTIFIER*. What makes you read a
``PART OF'' that I had never written? You've shown how to use the
characters as *PART* of an identifier [[and I believe it couldn't be the
very start]], and you appear to believe that this somehow refutes my
assertion?


The identifier there is a single paren. The vertical bars are used to
escape the paren, so that the reader doesn't get confused. The Pythonic
equivalent would be something like

\( = 5

where the backslash escapes the paren. In real Python you could say:

locals()['('] = 5

In Lisp you could get rid of the need to escape the paren if you
wanted, using suitable read macros. Whether that's a good idea is of
course a different matter.
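A footnote to the locals() line above (my sketch, not the poster's): in CPython
the write is only dependable at module scope, where locals() and globals() are
the same dict.

```python
# At module scope, locals() is globals(), so the "(" entry really persists:
globals()['('] = 5
print(globals()['('] + 4)   # 9

# Inside a function body, CPython does not guarantee that writes through
# locals() survive, so the trick is only dependable at module scope.
```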
Yes, we are, because the debate about why it's better for Python (as a
language used in real-world production systems, *SCALABLE* to extremely
large-scale ones) to *NOT* be insanely extensible and mutable is a
separate one -- Python's uniformity of style allows SCALABILITY of
teams, and teams-of-teams, which is as crucial in the real world ...
My current take on Lisp vs Python is pretty close to Peter Norvig's
(http://www.norvig.com/python-lisp.html):

Python has the philosophy of making sensible compromises that make
the easy things very easy, and don't preclude too many hard
things. In my opinion it does a very good job. The easy things are
easy, the harder things are progressively harder, and you tend not
to notice the inconsistencies. Lisp has the philosophy of making
fewer compromises: of providing a very powerful and totally
consistent core. This can make Lisp harder to learn because you
operate at a higher level of abstraction right from the start and
because you need to understand what you're doing, rather than just
relying on what feels or looks nice. But it also means that in
Lisp it is easier to add levels of abstraction and complexity;
Lisp makes the very hard things not too hard.
It's hard to attribute feelings to a programming language, but, if you
really must, I'd say Python aspires to be *useful* -- if all you need is
"just a scripting language", it will be content to be one for you, and
if you need SCALE, well then, PYTHON IS SCALABLE, and will remain a
*SIMPLE, CLEAN, LITTLE AND POWERFUL LANGUAGE* (letting nobody do
anything INSANE to it;-) while scaling up to whatever size of project(s)
you need (including systems so large that they redefine the very concept
of "large scale" -- believe me, once in a while at a conference I make
the mistake of going to some talk about "large scale" this or that, and
invariably stagger out once again with the realization that what's
"large scale" to the world tends to be a neat toy-sized throwaway little
experiment to my current employer).


I've heard many times that your current employer uses Python for all
kinds of internal tools; I hadn't heard that it was used in Very Large
projects over there. I'd be interested to hear how that's been
working out, since the biggest Python projects I'd heard of before
(e.g. Zope) are, as you say, toy-sized throwaways compared to the
stuff done regularly over there at G.
May 7 '06 #49

Bill Atkins <NO**********@rpi.edu> writes:
Fair enough. But what does Python offer above any garbage-collected
language that makes it so scalable?


I think what used to be Lisp culture now uses the *ML languages or
Haskell. It's only throwbacks (which includes me sometimes) who still
use Lisp. I've been wanting for a while to do something in ML but
just haven't worked up enough steam for it. Python really does make
small and medium-sized tasks easy, even compared with Lisp. I'm still
reserving judgement about how it is at large tasks.
May 7 '06 #50

267 Replies
