Bytes | Developer Community
Python syntax in Lisp and Scheme

I think everyone who used Python will agree that its syntax is
the best thing going for it. It is very readable and easy
for everyone to learn. But, Python does not have very good
macro capabilities, unfortunately. I'd like to know if it may
be possible to add a powerful macro system to Python, while
keeping its amazing syntax, and if it could be possible to
add Pythonistic syntax to Lisp or Scheme, while keeping all
of the functionality and convenience. If the answer is yes,
would many Python programmers switch to Lisp or Scheme if
they were offered indentation-based syntax?
Jul 18 '05
> Lisp doesn't let you do that, because it turns out to be a bad idea.
Only if you subscribe to the Java-mentality that power is bad. Operator
overloading is a must in languages that strive to make user-defined types
fully equal to built-in types (like Dylan, and Goo, maybe CLOS). Besides,
prohibiting operator-overloading is not orthogonal in languages with
generic dispatch, because it creates a pointless schism between built-in
operators and user-defined types.
> When you go reading someone's program, what you really want is for
> the standard operators to be doing the standard and completely
> understood thing.
Why? Usually, you cannot overload operators on built-in types. So an integer
plus an integer will always have a specific meaning. However a matrix plus
a matrix can have a user-defined meaning, assuming matrices are
user-defined types. Now, a programmer could go ahead and make an overload
of operator+ that actually subtracted matrices, but he could just as well
write a function "add-matrix" that actually subtracted matrices. And
assuming that the programmer isn't trying to lie to you, it's much easier to
understand

(+ m1 m2 m3)

than

(add-matrix m1 m2 m3)

because '+' carries with it an expected set of semantics, while the reader
has to go to the definition of add-matrix to fully understand its semantics.
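The same argument applies in Python, where operator overloading lets a user-defined type keep the well-understood semantics of +. A minimal sketch (this Matrix class is a made-up illustration, not any real library):

```python
class Matrix:
    """Toy 2-D matrix, just enough to demonstrate operator overloading."""
    def __init__(self, rows):
        self.rows = [list(r) for r in rows]

    def __add__(self, other):
        # '+' keeps its expected elementwise meaning for matrices.
        return Matrix([[a + b for a, b in zip(r1, r2)]
                       for r1, r2 in zip(self.rows, other.rows)])

m1 = Matrix([[1, 2], [3, 4]])
m2 = Matrix([[5, 6], [7, 8]])
print((m1 + m2).rows)  # [[6, 8], [10, 12]]
```

Nothing stops a perverse author from making __add__ subtract, of course, which is exactly the point: the operator is a promise of expected semantics, not an enforcement.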
> Lisp has carefully defined those operations to do all the things
> that are both well-understood (like complex numbers) and missing
> from languages like C++.

Actually, C++ has a complex number class in its standard library. Why not in
the language proper? There is no need for it to be in the language, because
it can be implemented in the library just as easily. It's the same reason
'loop' is a macro and not built into the language. Besides, what do you do
when you need quaternions? Wait for the next version of the standard?
Jul 18 '05 #251
Kenny Tilton <kt*****@nyc.rr.com> writes:
I think Python's problem is its success. Whenever something is
successful, the first thing people want is more features. Hell, that is
how you know it is a success. The BDFL still talks about simplicity,
but that is history. GvR, IMHO, should have chased wish-listers away with
"use Lisp" and kept his gem small and simple.


That's silly. Something being successful means people want to use it
to get things done in the real world. At that point they start
needing the tools that other languages provide for dealing with the
real world. The real world is not a small and simple place, and small
simple systems are not always enough to cope with it. If GVR had kept
his gem small and simple, it would have remained an academic toy, and
I think he had wider-reaching ambitions than that.
Jul 18 '05 #252

"Dave Benjamin" <da**@3dex.com> wrote in message
news:u%************@news1.central.cox.net...
Yeah, wasn't something like that up on ASPN? That's an interesting
trick... are you sure it's not supposed to be "property(*aprop())"
though? (who's being pedantic now? =)


Hi.

The idiom/recipe is at
http://aspn.activestate.com/ASPN/Coo.../Recipe/205183

Sean
Jul 18 '05 #253
> sure, but it seems like no one was able to get conceivable (virtual) inner
> classes,
That's because Lisp has closures, and because CL doesn't mandate access
protections for classes.
> methods inside methods,
You can use a lambda to accomplish the same thing.
> virtual methods (yeah I know about those stupid generic functions :),
Generic functions are just virtual methods generalized to multiple dispatch.
> method overloading
Method overloading is just a special case of generic dispatch in situations
where the types of the dispatch arguments are known at compile-time. In
situations where method overloading could be applicable, GF dispatch
doesn't even have a performance hit over static method overloading because
the compiler can optimize out the generic dispatch.
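The generic-dispatch idea can be sketched in Python with a toy registry keyed on argument types (an illustration of the concept only, not CLOS's actual dispatch rules):

```python
# Toy multiple dispatch: implementations are registered under a tuple
# of argument types and looked up by the runtime types of the arguments.
_registry = {}

def defmethod(*types):
    def register(fn):
        _registry[types] = fn
        return fn
    return register

def dispatch(*args):
    # Select the implementation from the dynamic types of ALL arguments.
    fn = _registry.get(tuple(type(a) for a in args))
    if fn is None:
        raise TypeError("no applicable method")
    return fn(*args)

@defmethod(int, int)
def _(x, y):
    return x + y

@defmethod(str, int)
def _(x, y):
    return x * y

print(dispatch(2, 3))     # 5
print(dispatch("ab", 2))  # abab
```

When the argument types are known at compile time, a compiler can resolve this lookup statically, which is the "no performance hit" claim above.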
> A decent API (I tried playing with it.. it doesn't even have a freaking
> date library as standard ;-p
Google for one! What's the point of having every conceivable library in the
language?

> Yes, I agree that compile-time macro expansion is a nice thing.
> However, if I want to do some serious changes to the structure of objects
> and classes (i.e. create a new kind of object) then I have to spend a
> long time finding out how the CLOS people hacked together their
> representation of classes, methods, method calls etc...

There is a MOP provided expressly for this purpose.
Jul 18 '05 #254
Sean Ross wrote:
Yeah, wasn't something like that up on ASPN? That's an interesting
trick... are you sure it's not supposed to be "property(*aprop())"
though? (who's being pedantic now? =)


Hi.

The idiom/recipe is at
http://aspn.activestate.com/ASPN/Coo.../Recipe/205183


Thanks, Sean.

Jul 18 '05 #255
In comp.lang.lisp Dave Benjamin <da**@3dex.com> wrote:
Karl A. Krueger wrote:
This seems like a good juncture to post my list of common myths and
misconceptions about popular programming languages. Contributions are
welcome; flames only if they're funny. Anyone who needs to see :) on
things to know they're meant in jest should stop reading now.


Haha... that's really funny... except the last one. Not that I'm a
Python purist (a big fan, yes, but not a purist), but I rarely complain
about its slowness. Java is too easy of a target for that one... =)


*laugh* I use Python more often in my job than I use Lisp, Perl, or any
other language, except possibly the Unix shell. It really is not among
the speedier ones for a lot of tasks. (Neither is Java, but I have
thankfully avoided having to do any real work in Java.) Python's
strength, for the kind of projects I work on in it, is its regularity
and the extensive standard library for things like talking to network
applications.

I moved to Python for such things from Perl, after realizing that I
really did not want to implement a database-backed application in a
language that required messy explicit dereferencing when dealing with
complex data structures. I'd much rather deal with a list of lists of
tuples than a list of references to lists of references to tuples,
getting back things like "ARRAY<#fhqwhgads>" when I missed a dereference
character.

But you're taking my list of myths too seriously. Not all C programs
are riddled with security bugs either. :)

Incidentally, I regard objections to "the whitespace thing" in Python
and objections to "the parenthesis thing" in Lisp as more or less the
same. People who raise these objections are usually just saying "Ick!
This looks so unfamiliar to me!" in the language of rationalizations.
I guess a philosopher would say that I am an emotivist about notation
criticisms.

--
Karl A. Krueger <kk******@example.edu>
Woods Hole Oceanographic Institution
Email address is spamtrapped. s/example/whoi/
"Outlook not so good." -- Magic 8-Ball Software Reviews
Jul 18 '05 #256
Dave Benjamin wrote:
Mike Rovner wrote:
Unnamed code blocks considered evil :), use named instead
(functions).
Why are they evil? Does being anonymous automatically make you evil?

For instance, I always thought this was a cooler alternative to the
try/finally block to ensure that a file gets closed (I'll try not to
mess up this time... ;) :

open('input.txt', { |f|
    do_something_with(f)
    do_something_else_with(f)
})

Rather than:

f = open('input.txt')
try:
    do_something_with(f)
    do_something_else_with(f)
finally:
    f.close()


"Explicit is better than implicit"

Even your example clearly shows that the try block is much more readable and
understandable.
That's why it's considered evil by the majority of Python developers.
> But the anonymous version still looks more concise to me.
Python prioritizes things differently than other languages.
It's not an APL. "Readability counts"
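As an aside, later Python versions settled on a middle ground for this exact pattern: the with statement runs the try/finally once, inside the file object, so call sites stay about as concise as the block version (do_something_with below is a stand-in for the hypothetical function in the example):

```python
# Create a sample file so the sketch is self-contained.
with open('input.txt', 'w') as out:
    out.write('hello\n')

def do_something_with(f):
    # Stand-in for the example's hypothetical function.
    return f.read()

# The file is closed on exit from the block, even if the body raises.
with open('input.txt') as f:
    data = do_something_with(f)

print(f.closed)  # True
```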
Yeah, wasn't something like that up on ASPN? That's an interesting
trick... are you sure it's not supposed to be "property(*aprop())"
though? (who's being pedantic now? =)


Yeah, right. Thanks for pointing that out.

Mike


Jul 18 '05 #257
Pascal Costanza:
He provides more information at http://www.paulgraham.com/icad.html


The page I referenced (http://www.paulgraham.com/power.html)
appears to be a refinement of some ideas on that page.

It's somewhat funny that on the page I mentioned he says:

So any language comparison where you have to meet a predefined
spec is testing slightly the wrong thing.

while on the page you reference he gives a comparison between various
languages based on a predefined spec. And he uses as his test case
something I've not needed. By comparison, following Kenny Tilton's
lead, the Bagley Shootout Page suggests that in an aggregate of small
but non-trivial problems, OCaml is the most succinct language, with
Python roughly comparable to Lisp/Scheme.

I have run across his pages before, and have a hard time
sympathizing with his view of things. For example, the start of
the icad essay mentions that Lisp is already "kind of unusual"
compared to C because it includes a full interpreter. But
effectively all Python programs shipped include a full interpreter
as well, and many give access to that interpreter, so I don't
see what's unusual about it. Ditto for Tcl apps. Even some of
my command-line perl apps included a way to pass in perl
code on the command line, as for a search filter.

The phrase "they had hard-headed engineering reasons for
making the syntax look so strange." reminds me of the statement
"better first rate salespeople and second rate engineers than
second rate salespeople and first rate engineers" (and better
first rate both). That's saying *nothing* about the languages;
it's saying that his viewpoint seems to exclude the idea that
there are hard-headed non-engineering reasons for doing things.

Consider one of those "hard-headed engineering reasons", at
http://www.paulgraham.com/popular.html

It has sometimes been said that Lisp should use first and
rest instead of car and cdr, because it would make programs
easier to read. Maybe for the first couple hours. But a hacker
can learn quickly enough that car means the first element
of a list and cdr means the rest. Using first and rest means
50% more typing. And they are also different lengths, meaning
that the arguments won't line up when they're called,

That to me is a solid case of post hoc ergo propter hoc. The
words "1st" and "rst" are equally as short and easier to
memorize. And if terseness were very important, then
what about using "." for car and ">" for cdr? No, the reason
is that that's the way it started and it will stay that way
because of network effects -- is that a solid engineering
reason? Well, it depends, but my guess is that he wouldn't
weight strongly the impact of social behaviours as part of
good engineering. I do.

And entirely off the topic of programming, his essay at
http://www.paulgraham.com/nerds.html
has little resonance with my memory of high school.

Andrew
da***@dalkescientific.com
Jul 18 '05 #258
On Thu, 9 Oct 2003, Andrew Dalke wrote:
Consider one of those "hard-headed engineering reasons", at
http://www.paulgraham.com/popular.html

It has sometimes been said that Lisp should use first and
rest instead of car and cdr, because it would make programs
easier to read. Maybe for the first couple hours. But a hacker
can learn quickly enough that car means the first element
of a list and cdr means the rest. Using first and rest means
50% more typing. And they are also different lengths, meaning
that the arguments won't line up when they're called,

That to me is a solid case of post hoc ergo propter hoc. The
words "1st" and "rst" are equally as short and easier to
memorize. And if terseness were very important, then
what about using "." for car and ">" for cdr? No, the reason
is that that's the way it started and it will stay that way
because of network effects -- is that a solid engineering
reason? Well, it depends, but my guess is that he wouldn't
weight strongly the impact of social behaviours as part of
good engineering. I do.


It hasn't stayed that way for me:

(define first car)
(define rest cdr)

:)

- Daniel
Jul 18 '05 #259
Mike Rovner wrote:
Dave Benjamin wrote:
For instance, I always thought this was a cooler alternative to the
try/finally block to ensure that a file gets closed (I'll try not to
mess up this time... ;) :

open('input.txt', { |f|
    do_something_with(f)
    do_something_else_with(f)
})

Rather than:

f = open('input.txt')
try:
    do_something_with(f)
    do_something_else_with(f)
finally:
    f.close()
"Explicit is better than implicit"


In that case, why do we eschew code blocks, yet have no problem with the
implicit invocation of an iterator, as in:

for line in file('input.txt'):
    do_something_with(line)

This is not to say that I dislike that behavior; in fact, I find it
*beneficial* that the manner of looping is *implicit* because you can
substitute a generator for a sequence without changing the usage. But
there's little readability difference, IMHO, between that and:

file('input.txt').each_line({ |line|
    do_something_with(line)
})

Plus, the first example is only obvious because I called my iteration
variable "line", and because this behavior is already widely known. What
if I wrote:

for byte in file('input.dat'):
    do_something_with(byte)

That would be a bit misleading, no? But the mistake isn't obvious. OTOH,
in the more explicit (in this case) Ruby language, it would look silly:

open('input.txt').each_line { |byte|
    # huh? why a byte? we said each_line!
}

I think this is important to point out, because the implicit/explicit
rule comes up all the time, yet Python is implicit about lots of things!
To name a few:

- for loops and iterators
- types of variables
- dispatching via subclass polymorphism
- coercion (int->float, int->long...)
- exceptions (in contrast with Java's checked exceptions)
- __magic_methods__
- metaclasses
- nested scopes (compared to yesteryear's lambda x, y=y, z=z: ...)
- list comprehensions

In all of the above cases (with a bit of hesitation toward the voodoo of
metaclasses) I think Python is a better language for it. On the other
hand, Perl's implicit $_ variable is a good example of the hazards of
implicitness; that can be downright confusing. So, it's not cut and dry
by any means.
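The substitutability claim above is easy to demonstrate: the same for loop consumes an eager list or a lazy generator unchanged (uppercase_lines is a made-up example):

```python
def uppercase_lines(lines):
    # A generator: values are produced lazily, one per loop iteration.
    for line in lines:
        yield line.upper()

data = ['alpha\n', 'beta\n']

# Identical loop body for a list and a generator; the manner of
# looping stays implicit in the iterator protocol.
for source in (data, uppercase_lines(data)):
    for line in source:
        print(line, end='')
```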

If all you're saying is that naming something is better than not naming
something because explicit is better than implicit, I'd have to ask why:

a = 5
b = 6
c = 7
d = a + b
e = c / 2
result = d + e
return result

Is any better than:

....
return (a + b) + (c / 2)

To me, it's the same issue. Why should I have to name something that I'm
just going to return in the next statement, or pass as a parameter, and
then be done with it? Does that really increase either readability or
understandability? Why should I name something that I'm not going to ask
for later?
Even your example clearly shows that the try block is much more readable and
understandable.
That's why it's considered evil by the majority of Python developers.


Readability is a moving target. I think that the code block syntax
strikes a nice balance between readability and expressiveness. As far as
what the majority of Python developers consider evil, I don't think
we've got the stats back on that one.
But the anonymous version still looks more concise to me.


Python prioritizes things differently than other languages.
It's not an APL. "Readability counts"


This is nothing like APL... if anything, it's like Smalltalk, a language
designed to be readable by children! I realize that APL sacrificed
readability for expressiveness to an uncomfortable extreme, but I really
think you're comparing apples and oranges here. List comprehensions are
closer to APL than code blocks.

Dave

Jul 18 '05 #260
In article <ma**********************************@python.org >, Lulu of the Lotus-Eaters wrote:
Dave Benjamin <da**@3dex.com> wrote previously:
|return { |x, y|
| print x
| print y
|}
|It's unambiguous because no dictionary literal would ever start with
|'{|', it looks almost identical to a certain other language <g>

Btw. I think Dave is thinking of Ruby as that "certain other language."
But Clipper/xBase used the same syntax for the same thing before Ruby
was a glimmer in Matz' eye. I'm not sure if that's where he got it
though... it might be from somewhere older I don't know about.


Busted. =) I could be wrong, but I thought Ruby got its code blocks from
Smalltalk, since both languages support them, and both have collections that
support "collect", "select", and "invoke", aka "map", "filter", and
"reduce", using code blocks. The syntax isn't exactly the same, but it's
very similar.

My coworker used to be a Clipper programmer, and he gets a sparkle in his
eye when he reminisces about the code blocks of old.

Dave

--
..:[ dave benjamin (ramenboy) -:- www.ramenfest.com -:- www.3dex.com ]:.
: d r i n k i n g l i f e o u t o f t h e c o n t a i n e r :
Jul 18 '05 #261


Andrew Dalke wrote:
pr***********@comcast.net:
So either the syntax doesn't make a whole hell of a lot of difference
in readability, or readability doesn't make a whole hell of a lot of
difference in utility.

Or the people who prefer the awesome power that is Lisp and
Scheme don't find the limited syntax to be a problem.


OK, here you are saying that only a certain subset of programmers have
the genetic makeup necessary to prefer the unlimited syntax. Unlikely.

It is nice that you also say this strongly correlates with those who
like their programming languages to be powerful. But my guess is that
that genetic thing won't hold up: if everyone who tries Lisp for more
than two weeks gets used to the parens, and then after two months would
not want to edit any other way, then what makes me think anyone who
likes programming (even the ones in your imagination who do not want
powerful languages) would love the syntax.

({function | macro | special-form} arguments*) => values*

Roughly speaking (I'm no BNFer) but... hellasweet! You can chant
"simplicity" as much as you like, but /that/ is simple.

In C i put "int x;" at the top level and I have a global. In Lisp I am
at first astonished to see I have to type (defparameter x) or defvar or
defconstant. Wordy! Wordy! Wordy! But eventually I realize. Oh, why is
the C "x" a global? There is no difference between that declaration and
the one in:

void zzzz () {
int x;
.....}

Oh, well, you see, it is at the top level. ie, Weird things like a form
being at the top level have huge unspoken implications. But with Lisp
there is a great honking special form (macro?) such as "defparameter"
that grabs you by the lapels and screams "GLOBAL!!!". But most of all,
it manages to get the job done with The One True Syntax, using a
dedicated macro instead of a special syntax (top-levelness) to establish
the semantics.

You call that "limited"? Yer just flaming.

Anyway, the point is: fuggedaboutit. Lisp syntax is unlike all the other
languages you have ever used, so it seems bizarre to the uninitiated. But
go to c.l.l. and flame the syntax and you will discover it is something
Lispniks love. Including those who have used and excelled at all the
other languages. And, as you concede, these are people who groove on the
power of Lisp. Doesn't that make folks think they better go check out
for themselves what these folks have discovered?

http://alu.cliki.net/The%20RtLS%20by%20Road

Don't forget your Lisp-aware editor.
--
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey

Jul 18 '05 #262
Andrew Dalke:
The phrase "they had hard-headed engineering reasons for
making the syntax look so strange." reminds me of the statement
"better first rate salespeople and second rate engineers than
second rate salespeople and first rate engineers" (and better
first rate both). That's saying *nothing* about the languages;
it's saying that his viewpoint seems to exclude the idea that
there are hard-headed non-engineering reasons for doing things.

Consider one of those "hard-headed engineering reasons", at
http://www.paulgraham.com/popular.html

It has sometimes been said that Lisp should use first and
rest instead of car and cdr, because it would make programs
easier to read. Maybe for the first couple hours. But a hacker
can learn quickly enough that car means the first element
of a list and cdr means the rest. Using first and rest means
50% more typing. And they are also different lengths, meaning
that the arguments won't line up when they're called,

That to me is a solid case of post hoc ergo propter hoc. The
words "1st" and "rst" are equally as short and easier to
memorize. And if terseness were very important, then
what about using "." for car and ">" for cdr? No, the reason
is that that's the way it started and it will stay that way
because of network effects -- is that a solid engineering
reason? Well, it depends, but my guess is that he wouldn't
weight strongly the impact of social behaviours as part of
good engineering. I do.


It's pretty funny when you consider that car and cdr were named after the
Contents of Address Register and Contents of Decrement Register on the IBM
704. Now that's a solid engineering reason!

(I'm not knocking Lisp; in fact, this discussion has whetted my appetite to
explore it.)

-Mike
Jul 18 '05 #263
Michael Geary wrote:
Andrew Dalke:
The phrase "they had hard-headed engineering reasons for
making the syntax look so strange." reminds me of the statement
"better first rate salespeople and second rate engineers than
second rate salespeople and first rate engineers" (and better
first rate both). That's saying *nothing* about the languages;
it's saying that his viewpoint seems to exclude the idea that
there are hard-headed non-engineering reasons for doing things.

Consider one of those "hard-headed engineering reasons", at
http://www.paulgraham.com/popular.html

It has sometimes been said that Lisp should use first and
rest instead of car and cdr, because it would make programs
easier to read. Maybe for the first couple hours. But a hacker
can learn quickly enough that car means the first element
of a list and cdr means the rest. Using first and rest means
50% more typing. And they are also different lengths, meaning
that the arguments won't line up when they're called,

That to me is a solid case of post hoc ergo propter hoc. The
words "1st" and "rst" are equally as short and easier to
memorize. And if terseness were very important, then
what about using "." for car and ">" for cdr? No, the reason
is that that's the way it started and it will stay that way
because of network effects -- is that a solid engineering
reason? Well, it depends, but my guess is that he wouldn't
weight strongly the impact of social behaviours as part of
good engineering. I do.

It's pretty funny when you consider that car and cdr were named after the
Contents of Address Register and Contents of Decrement Register on the IBM
704. Now that's a solid engineering reason!

(I'm not knocking Lisp; in fact, this discussion has whetted my appetite to
explore it.)

-Mike

Graham does admit that the reasons for the choice were mostly
historical. However, he uses them because he likes the fact that they
are shorter than first and rest.

If you read his design goals for Arc you will note that he is a big fan
of very terse operators.

--
Doug Tolton
(format t "~a@~a~a.~a" "dtolton" "ya" "hoo" "com")

Jul 18 '05 #264
Me:
Or the people who prefer the awesome power that is Lisp and
Scheme don't find the limited syntax to be a problem.

Kenny Tilton:
OK, here you are saying that only a certain subset of programmers have
the genetic makeup necessary to prefer the unlimited syntax. Unlikely.
Yes, it is very unlikely that I would say that...

Oh wait. I didn't.

I prefer speaking in English even if Navajo might be a better language
for expressing certain concepts. That doesn't mean that preference
is part of my genetic makeup.

I prefer the mambo basic over the salsa basic when dancing.
Doesn't mean that's part of my genetic makeup.
({function | macro | special-form} arguments*) => values*

Roughly speaking (I'm no BNFer) but... hellasweet! You can chant
"simplicity" as much as you like, but /that/ is simple.
Actually, you could have a more minimal language with

(name arguments*) => values*

but Lispers must have decided that *some* syntax makes
things easier.

And you can get still more minimal with FORTH where the
syntax is

WORD *
> But with Lisp
> there is a great honking special form (macro?) such as "defparameter"
> that grabs you by the lapels and screams "GLOBAL!!!".
The word "global" would scream "GLOBAL!!!" a bit more.
> But most of all,
> it manages to get the job done with The One True Syntax, using a
TOTS? If there was no need for backwards compatibility, I'll
argue that swapping [] and () would be better, since I don't need
to use shift to use the [ or ] characters. (On a US keyboard. And
yes, I know I could swap the keyboard layout.) Assuming there
was no other code in the world and no other Lisp programmers,
wouldn't that be a better syntax?
> You call that "limited"? Yer just flaming.
Yes, I'm calling it limited. I didn't say limiting, which appears to
be what you read.

No, I'm not flaming.
> But go to c.l.l. and flame the syntax and you will discover it is something Lispniks love.
In case you hadn't noticed, this is cross-posted on c.l.l.

I am fully aware of M-expressions for Lisp. I know about
Dylan and its Lispishness in an infix language. I know that
Lispers enjoy their syntax. I know it's the source of the ability
to support macros and other on-the-fly code manipulation.

I am not asking any of them -- nary a soul -- to abandon Lisp.
Nor am I flaming the syntax.

What I said was that Python is *not* an application of
Greenspun's Tenth Rule of programming because 1) it isn't
bug-ridden, and 2) because Python explores ideas which
had no influence on Lisp's development -- user
studies of non-professional programmers.

Where are the user studies which suggested () over [], or that
"car" is better than "first"/"1st" or that "cdr" is better than
"rest"/"rst"?

Yes, I know that the early teletypes might not have had
[ or ], and that car and cdr come from register names on
the machine Lisp was first implemented on. If that's
indeed the justification then there may be a Lisp-ish language
which is equally as powerful, equally as elegant, etc *and*
which is slightly easier to learn and type. But it wasn't chosen,
and it won't be used because of good social reasons: a huge
existing code base and people who now have Lisp "in their
fingers" and don't want to retrain for the slight advantage
that others might get.

"The One True Syntax" indeed.

(Okay, I admit it. That one line was flaming *you*, but not
c.l.l'ers as a class.)
And, as you concede, these are people who groove on the
power of Lisp. Doesn't that make folks think they better go check out
for themselves what these folks have discovered?


Sure. More power to them. For me, it looks like my
next really different language to learn might be OCaml. ("really
different" because I needed to learn some Javascript recently,
and I don't find it different enough to give a different view of
the world.)

Andrew
da***@dalkescientific.com
Jul 18 '05 #265
Andrew Dalke wrote:
[...]
What I said was that Python is *not* an application of
Greenspun's Tenth Rule of programming because 1) it isn't
bug-ridden, and 2) because Python explores ideas which
had no influence on Lisp's development -- user
studies of non-professional programmers.
Do you know where I can find those studies? I'm very interested in their
findings :)

By the way, what's a non-professional programmer?
Where are the user studies which suggested () over [], or that
"car" is better than "first"/"1st" or that "cdr" is better than
"rest"/"rst"?

Yes, I know that the early teletypes might not have had
[ or ], and that car and cdr come from register names on
the machine Lisp was first implemented on. If that's
indeed the justification then there may be a Lisp-ish language
which is equally as powerful, equally as elegant, etc *and*
which is slightly easier to learn and type. But it wasn't chosen,
and it won't be used because of good social reasons: a huge
existing code base and people who now have Lisp "in their
fingers" and don't want to retrain for the slight advantage
that others might get.
Well, if you count scheme as a lisp...

Welcome to DrScheme, version 205.3-cvs1oct2003.
Language: Pretty Big (includes MrEd and Advanced).
> [first [list 1 2 3 '[4 5]]]
1

- Daniel
Jul 18 '05 #266
"Erann Gat" <my************************@jpl.nasa.gov> wrote in message
news:my*****************************************@k-137-79-50-101.jpl.nasa.go
v...
In article <xc*************@famine.OCF.Berkeley.EDU>,
tf*@famine.OCF.Berkeley.EDU (Thomas F. Burdick) wrote:
method overloading,


How could you have both noncongruent argument lists, and multiple
dispatch?


C++ seems to manage it somehow.

#include <stdio.h>

void foo(int x, int y) { printf("1\n"); }
void foo(double x, int y) { printf("2\n"); }
void foo(char* x) { printf("3\n"); }

int main() {
    foo(1, 2);
    foo(1.2, 2);
    foo("foo");
    return 0;
}

compiles and runs without complaint.

E.


Ahh, but overloading only works at compile time:

void foo( SomeBaseObject* object );
void foo( SomeDerivedObject* object );

doesn't work if you're using a base class pointer for all your derived
classes.
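The contrast Mike describes is between overload resolution on the static type (done at compile time) and dispatch on the dynamic type (done at run time). A Python sketch of the runtime behavior using functools.singledispatch (the class names are made up to mirror his example):

```python
from functools import singledispatch

class SomeBaseObject: pass
class SomeDerivedObject(SomeBaseObject): pass

@singledispatch
def foo(obj):
    return "base handler"

@foo.register(SomeDerivedObject)
def _(obj):
    return "derived handler"

# Dispatch follows the object's dynamic type, even when the variable
# is only "known" as the base type -- unlike C++ overload resolution.
obj = SomeDerivedObject()
print(foo(obj))  # derived handler
```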

Mike
Jul 18 '05 #267
Doug Tolton:
Graham does admit that the reasons for the choice were mostly
historical. However, he uses them because he likes the fact that they
are shorter than first and rest.
Not in that essay I referenced. And I deliberately mentioned
"1st" and "rst" as alternatives to "car" and "cdr" which are exactly
the same length and are easier to remember. The fact that "first"
and "rest" are longer doesn't immediately mean that there are
no other viable alternatives.

BTW, paulgraham.com/arcfaq.html says that car/cdr remain
because they are composable, as in "cadr". Is that the same
as 2nd?

Ahh, the FAQ also says that [ and ] are "less directional"
than ( and ), which I can understand. I don't understand
the objection with < and > ; they "lose because they don't
wrap around enough to enclose expressions longer than tokens."
That makes no sense to me. Is is that they aren't tall enough?

Couldn't a good development environment depict the delimiters
as, say Unicode characters 27E8 and 27E9?
http://www.unicode.org/charts/PDF/U27C0.pdf
Those look like large "<" and ">"

Or is there a requirement that it be constrained to display
systems which can only show ASCII? (Just like a good
Lisp editor almost requires the ability to reposition a
cursor to blink on matching open parens. Granted, that
technology is a few decades old now while Unicode isn't,
but why restrict a new language to the display systems
of the past instead of the present?)

Heh-heh. "Why not build Arc on top of Java/Parrot/.NET?"
"We're trying to make something for the long term in Arc,
something that will be useful to people in, say, 100 years."

Then build it on MMIX! :)
If you read his design goals for Arc you will note that he is a big fan
of very terse operators.


Indeed. It looks easier to understand to my untrained eye.
I disagree that "+" shouldn't work on strings because that
operation isn't commutative -- commutativity isn't a feature
of + it's a feature of + on a certain type of set.
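A quick illustration of that point, in Python just because the thread started there:

```python
# Commutativity belongs to the operation on a given set, not to the "+"
# symbol itself: integer addition commutes, string concatenation does not.
print("one" + "two")                   # onetwo
print("one" + "two" == "two" + "one")  # False
print(2 + 3 == 3 + 2)                  # True
```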

He says that "programmers will be able to declare that
strings should be represented as traditional sequences of bytes."
which leads me to wonder about its Unicode support.

What's unicode support like in general for Lisp? Found an
answer in http://www.cliki.net/Unicode%20Support Digging
some more, it looks like CLisp uses .. UCS-4 and Unicode 3.2
(from clisp.cons.org). But do regexps work on unicode strings?
How portable are unicode strings? I figure they must be in
order to handle XML well. ... "ACL does not support 4 byte
Unicode scalar values" says franz.com. www.cl-xml.org says
"The processor passes 1749 of the 1812 tests .. when the base
implementation supports sixteen-bit characters." and that
MCL, LispWorks and the Allegro 5.0.1 international version
support 16-bit Unicode while Allegro ascii only supports 8bit.
So some have UCS-2 and some UCS-4.

Is there consensus on the Unicode API?

On the XML path, I found cl-xml. Looking at the bugs section in
http://pws.prserv.net/James.Anderson...on/cl-xml.html
It says "the implementation form for HTTP support is determined
at compilation time." Is it really true that even HTTP handling is
different on the different implementations?

And the section under "porting" is .. strange. It looks like to
use the XML API for a given Lisp I need to know enough
about the given implementation to configure various settings,
so if I wanted to learn Lisp by writing, say, a simple XML-RPC
client then I have to learn first what it means "to complete
definitions in the following files" and the details of "defsystem",
"package", "STREAM-READER / -WRITER", etc.

That reminds me of my confusion testing a biolisp package.
I needed to edit the file before it worked; something to do
with commenting/uncommenting the right way to handle
packages. I prefer to start with working code.

Andrew
da***@dalkescientific.com
Jul 18 '05 #268
Daniel P. M. Silva:
Do you know where I can find those studies? I'm very interested in their
findings :)
Sure. The research was done for ABC. ABC's home page is
http://homepages.cwi.nl/~steven/abc/
ABC is an interactive programming language and environment for
personal computing, originally intended as a good replacement for
BASIC. It was designed by first doing a task analysis of the
programming task.

There's a publication list at
http://homepages.cwi.nl/~steven/abc/publications.html

Guido, the author of Python, was involved in that project. For his
commentary on ABC's influence on Python see:
http://www.python.org/doc/essays/foreword.html
By the way, what's a non-professional programmer?
The people I generally work for. Research scientists,
usually computational chemists and computational biologists,
who need to write code but don't consider themselves to be
software developers and haven't had more than a semester
or two of formal training and would rather do more science
than spend time reading books on language practice or
theory, even if by doing so it made them more productive
in the long run.
Welcome to DrScheme, version 205.3-cvs1oct2003.
Language: Pretty Big (includes MrEd and Advanced).
[first[list 1 2 3 '[4 5]]]

1


Indeed? Well I just found a mention on Paul Graham's site
that he excluded [] over () because it didn't provide enough
directionality.

Again, where's the studies? :)

Andrew
da***@dalkescientific.com
Jul 18 '05 #269
In article <kx*****************@newsread4.news.pas.earthlink.net>, Andrew Dalke wrote:
Daniel P. M. Silva:
Do you know where I can find those studies? I'm very interested in their
findings :)


Sure. The research was done for ABC. ABC's home page is
http://homepages.cwi.nl/~steven/abc/
ABC is an interactive programming language and environment for
personal computing, originally intended as a good replacement for
BASIC. It was designed by first doing a task analysis of the
programming task.


Interestingly enough:

"The language is strongly-typed, but without declarations. Types are
determined from context."
- http://ftp.cwi.nl/abc/abc.intro

Sounds like type inference to me.

Also:

"There is no GOTO statement in ABC, and expressions do not have
side-effects."
- http://homepages.cwi.nl/~steven/abc/teaching.html

Hints both at the statement/expression dichotomy of Python and the issue
that side-effects make it difficult to reason about a program, one of the
most important assertions made by functional proponents (IMHO).

Dave

--
..:[ dave benjamin (ramenboy) -:- www.ramenfest.com -:- www.3dex.com ]:.
: d r i n k i n g l i f e o u t o f t h e c o n t a i n e r :
Jul 18 '05 #270
Dave Benjamin:
Interestingly enough:

"The language is strongly-typed, but without declarations. Types are
determined from context."
- http://ftp.cwi.nl/abc/abc.intro

Sounds like type inference to me.
Sounds like dynamic typing to me. Python is strongly-typed
but without declarations, and the type is determined as needed.
But I don't know enough about ABC to authoritatively declare
that it does/does not do type inferencing. My guess is that it
does not.
Also:

"There is no GOTO statement in ABC, and expressions do not have
side-effects."
- http://homepages.cwi.nl/~steven/abc/teaching.html

Hints both at the statement/expression dichotomy of Python and the issue
that side-effects make it difficult to reason about a program, one of the
most important assertions made by functional proponents (IMHO).


I think you're reading too much into it. The example code
doesn't look at all functional to me, as in (from the main page)

HOW TO RETURN words document:
PUT {} IN collection
FOR line IN document:
FOR word IN split line:
IF word not.in collection:
INSERT word IN collection
RETURN collection

It looks like 'line' and 'word' can take on many values,
which is a sure sign of something other than fp.
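In fact the ABC procedure above maps almost line for line onto Python, which shows the lineage. A rough transliteration (the function name and the list standing in for ABC's collection type are my own choices):

```python
# Python version of the ABC "HOW TO RETURN words document" procedure.
def return_words(document):
    collection = []
    for line in document:
        for word in line.split():
            if word not in collection:
                collection.append(word)
    return collection

print(return_words(["a b", "b c a"]))  # ['a', 'b', 'c']
```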

Andrew
da***@dalkescientific.com
Jul 18 '05 #271
cs****@dtpq.com (Christopher C. Stacy) writes:
He probably means "operator overloading" -- in languages where
there is a difference between built-in operators and functions,
their OOP features let them put methods on things like "+".
[...]
And in Lisp if you want to do some
other kind of arithmetic, you must make up your names for those
operators. This is considered to be a good feature.


In comp.lang.lisp there was recently a thread discussing why not all
CL-types were also CL-classes and all functions CLOS-methods (so that
operator overloading would be possible). I think the outcome was more
or less "it happened by historic accident and it's easier to write
fast compilers then". In general, taking away flexibility from the
programmer is not in the spirit of Lisp, though.
Jul 18 '05 #272
Andrew Dalke wrote:
Pascal Costanza:
He provides more information at http://www.paulgraham.com/icad.html

I have run across his pages before, and have a hard time
sympathizing with his view of things. For example, the start of
the icad essay mentions that Lisp is already "kind of unusual"
compared to C because it includes a full interpreter. But
effectively all Python programs shipped include a full interpreter
as well, and many give access to that interpreter, so I don't
see what's unusual about it. Ditto for Tcl apps. Even some of
my command-line perl apps included a way to pass in perl
code on the command line, as for a search filter.
I guess this reflects his experiences when he has learned Lisp in the
beginning of the 80's (AFAIK).

Yes, scripting languages have caught up in this regard. (However, note
that Common Lisp also includes a full compiler at runtime.)
The phrase "they had hard-headed engineering reasons for
making the syntax look so strange." reminds me of the statement
"better first rate salespeople and second rate engineers than
second rate salespeople and first rate engineers" (and better
first rate both). That's saying *nothing* about the languages;
it's saying that his viewpoint seems to exclude the idea that
there are hard-headed non-engineering reasons for doing things.
No, that's not a logical conclusion.
Consider one of those "hard-headed engineering reasons", at
http://www.paulgraham.com/popular.html

It has sometimes been said that Lisp should use first and
rest instead of car and cdr, because it would make programs
easier to read. Maybe for the first couple hours. But a hacker
can learn quickly enough that car means the first element
of a list and cdr means the rest. Using first and rest means
50% more typing. And they are also different lengths, meaning
that the arguments won't line up when they're called,

That to me is a solid case of post hoc ergo propter hoc. The
words "1st" and "rst" are equally short and easier to
memorize. And if terseness were very important, then
what about using "." for car and ">" for cdr? No, the reason
is that that's the way it started and it will stay that way
because of network effects -- is that a solid engineering
reason? Well, it depends, but my guess is that he wouldn't
weight strongly the impact of social behaviours as part of
good engineering. I do.


As you have already noted in another note, car and cdr can be composed.
cadr is the second element, caddr is the third, cadddr is the fourth,
and so on. cddr is the rest after the second element, cdddr is the rest
after the third element, and so on. Other abbreviations I have used
relatively often are caar, cdar, cadar.

These abbreviations seem strange to a Lisp outsider, but they are very
convenient, and they are easy to read once you have gotten used to them.
You don't actually "count" the elements in your head every time you see
these operators, but they rather become patterns that you recognize in
one go.

I don't know how this could be done with 1st, rst or hd, tl respectively.

Of course, Common Lisp also provides first, second, third, and so on, up
to ninth, and rest. It also provides nth with (nth 0 l) = (car l), (nth
1 l) = (cadr l), and so on and (nthcdr 0 l) = l, (nthcdr 1 l) = (cdr l),
(nthcdr 2 l) = (cddr l) and so on.
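For readers coming from the Python side, the composition rule can be sketched with ordinary tuples standing in for cons cells (a toy model for illustration, not CL semantics):

```python
# Toy cons cells: a pair (head, tail), with None as the empty list.
def cons(head, tail): return (head, tail)
def car(cell): return cell[0]
def cdr(cell): return cell[1]

# The a/d letters between c and r name a chain of car (a) and cdr (d)
# applications, applied right to left:
def cadr(cell): return car(cdr(cell))        # second element
def caddr(cell): return car(cdr(cdr(cell)))  # third element
def cddr(cell): return cdr(cdr(cell))        # rest after the second element

lst = cons(1, cons(2, cons(3, None)))  # the list (1 2 3)
print(car(lst), cadr(lst), caddr(lst))  # 1 2 3
print(cddr(lst))  # (3, None), i.e. the list (3)
```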

Pick your choice. "There is not only one way to do it." (tm)

The learning curve is steeper, but in the long run you become much more
productive.

Pascal

--
Pascal Costanza University of Bonn
mailto:co******@web.de Institute of Computer Science III
http://www.pascalcostanza.de Römerstr. 164, D-53117 Bonn (Germany)

Jul 18 '05 #273
In article <79****************@newsread4.news.pas.earthlink.net>, Andrew Dalke wrote:
Dave Benjamin:
Interestingly enough:

"The language is strongly-typed, but without declarations. Types are
determined from context."
- http://ftp.cwi.nl/abc/abc.intro

Sounds like type inference to me.


Sounds like dynamic typing to me. Python is strongly-typed
but without declarations, and the type is determined as needed.
But I don't know enough about ABC to authoritatively declare
that it does/does not do type inferencing. My guess is that it
does not.


Yeah, I'm sure you're right. Even though I've made the argument against
confusing static and strong typing myself many times, I still got caught off
guard. Doesn't "determined from context" sound a little different
from dynamic typing, though? I mean, to me, it reads like:

We don't declare types, ie.:
int i = 5
Instead, we determine them from context:
i = 5

What has the type, according to that language? The "i" or the "5"? How is
the type of "5" determined from context? Shouldn't it be "int", regardless of
context?
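One way to read the ABC quote: it is the value that carries the type, and "determined from context" just means no declaration is needed. Python behaves the same way, as a small sketch shows:

```python
# The value 5 is an int wherever it appears; the name i is just a binding
# and can later be rebound to a value of a different type.
i = 5
print(type(i).__name__)  # int
i = "five"
print(type(i).__name__)  # str
```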
Also:

"There is no GOTO statement in ABC, and expressions do not have
side-effects."
- http://homepages.cwi.nl/~steven/abc/teaching.html

Hints both at the statement/expression dichotomy of Python and the issue
that side-effects make it difficult to reason about a program, one of the
most important assertions made by functional proponents (IMHO).


I think you're reading too much into it. The example code
doesn't look at all functional to me, as in (from the main page)
...


Nah, I think you're reading too much into my comment. I was just making an
observation. I don't think ABC is an FPL by a mile, from what I've read.

However, I *am* interested in things that people seem to value despite the
fact that they solve problems in sometimes radically different ways. Maybe
you don't see it, but I definitely see some parallels between the
idea of separating statements from expressions and the idea of separating
the imperative, mutating, side-effectful code from the immutable, declarative,
functional, query-oriented, side-effect-free.

I think there is a greater point to be made about all of this, and it has
something to do with time and change.

Dave

--
..:[ dave benjamin (ramenboy) -:- www.ramenfest.com -:- www.3dex.com ]:.
: d r i n k i n g l i f e o u t o f t h e c o n t a i n e r :
Jul 18 '05 #274
Andrew Dalke wrote:
If you read his design goals for Arc you will note that he is a big fan
of very terse operators.
Indeed. It looks easier to understand to my untrained eye.
I disagree that "+" shouldn't work on strings because that
operation isn't commutative -- commutativity isn't a feature
of + it's a feature of + on a certain type of set.


So what's the result of ("one" - "two") then? ;)
He says that "programmers will be able to declare that
strings should be represented as traditional sequences of bytes."
which leads me to wonder about its Unicode support.
It's a myth that bytes are restricted to 8 bits. See
http://www.wikipedia.org/wiki/Byte
What's unicode support like in general for Lisp? Found an
answer in http://www.cliki.net/Unicode%20Support Digging
some more, it looks like CLisp uses .. UCS-4 and Unicode 3.2
(from clisp.cons.org). But do regexps work on unicode strings?
How portable are unicode strings? I figure they must be in
order to handle XML well. ... "ACL does not support 4 byte
Unicode scalar values" says franz.com. www.cl-xml.org says
"The processor passes 1749 of the 1812 tests .. when the base
implementation supports sixteen-bit characters." and that
MCL, LispWorks and the Allegro 5.0.1 international version
support 16-bit Unicode while Allegro ascii only supports 8bit.
So some have UCS-2 and some UCS-4.

Is there consensus on the Unicode API?
No, not yet. ANSI CL was finalized in 1994.
On the XML path, I found cl-xml. Looking at the bugs section in
http://pws.prserv.net/James.Anderson...on/cl-xml.html
It says "the implementation form for HTTP support is determined
at compilation time." Is it really true that even HTTP handling is
different on the different implementations?
Again, not part of ANSI CL. Don't judge a standardized language with the
measures of a single-vendor language - that's a different subject.
(Apart from that, Jython also doesn't provide everything that Python
provides, right?)

Pick the one Common Lisp implementation that provides the stuff you
need. If no Common Lisp implementation provides all the stuff you need,
write your own libraries or pick a different language. It's as simple as
that.
And the section under "porting" is .. strange. It looks like to
use the XML API for a given Lisp I need to know enough
about the given implementation to configure various settings,
so if I wanted to learn Lisp by writing, say, a simple XML-RPC
client then I have to learn first what it means "to complete
definitions in the following files" and the details of "defsystem",
"package", "STREAM-READER / -WRITER", etc.

That reminds me of my confusion testing a biolisp package.
I needed to edit the file before it worked; something to do
with commenting/uncommenting the right way to handle
packages. I prefer to start with working code.


You can ask these things in comp.lang.lisp or in one of the various
mailing lists. Common Lispniks are generally very helpful.
Pascal

--
Pascal Costanza University of Bonn
mailto:co******@web.de Institute of Computer Science III
http://www.pascalcostanza.de Römerstr. 164, D-53117 Bonn (Germany)

Jul 18 '05 #275
"Karl A. Krueger" <kk******@example.edu> writes:

[SNIP]
Incidentally, I regard objections to "the whitespace thing" in Python
and objections to "the parenthesis thing" in Lisp as more or less the
same. People who raise these objections are usually just saying "Ick!
This looks so unfamiliar to me!" in the language of rationalizations.
I guess a philosopher would say that I am an emotivist about notation
criticisms.


My main problem with "indentation controls scoping" is that I've
actually had production python code die because of whitespace being
mangled in cutting&pasting between various things. It looks a bit odd,
but after having written BASIC, Pascal, APL, Forth, PostScript, Lisp,
C and Intercal looking "odd" only requires looking harder. Killing a
production system due to whitespace-mangling isn't.

And, yes, I probably write more Python code than lisp code in an
average week.

//Ingvar
--
Ingvar Mattsson; in****@hexapodia.net;
You can get further with a kind word and a 2x4
than you can with just a kind word. Among others, Marcus Cole
Jul 18 '05 #276


Matthias wrote:

cs****@dtpq.com (Christopher C. Stacy) writes:
He probably means "operator overloading" -- in languages where
there is a difference between built-in operators and functions,
their OOP features let them put methods on things like "+".
[...]
And in Lisp if you want to do some
other kind of arithmetic, you must make up your names for those
operators. This is considered to be a good feature.
In comp.lang.lisp there was recently a thread discussing why not all
CL-types were also CL-classes and all functions CLOS-methods (so that
operator overloading would be possible). I think the outcome was more
or less "it happened by historic accident and it's easier to write
fast compilers then".


that is not an accurate restatement of the conclusion which i recall. i
suggest that a more accurate summary would be:

1. should one need operators which "look" like the standard operators, but
which have a different defined semantics, one places their names in a package
which is isolated from :common-lisp, and either codes with reference to that
package or exports them from that package and codes with reference to a
package which inherits those symbols in preference to those exported from the
:common-lisp package.

2. one does not want to specialize the standard operators other than in the
ways which the standard permits, as not only other applications, but also the
implementation itself may depend on that they have the semantics which the
standard specifies.

In general, taking away flexibility from the
programmer is not in the spirit of Lisp, though.


one might argue that the standard should have specified that a conforming
implementation not depend on the definitions named by symbols in the
:common-lisp package itself, but instead use its internal functions. in order
to be convincing, the argument would need to identify use cases which option
(1.) does not support.

one can even rename the :common-lisp package and provide their own. one should
not, however, expect all programs to tolerate such a change.

....
Jul 18 '05 #277
"Andrew Dalke" <ad****@mindspring.com> writes:
If you want some real world numbers on program length check here:
http://www.bagley.org/~doug/shootout/
If I want some real world numbers on program length, I do it myself:
http://pleac.sourceforge.net/
I wrote most of the Python code there

Still, since you insist, I went to the scorecard page and changed
the weights to give LOC a multipler of 1 and the others a multiplier
of 0. This is your definition of succinctness, yes? This table
is sorted (I think) by least LOC to most.


<snip>
So:
- Why aren't you using Ocaml?
- Why is Scheme at the top *and* bottom of the list?
- Python is right up there with the Lisp/Scheme languages
- ... and with Perl.

Isn't that conclusion in contradiction to your statements
that 1) "Perl is *far* more compact than Python is" and 2)
the implicit one that Lisp is significantly more succinct than
Python? (As you say, these are small projects .. but you did
point out this site so implied it had some relevance.)


Apart from the usual problems with micro benchmarks, there are a few
things to consider regarding the LOC counts on that site:

* Declarations. Common Lisp gives the programmer the ability to
optimize a program by adding declarations to it. This is purely
optional, and something you normally don't do until you discover a
bottleneck in your code. For instance, it is possible to add type
declarations so that the compiler can generate more efficient
code. In a normal program, the declarations (if any) will
constitute an extremely small part of the program, but since the
micro benchmarks in the shootout are focused on speed of
execution, and they are so small, all of them contain a lot of
declarations, which will increase LOC.

* In many languages, any program can be written on a single
line. This goes for Lisp, but also for C and other languages. This
means that the LOC count is also affected by formatting. For
instance, in the Ackermann's function benchmark, the Ackermann
function is written like this in the C code:

int Ack(int M, int N) { return(M ? (Ack(M-1,N ? Ack(M,(N-1)) : 1)) : N+1); }

That is, 1 LOC, although most people would probably write it in
anything between 5-10 lines.

* I don't think the LOC-saving qualities of Lisp are done justice in
micro benchmarks. The reason Lisp code is so much shorter than the
equivalent code in other languages is because of the abstractive
powers of Lisp, which means that the difference will be more
visible the larger the program is.
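The formatting point is easy to demonstrate. Here is Ackermann's function in Python, once as a single line and once written as most people would, the same program with very different LOC counts (a sketch of the point, not the shootout code):

```python
# One "line", shootout style:
ack1 = lambda m, n: n + 1 if m == 0 else (ack1(m - 1, 1) if n == 0 else ack1(m - 1, ack1(m, n - 1)))

# The same function, conventionally formatted:
def ack(m, n):
    if m == 0:
        return n + 1
    if n == 0:
        return ack(m - 1, 1)
    return ack(m - 1, ack(m, n - 1))

print(ack1(2, 3), ack(2, 3))  # 9 9
```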
Björn
Jul 18 '05 #278
"Andrew Dalke" <ad****@mindspring.com> writes:
pr***********@comcast.net:
So either the syntax doesn't make a whole hell of a lot of difference
in readability, or readability doesn't make a whole hell of a lot of
difference in utility.


Or the people who prefer the awesome power that is Lisp and
Scheme don't find the limited syntax to be a problem.


All evidence points to the fact that Lisp syntax is no worse than
Algol-style syntax. As Joe explained, other syntaxes have been used
for Lisp many times over the years, but lispers seem to prefer the
s-exp one. If anything, one could draw the conclusion that s-exp
syntax must be /better/ than Algol-style syntax since the programmers
who have a choice which of them to use -- for the same language --
apparently choose s-exp syntax. You really have no grounds to call
Lisp syntax limited.
Björn
Jul 18 '05 #279

i realize that this thread is hopelessly amorphous, but this post did
introduce some concrete issues which bear concrete responses...

Andrew Dalke wrote:

...

What's unicode support like in general for Lisp? Found an
answer in http://www.cliki.net/Unicode%20Support Digging
some more, it looks like CLisp uses .. UCS-4 and Unicode 3.2
(from clisp.cons.org). But do regexps work on unicode strings?
How portable are unicode strings? I figure they must be in
order to handle XML well. ... "ACL does not support 4 byte
Unicode scalar values" says franz.com. www.cl-xml.org says
"The processor passes 1749 of the 1812 tests .. when the base
implementation supports sixteen-bit characters." and that
MCL, LispWorks and the Allegro 5.0.1 international version
support 16-bit Unicode while Allegro ascii only supports 8bit.
So some have UCS-2 and some UCS-4.

Is there consensus on the Unicode API?
there are several problems with a "uniform" unicode implementation. if you
look through the info-mcl archives you will find a thread where i tried to
achieve some clarity as to what was necessary.

i got only as far as the realization that, in order to be of any use, unicode
data management has to support the eventual primitive string operations. which
introduces the problem that, in many cases, these primitive operations
eventually devolve to the respective os api. which, if one compares apple and
unix apis are anything but uniform. it is simply not possible to provide them
with the same data and do anything worthwhile. if it is possible to give some
concrete pointers to how other languages provide for this i would be grateful.

given this situation. i posted several suggestions as to how they might
represent unicode and effect encoding and decoding such that variations were
at least managed in a uniform manner. i received one (1) answer, to the effect
that "that sounds ok to me." so i left the implementation the way it was, to
signal an error upon discovering surrogate pairs. i've yet to have anyone
suggest that that impedes processing. to be quite honest, that surprises me
and i have no idea what people do with surrogate pairs.

this is effectively the same level of support as java, and i have to admit, i
don't understand what people really do with them in java either. the string
representation is effectively utf-16, so anything outside of the base plane is
not a first-class object. in which environment the "consensus" above should
actually be better spelled "chimera".

On the XML path, I found cl-xml. Looking at the bugs section in
http://pws.prserv.net/James.Anderson...on/cl-xml.html
It says "the implementation form for HTTP support is determined
at compilation time." Is it really true that even HTTP handling is
different on the different implementations?
yes, there are several available common-lisp implementations for http clients
and servers. they offer significant trade-offs in api complexity,
functionality, resource requirements and performance. you do need to pick one
according to your application needs and declare which one you have chosen. for
a default implementation of client functionality, cl-xml, as any other lisp
application, must take into account that some necessary stream and
network-related functions are available through implementation-specific
libraries only. again, as for other common-lisp libraries, for the
implementation to which it has been ported, it does this automatically.

And the section under "porting" is .. strange. It looks like to
use the XML API for a given Lisp I need to know enough
about the given implementation to configure various settings,
so if I wanted to learn Lisp by writing, say, a simple XML-RPC
client then I have to learn first what it means "to complete
definitions in the following files" and the details of "defsystem",
"package", "STREAM-READER / -WRITER", etc.
if one needs to _port_ it to a new lisp, yes. perhaps you skipped over the
list of lisps to which it has been ported. if you look at the #+/-
conditionalization, you may observe that the differences are not significant.

That reminds me of my confusion testing a biolisp package.
I needed to edit the file before it worked; something to do
with commenting/uncommenting the right way to handle
packages. I prefer to start with working code.


it is refreshing that you describe it as "your" confusion. there was one
correspondent who, at the outset, judging from their initial enquiries, was
looking at their first '(' ever, but wrote in short order about processing
12MB files. should one have problems using a given common lisp library,
concrete questions and illustrations of points which are unclear are always
more productive than vague characterizations.

....
Jul 18 '05 #280
cs****@dtpq.com (Christopher C. Stacy) writes:
>> On Wed, 08 Oct 2003 16:51:44 -0400, Joe Marshall ("Joe") writes:
Joe> "Carlo v. Dango" <oe**@soetu.eu> writes:
>> method overloading,


Joe> Now I'm *really* confused. I thought method overloading involved
Joe> having a method do something different depending on the type of
Joe> arguments presented to it. CLOS certainly does that.

He probably means "operator overloading" -- in languages where
there is a difference between built-in operators and functions,
their OOP features let them put methods on things like "+".

Lisp doesn't let you do that, because it turns out to be a bad idea.
When you go reading someone's program, what you really want is for
the standard operators to be doing the standard and completely
understood thing.


Though if one *really* wants to have +, -, * and / as generic
functions, I imagine one can use something along the lines of:

(defpackage "GENERIC-ARITHMETIC"
  (:shadow "+" "-" "/" "*")
  (:use "COMMON-LISP"))

(in-package "GENERIC-ARITHMETIC")

(defgeneric arithmetic-identity (op arg))

(defmacro defarithmetic (op)
  (let ((two-arg
          (intern (concatenate 'string "TWO-ARG-" (symbol-name op))
                  "GENERIC-ARITHMETIC"))
        (cl-op (find-symbol (symbol-name op) "COMMON-LISP")))
    `(progn
       (defun ,op (&rest args)
         (cond ((null args) (arithmetic-identity ',op nil))
               ((null (cdr args))
                (,two-arg (arithmetic-identity ',op (car args))
                          (car args)))
               (t (reduce (function ,two-arg)
                          (cdr args)
                          :initial-value (car args)))))
       (defgeneric ,two-arg (arg1 arg2))
       (defmethod ,two-arg ((arg1 number) (arg2 number))
         (,cl-op arg1 arg2)))))

Now, I have (because I am lazy) left out definitions of the generic
function ARITHMETIC-IDENTITY (general idea, when fed an operator and
NIL, it returns the most generic identity, when fed an operator and an
argument, it can return a value that is more suitable) and there's
probably errors in the code, too.

But, in principle, that should be enough of a framework to build from,
I think.
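For contrast, Python reaches the same end through specially named methods that the operators dispatch to, so user code never has to shadow anything. A sketch (the Matrix class is invented for illustration):

```python
# Operators in Python dispatch to methods on the operands, so a
# user-defined type joins "+" without redefining the operator itself.
class Matrix:
    def __init__(self, rows):
        self.rows = rows
    def __add__(self, other):
        return Matrix([[a + b for a, b in zip(r1, r2)]
                       for r1, r2 in zip(self.rows, other.rows)])

m = Matrix([[1, 2], [3, 4]]) + Matrix([[10, 20], [30, 40]])
print(m.rows)  # [[11, 22], [33, 44]]
```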

//Ingvar
--
My posts are fair game for anybody who wants to distribute the countless
pearls of wisdom sprinkled in them, as long as I'm attributed.
-- Martin Wisse, in a.f.p
Jul 18 '05 #281
On Thu, 09 Oct 2003 07:43:40 GMT, "Andrew Dalke" <ad****@mindspring.com> wrote:
What's unicode support like in general for Lisp? [...] But do
regexps work on unicode strings?


Unicode support isn't part of the CL standard but the standard is
flexible enough to make it easy for implementations to integrate
Unicode characters and strings seamlessly. You've mentioned a couple
of integrations which do that.

As for regex support - that's not a part of the standard either, but
there a couple of libraries available - see

<http://www.cliki.net/Regular%20Expression>

If the library is written in Lisp (as opposed to being an FFI wrapper
around a C library) you can be fairly sure that it works with Unicode:

[19]> (code-char 1000)
#\COPTIC_CAPITAL_LETTER_HORI
[20]> (defparameter *target* (make-string 2 :initial-element *))
*TARGET*
[21]> (cl-ppcre::scan "^(.){2}$" *target*)
0 ;
2 ;
#(1) ;
#(2)
[22]> (cl-ppcre::scan `(:greedy-repetition 2 2 ,(code-char 1000)) *target*)
0 ;
2 ;
#() ;
#()

(This is CL-PPCRE with CLISP.)
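For comparison, the equivalent check in Python (today's Python, where strings are unicode throughout; in the 2.x of this thread's era one would use u'' literals):

```python
import re

ch = chr(1000)               # the same COPTIC CAPITAL LETTER HORI as above
target = ch * 2
match = re.match(r"^(.){2}$", target)
print(match.span())          # (0, 2)
print(match.group(1) == ch)  # True
```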

Edi.
Jul 18 '05 #282


Andrew Dalke wrote:

What I said was that Python is *not* an application of
Greenspun's Tenth Rule of programming because 1) it isn't
bug-ridden, and 2) because Python explores ideas which
had no influence on Lisp's development -- user
studies of non-professional programmers.
I wouldn't take the Greenspun crack too seriously. That's about
applications recreating Lisp, not languages copying Lisp features. It's
just a reaction to Python (a perfectly nice little scripting language)
trying to morph into a language with the sophistication of Lisp.

As for non-professional programmers, the next question is whether a good
language for them will ever be anything more than a language for them.
Perhaps Python should just stay with the subset of capabilities that
made it a huge success--it might not be able to scale to new
sophistication without destroying the base simplicity.

Another question is whether Lisp would really be such a bad language for
them.

You presume that only Lisp gurus can learn Lisp because of the syntax.
But methinks a number of folks using Emacs Elisp and Autocad's embedded
Lisp are non-professionals. And let's not forget Symbolic Composer
(music composition) or Mirai (?) the 3D modelling/animation tool, both
of which are authored at the highest level with Lisp.

Logo (a language aimed at children, including very young children) cuts
both ways: it's a Lisp, but it dumped a lot of the parens, but then
again it relies more on recursion.

You (Alex?) also worry about groups of programmers and whether what is
good for the gurus will be good for the lesser lights. What you are
saying is that the guru will dazzle the dorks with incomprehensible
gobbledygook. That does happen, but those kinds of gurus should be fired.

To the contrary, macros in the hand of truly talented developers allow
the gurus to build Lisp up to a higher-level language with new
domain-specific constructs to empower the rest of the team.

You dissed Brooks in an earlier article (in favor of a Redmond Clone, of
all things) but you should go back and read him again. Especially No
Silver Bullet and NSB Revisited. He has a lot of great insights in
there. (Your Redmond boy is counting LOC to assess languages, apparently
because he can understand counting LOC. hmmm....)

Brooks talks about productivity coming from greater expressive power,
from having the language more capable of expressing things at the same
level at which the programmer is thinking. (He also touts Interlisp at
one point!) But in NSB he says languages have reached the conceptual
sophistication of their users. What Brooks missed (despite his awareness
of Interlisp, which he dug because of its interactivity) is what few
people understand, again, that macros let you build a domain-specific
HLL on top of Lisp.

On a sufficiently large project (well, there's another point: with Lisp
one does not hire ten people (unless one is doing three projects)) the
team should be divided into those who will be building the
domain-specific embedded language and those who will be using it.
Ideally the latter could be not just non-professional programmers, but
even non-programmers.

Where are the user studies which suggested () over [], or that
"car" is better than "first"/"1st" or that "cdr" is better than
"rest"/"rst"?


Studies. You use that word so much. A good study is hard to find. You
loved McConnel's LOC nonsense, but it is worthless. Ooooh, numbers! Look
at all the numbers! [why [dont you] [just] [type out [a form [with [lots
of brackets]]] and [see what [they [look [like]]]]?

They [ook [ike He[[.

--
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey

Jul 18 '05 #283
"Daniel P. M. Silva" <ds****@ccs.neu.edu> wrote in message news:<bm**********@camelot.ccs.neu.edu>...
By the way, what's a non-professional programmer?


How about a person whose profession is not programming, but who often
writes computer programs?
--
G.
Jul 18 '05 #284
Dirk Thierbach wrote:
you can use macros to do everything one could use HOFs for (if you
really want).


I should have added: As long as it should execute at compile time, of
course.
Really? What about arbitrary recursion?


I don't see the problem. Maybe you have an example? I am sure the
Lisp'ers here can come up with a macro solution for it.


I'm not terribly familiar with the details of Lisp macros but since
recursion can easily lead to non-termination you certainly need tight
restrictions on recursion among macros in order to ensure termination of
macro substitution, don't you? Or at least some ad-hoc depth limitation.

- Andreas

--
Andreas Rossberg, ro******@ps.uni-sb.de

"Computer games don't affect kids; I mean if Pac Man affected us
as kids, we would all be running around in darkened rooms, munching
magic pills, and listening to repetitive electronic music."
- Kristian Wilson, Nintendo Inc.

Jul 18 '05 #285


Kenny Tilton wrote:


Andrew Dalke wrote:

What I said was that Python is *not* an application of
Greenspun's Tenth Rule of programming because 1) it isn't
bug-ridden, and 2) because Python explores ideas which
had no influence on Lisp's development -- user
studies of non-professional programmers.


Speaking of non-pros:

"Lisp is easy to learn

Lisp's syntax is simple, compact and spare. Only a handful of “rules”
are needed. This is why Lisp is sometimes taught as the first
programming language in university-level computer science courses. For
the composer it means that useful work can begin almost immediately,
before the composer understands much about the underlying mechanics of
Lisp or the art of programming in general. In Lisp one learns by doing
and experimenting, just as in music composition. "

From: http://pinhead.music.uiuc.edu/~hkt/nm/02/lisp.html

No studies, tho.

kenny

--
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey

Jul 18 '05 #286
Andreas Rossberg <ro******@ps.uni-sb.de> writes:
Dirk Thierbach wrote:
you can use macros to do everything one could use HOFs for (if you
really want).

I should have added: As long as it should execute at compile time, of
course.
Really? What about arbitrary recursion?

I don't see the problem. Maybe you have an example? I am sure the
Lisp'ers here can come up with a macro solution for it.


I'm not terribly familiar with the details of Lisp macros but since
recursion can easily lead to non-termination you certainly need tight
restrictions on recursion among macros in order to ensure termination
of macro substitution, don't you? Or at least some ad-hoc depth
limitation.


Same as with function calls, you mean?

--
Raymond Wiker Mail: Ra***********@fast.no
Senior Software Engineer Web: http://www.fast.no/
Fast Search & Transfer ASA Phone: +47 23 01 11 60
P.O. Box 1677 Vika Fax: +47 35 54 87 99
NO-0120 Oslo, NORWAY Mob: +47 48 01 11 60

Try FAST Search: http://alltheweb.com/
Jul 18 '05 #287
Dave Benjamin <da**@3dex.com> wrote in message news:<u%************@news1.central.cox.net>...
For instance, I always thought this was a cooler alternative to the
try/finally block to ensure that a file gets closed (I'll try not to
mess up this time... ;) :

open('input.txt', { |f|
    do_something_with(f)
    do_something_else_with(f)
})


But being a function, it'd have the nasty property of a
separate scope (yes, that can be nasty sometimes). I'd perhaps
want to do

open('input.txt', { |f| data = f.read() })

But alas, 'data' would be local to the anonymous function and
not usable outside. Perhaps I'd want to do this:

open('input.txt', { |f|
    data = f.read()
    return data.startswith('bar')
})

Well, unfortunately that return would only return from the
anonymous function. 'open' could return this result,
so the above could be done awkwardly:

return open('input.txt', { |f|
    data = f.read()
    return data.startswith('bar')
})

But instead for this, and for all the other objects that
require a resource to be freed after its use, I think a separate
syntax would be preferable (or macros, but we'll never get those).
IIRC, this has been suggested several times before, with
varying syntax:

with f=file('input.txt'):
    data = f.read()
    print data[0:3]

That's it. Or for opening multiple files, here's a fabricated example:

with f1=file('1.txt'), f2=file('2.txt', 'w'):
    data = f1.read()
    if data.startswith('foo'):
        break #break breaks out of 'with'
    f2.write(data)
    return True
print 'bleh'
return False
'with' can't be said to be non-explicit either. It'd be only
used with variables that do have resources to be released, so
what really happens is said clearly. C++ RAII could be considered
implicit on the other hand.

Hmm.. With those 'break' semantics, some might be tempted to use
'with' without any variables as well:

with:
    if x == y:
        x = 1
        break
    x = 0
    if y == z:
        y = 1
        break
    y = 0

In current Python, that'd have to be done like this (or with
a single-element for loop)

if x == y:
    x = 1
else:
    x = 0
if y == z:
    y = 1
else:
    y = 0

Hmm, that would lead to two different approaches, which
some might not like. Former is flatter though, at least
if you continue with similar condition/breaks
("Flat is better than nested.") ;)

On a second thought, maybe the break-suggestion was bad
after all. With such break, breaking outside of loop within
'with' wouldn't be so easy. And since 'continue' inside
'with' doesn't make sense, the following would be strange:

for x in range(5):
    with f=file('data%d.txt' % x):
        continue # would continue loop
        break # would break out of 'with'
Ok, never mind 60% of this message then. Just consider
'with' without break (but with possibility to handle
multiple variables).
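(Lacking both macros and such a 'with' statement, today's Python can at least
emulate the idea with a higher-order function; a minimal sketch, with helper
names invented for illustration:)

```python
def with_open(path, mode, body):
    # emulate the proposed "with f=file(path):" by passing the suite
    # in as an ordinary callable
    f = open(path, mode)
    try:
        return body(f)
    finally:
        f.close()  # released even if body raises

# the "block" is just a function, so 'return' inside it behaves as expected
def starts_with_bar(f):
    return f.read().startswith('bar')
```

Note that, exactly as discussed above, `body` has its own scope: any name it
binds is invisible to the caller, which is the price of emulating blocks with
functions.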
Jul 18 '05 #288
Kenny Tilton wrote:
Speaking of non-pros:

"Lisp is easy to learn

Lisp's syntax is simple, compact and spare. Only a handful of “rules”
are needed. This is why Lisp is sometimes taught as the first
programming language in university-level computer science courses. For
the composer it means that useful work can begin almost immediately,
before the composer understands much about the underlying mechanics of
Lisp or the art of programming in general. In Lisp one learns by doing
and experimenting, just as in music composition. "

From: http://pinhead.music.uiuc.edu/~hkt/nm/02/lisp.html

No studies, tho.


Here they are: http://home.adelphi.edu/sbloch/class/hs/testimonials/

(This is about Scheme.)
Pascal

--
Pascal Costanza University of Bonn
mailto:co******@web.de Institute of Computer Science III
http://www.pascalcostanza.de Römerstr. 164, D-53117 Bonn (Germany)

Jul 18 '05 #289
Pascal Costanza <co******@web.de> writes:
As you have already noted in another note, car and cdr can be
composed. cadr is the second element, caddr is the third, cadddr is
the fourth, and so on. cddr is the rest after the second element,
cdddr is the rest after the third element, and so on. Other
abbreviations I have used relatively often are caar, cdar, cadar.

These abbreviations seem strange to a Lisp outsider, but they are
very convenient, and they are easy to read once you have gotten used
to them. You don't actually "count" the elements in your head every
time you see these operators, but they rather become patterns that
you recognize in one go.


As a follow-on to Pascal's point: It might seem, if one just thinks
about function calls that the benefit of the composed C[AD]*R
operations is fairly small, and perhaps not worth the "cost" of being
more cryptic: i.e. Is the savings of a few characters in (cddr foo) vs
(rest (rest foo)) that big a benefit? But since Lisp supports higher
order functions, having single name for those composite functions
saves the clutter of having to create a lambda expression. For
instance, compare:

(loop for (x y) on list by #'cddr do (foo x y))

vs

(loop for (x y) on list by #'(lambda (l) (rest (rest l))) do (foo x y))
I figured this out by deciding--as a matter of style--that I was just
going to use FIRST/REST all the time and then noticing that in
situations like this, CDDR is much more convenient. The point being,
it's hard to forsee all the subtle ways different features interact.
So it can be simultaneously true that CAR and CDR were originally
choosen as names for pretty much arbitrary historical reasons *and*
that they have persisted for a lot of "hard-headed" but subtle
engineering reasons. (Or maybe soft-headed, aesthetic reasons, if you
care to draw the distinction when talking about programming language
design which I don't.)
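(A rough Python analogue of that pairwise stepping, with a hypothetical helper
name, just to show the pattern the CDDR idiom captures:)

```python
def pairs(lst):
    # step through the list two elements at a time,
    # roughly like LOOP ... on list by #'cddr with (x y) destructuring
    for i in range(0, len(lst) - 1, 2):
        yield lst[i], lst[i + 1]

print(list(pairs([1, 2, 3, 4])))  # [(1, 2), (3, 4)]
```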

-Peter

--
Peter Seibel pe***@javamonkey.com

Lisp is the red pill. -- John Fraser, comp.lang.lisp
Jul 18 '05 #290
Andreas Rossberg wrote:
Dirk Thierbach wrote:
you can use macros to do everything one could use HOFs for (if you
really want).

I should have added: As long as it should execute at compile time, of
course.
Really? What about arbitrary recursion?

I don't see the problem. Maybe you have an example? I am sure the
Lisp'ers here can come up with a macro solution for it.

I'm not terribly familiar with the details of Lisp macros but since
recursion can easily lead to non-termination you certainly need tight
restrictions on recursion among macros in order to ensure termination of
macro substitution, don't you? Or at least some ad-hoc depth limitation.


The Lisp mindset is not to solve problems that you don't have.

If your code has a bug then you need to debug it. Lisp development
environments provide excellent debugging capabilities out of the box.
Don't guess how hard it is when you don't have the experience yet.
Pascal

--
Pascal Costanza University of Bonn
mailto:co******@web.de Institute of Computer Science III
http://www.pascalcostanza.de Römerstr. 164, D-53117 Bonn (Germany)

Jul 18 '05 #291
Raymond Wiker wrote:

I'm not terribly familiar with the details of Lisp macros but since
recursion can easily lead to non-termination you certainly need tight
restrictions on recursion among macros in order to ensure termination
of macro substitution, don't you? Or at least some ad-hoc depth
limitation.


Same as with function calls, you mean?


In functional languages you at least have no limitation whatsoever on
the depth of tail calls. Is the same true for macros?

Apart from that, can you have higher-order macros? With mutual recursion
between a macro and its argument? That is, can you write a fixpoint
operator on macros?

I'm not saying that any of this would be overly useful. Just trying to
refute Dirk's rather general statement about macros subsuming HOF's.

- Andreas

--
Andreas Rossberg, ro******@ps.uni-sb.de

"Computer games don't affect kids; I mean if Pac Man affected us
as kids, we would all be running around in darkened rooms, munching
magic pills, and listening to repetitive electronic music."
- Kristian Wilson, Nintendo Inc.

Jul 18 '05 #292


David Rush wrote:
You know I think that this thread has so far set a comp.lang.* record
for civility in the face of a massively cross-posted language
comparison thread. I was even wondering if it was going to die a quiet
death, too.

Ah well, We all knew it was too good to last. Have at it, lads!

Common Lisp is an ugly language that is impossible to understand with
crufty semantics

Scheme is only used by ivory-tower academics and is irrelevant to real
world programming

Python is a religion that worships at the feet of Guido van Rossum
combining the syntactic flaws of lisp with a bad case of feeping
creaturisms taken from languages more civilized than itself

There. Is everyone pissed off now?


You forgot the INTERCAL crowd :)

Cheers
--
Marco

Jul 18 '05 #293
> Ahh, but overloading only works at compile time:

void foo( SomeBaseObject* object );
void foo( SomeDerivedObject* object );

doesn't work if you're using a base class pointer for all your derived
classes.

I think that the point was that the overload resolution rules can handle the
situation. Nothing in these rules prevents them from being applied to a
dynamically dispatched case.
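(In Python -- well after this thread's vintage -- `functools.singledispatch`
illustrates what dispatching on the runtime type, rather than the static type,
looks like; the class names below are made up to echo the C++ example:)

```python
from functools import singledispatch

class SomeBaseObject: pass
class SomeDerivedObject(SomeBaseObject): pass

@singledispatch
def foo(obj):
    return 'base'

@foo.register(SomeDerivedObject)
def _(obj):
    return 'derived'

# the *runtime* type decides, even though the variable could equally
# well hold a base-class instance
obj = SomeDerivedObject()
print(foo(obj))  # derived
```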
Jul 18 '05 #294
Dave Benjamin wrote (answering Mike Rovner):
...
"Explicit is better than implicit"
In that case, why do we eschew code blocks, yet have no problem with the
implicit invocation of an iterator, as in:

for line in file('input.txt'):
    do_something_with(line)


I don't see that there's anything "implicit" in the concept that a
special operation works as indicated by its syntax. I.e., I do not
find this construct any more "implicit" in the first line than in
its second one, which is the juxtaposition of a name and a pair of
parentheses to indicate calling-with-arguments -- and alternatives
such as:

do_something_with.call_with_arguments(line)

aren't "more explicit", just more verbose.

Similarly, the fact that
file('input.txt')
(a call to a type object) creates and returns an object is not any
more "implicit" than it would be to have to call factory classmethods
a la:
file.open_for_reading_an_existing_textfile_named('input.txt')
would not be "more explicit", just more verbose.

Simply juxtaposing parentheses right after a callable CALLS it,
because that syntax is defined to be THE syntax for such calls in
Python. Similarly, simply prepending "for xx in" before an iterable
ITERATES ON it, because that syntax is defined to be THE syntax for
such iteration in Python. Neither is "less explicit" than verbose
alternatives requiring (e.g.) access to attributes on the callable
or iterable object. Such access to attributes could not (by first
class objects rule -- "everything's an object") produce anything
BUT objects -- so where does one stop...? x.call.call.call.call...???

This has nothing to do with "eschewing code blocks", btw; code blocks
are not "eschewed" -- they are simply syntactically allowed, as
"suites", only in specific positions. If Python's syntax defined
other forms of suites, e.g. hypothetically:

with <object>:
<suite>

meaning to call the object (or some given method[s] in it, whatever)
with the suite as its argument, it would be just as explicit as, e.g.:

for <name> in <object>:
<suite>

or

<object>(<object>)

are today. Whether it would be wise, useful, etc, etc, is a different
set of issues, but I disagree that there is relevance here of the
implicit vs explicit principle (I know the poster you were replying to
did first claim that principle mattered here, but I just disagree with
him as well:-). Of course, if we did adopt that 'with' or similar
syntax we'd also have to decide on the TYPE of a 'suite' thus literally
expressed, an issue which current syntax constructs using suites do
not have -- perhaps a code object, perhaps a callable (but in the
latter case you'd probably also want 'arguments' -- so the syntax might
have to be slightly more extensive, e.g. "with <object>, <name1>, <name2>:"
instead). But that seems a secondary issue to me.

This is not to say that I dislike that behavior; in fact, I find it
*beneficial* that the manner of looping is *implicit* because you can
substitute a generator for a sequence without changing the usage. But
You could do so even if you HAD to say iter(<object>) instead of
just <object> after every "for <name> in" -- it wouldn't be any
more "explicit", just more verbose (redundant, boiler-platey). So
I do not agree with your motivation for liking "for x in y:" either;-).
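(The substitutability in question can be shown directly: the same `for` loop
consumes a list or a generator unchanged, with no `iter()` boilerplate at the
call site -- hypothetical function names below:)

```python
def squares_list(n):
    # eager version: builds the whole list up front
    return [i * i for i in range(n)]

def squares_gen(n):
    # lazy version: yields values on demand
    for i in range(n):
        yield i * i

# identical usage at the call site for both implementations
for impl in (squares_list, squares_gen):
    print([x for x in impl(4)])  # [0, 1, 4, 9] both times
```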
there's little readability difference, IMHO, between that and:

file('input.txt').each_line({ |line|
    do_something_with(line)
})
Not huge, but the abundance of ({ | &c here hurts a little bit.

Plus, the first example is only obvious because I called my iteration
variable "line", and because this behavior is already widely known. What
if I wrote:

for byte in file('input.dat'):
    do_something_with(byte)

That would be a bit misleading, no? But the mistake isn't obvious. OTOH,
in the more explicit (in this case) Ruby language, it would look silly:

open('input.txt').each_line { |byte|
    # huh? why a byte? we said each_line!
}
Here, you're arguing for redundance, not for explicitness: you are claiming
that IF you had to say the same thing more than once, redundantly, then
mistakes might be more easily caught. I.e., the analogy is with:

file('foo.txt').write('wot?')

where the error is not at all obvious (until runtime when you get an
exception): file(name) returns an object *open for reading only* -- so
if you could not call file directly but rather than do say, e.g.:

file.open_for_reading_only('foo.txt').write('wot?')

the contrast induced by the mandated redundance might (one hopes)
make the error "look silly". Many languages do rather enthusiastically
embrace this, making you write more redundant boilerplate than useful
code connected with your program's task, or so it seems at times --
neither Ruby nor Python, however, go in for such systematic use of
redundance in general. In any case, redundance and explicitness are
separate concepts: if you have to express something more than once,
that is redundance -- if you have to express it (once or more) rather
than having the language guess on your behalf, that is explicitness.

Having sensible defaults does not necessarily violate explicitness,
btw. E.g., the reason we have to say "class x(object):" today is NOT
"just to be explicit" -- it's an unfortunate consequence of the need
to continue having old-style classes (and the prudent choice to keep
them as the default to ensure slow, smooth migration); an issue of
legacy, backwards compatibility, and concern for the existing body of
code, in other words, rather than of "implicit vs explicit". Once we
proceed on the slow process of burying classic classes, we can make
object the default base _without_ damaging anything. Of course, one
COULD puristically disagree -- but, practicality beats purity...;-).

I think this is important to point out, because the implicit/explicit
rule comes up all the time, yet Python is implicit about lots of things!
To name a few:

- for loops and iterators
Already addressed above: nothing implicit there.
- types of variables
There are none, so how could such a nonexisting thing be EITHER implicit
OR explicit? Variables don't HAVE types -- OBJECTS do.

Etc, etc -- can't spend another 1000 lines to explain why your "lots of
things" do not indicate violations of "explicit is better than implicit".

If all you're saying is that naming something is better than not naming
something because explicit is better than implicit, I'd have to ask why:
Sometimes it is (to avoid perilous nesting), sometimes it isn't (to
avoid wanton naming). I generally don't mind naming things, but it IS
surely possible to overdo it -- without going to the extreme below,
just imagine a language where ONLY named argument passing, and no use
of positional arguments, was allowed (instead of naming arguments being
optional, as it is today in Python).

Even your example clearly shows that try block is much more readable and
understandable.
That's why it's being considered evil by majority of python developers.
I don't agree with Mike that try/finally is particularly readable. Yes,
it IS understandable, but its lack of support for [a] an explicit
initialization phase and [b] distinction, when needed, between normal
and abnormal exits, does lead to frequent problems -- e.g.:

try:
    f = open('goo.gah')
    process_file(f)
finally:
    f.close()

this is a FREQUENT bug -- if open fails, and thus f remains unbound,
the finally clause will STILL try to call close on it. Psychologically
it comes naturally to write the initialization INSIDE the try/finally,
but technically it should be outside. Also, when the actual actions
require more than one object, try/finally leads to deep nesting, and
flat is better than nested, e.g.:

fs = open(xx, 'r')
try:
    f1 = open(x1, 'w')
    try:
        f2 = open(x2, 'w')
        try:
            skip_prefix(fs)
            split_from_to(pred, fs, f1, f2)
            add_postfix(f1)
            add_postfix(f2)
        finally:
            f2.close()
    finally:
        f1.close()
finally:
    fs.close()

In a word, "yeurgh".

Not that the introduction of "block syntax" would be a panacea here,
necessarily. But claiming there is no problem with try/finally is,
IMHO, kidding ourselves.
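(One way to flatten that nesting without any new syntax is a helper that opens
everything up front and guarantees the closes in reverse order; a sketch, with
the helper name invented here:)

```python
def call_with_open_files(specs, body):
    # open each (name, mode) pair in order; close every file that was
    # actually opened, in reverse order, even if open() or body() raises
    opened = []
    try:
        for name, mode in specs:
            opened.append(open(name, mode))
        return body(*opened)
    finally:
        for f in reversed(opened):
            f.close()
```

The triple-nested example then flattens to a single call:
`call_with_open_files([(xx, 'r'), (x1, 'w'), (x2, 'w')], process)`.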

Readability is a moving target. I think that the code block syntax
strikes a nice balance between readability and expressiveness. As far as
Maybe. I'm still not sold, though I think I do understand well why
one would WANT a literal form for code blocks. But some of the
use cases you give for 'not naming' -- e.g, a return statement --
just don't sit right with the kind of syntax that I think might help
with many of them (e.g. the 'with' keyword or something similar, to
pass blocks as arguments to callables, ONLY -- that's all Ruby allows,
too, so your 'return' use case above-mentioned wouldn't work all that
well there, either, though its lack of expression/statement split may
perhaps help a little).

If a Pythonic syntax can't be found to solve ALL use cases you've
raised, then the "balance" may be considered not nice enough to
compensate for the obvious problem -- a serious case of MTOWTDI.
what the majority of Python developers consider evil, I don't think
we've got the stats back on that one.
I don't think anybody has "stats", but following python-dev
regularly does give you a pretty good sense of what the consensus
is on what issues (it matters up to a point, since in the end Guido
decides, but, it does matter somewhat). MTOWTDI, for example, is
a dead cert for a chorus of boos -- even when the existing WTDI is
anything but "obvious", e.g. reduce(operator.add, somelist) in 2.2
and before, proposing an obvious alternative, e.g. sum(somelist) in
2.3, is SURE to draw some disagreement (good thing, in this case,
that Guido overruled the disagreement and adopted the 'sum' builtin).

So, the emergence of a way to write, e.g.:

"""
def loop_on_each_byte(filename):
    def looping_callable(block):
        ...block(byte)...
    return looping_callable

with loop_on_each_byte(filename), byte:
    process_byte(byte)
"""

as an OWTDI from

"""
def each_byte(filename):
    ...yield byte...

for byte in each_byte(filename):
    process_byte(byte)
"""

would SURELY draw a well-justified roar. The benefits would have
to be very overwhelming indeed to overcome this issue. But if we
were to support "return this literal code block", for example,
then the code block literal syntax would have to be an expression
rather than a suite -- and I just can't find a good way to do
THAT. And if we don't support use cases which advocates of such
a new construct, like you, quote fondly -- and, I repeat, the
"why should I have to name something I am just going to return"
WAS one of the few use cases for literal code blocks you brought,
even though it's not directly supported in Ruby (or Smalltalk, as
far as I know). So, I suspect there may be no good solution.

But the anonymous version still looks more concise to me.


Python prioritize things diferently than other languages.
It's not an APL. "Readability counts"


This is nothing like APL... if anything, it's like Smalltalk, a language
designed to be readable by children!


Cite pls? I knew that Logo and ABC had been specifically designed
with children in mind, but didn't know that of Smalltalk.
I realize that APL sacrificed
readability for expressiveness to an uncomfortable extreme, but I really
think you're comparing apples and oranges here. List comprehensions are
closer to APL than code blocks.


As an ex-user of APL (and APL2) from way back when, I think you're
both talking through your respective hats: neither list comprehensions
(particularly in the Python variation on a Haskell theme, with
keywords rather than punctuation) nor code blocks resemble APL in the least.
Alex

Jul 18 '05 #295
Paul Rubin wrote:
Kenny Tilton <kt*****@nyc.rr.com> writes:
I think Python's problem is its success. Whenever something is
succesful, the first thing people want is more features. Hell, that is
how you know it is a success. The BDFL still talks about simplicity,
but that is history. GvR, IMHO, should chased wish-listers away with
"use Lisp" and kept his gem small and simple.


That's silly. Something being successful means people want to use it
to get things done in the real world. At that point they start
needing the tools that other languages provide for dealing with the
real world. The real world is not a small and simple place, and small
simple systems are not always enough to cope with it. If GVR had kept
his gem small and simple, it would have remained an academic toy, and
I think he had wider-reaching ambitions than that.


I disagree, somewhat. Simplicity is not just history: it's still a
principle Pythonistas cherish -- and 3.0 will be about restoring some
of that, not by removing "tools you NEED for dealing with the real
world", but at least some of the MTOWTDI that HAS crept in by instead
adding a few of the "tools that *other languages* provide". Sure,
practicality beats purity -- and while simple is better than complex,
complex is better than complicated. I further argue that GvR *HAS*
"kept his [gem? what gem? it's Python, not Ruby!] small and simple" --
not QUITE as small and simple as it might have been kept in the best
of all possible worlds, but still outstandingly so compared with other
languages of comparable power and ease.

And Kenny's suggestion to "chase wish-listers away" is excellent --
one can use Dylan or C# or O'CAML or whatever else as an alternative
to Lisp, if that's what will best get them to stop bleating. Besides,
"if you want PL/I you know where to find it" has nice precedents (in
the only other language which was widely successful in the real world
while adhering to "provide only one way to perform an operation" as
one of its guiding principles -- not perfectly, but, close enough:-).
Alex

Jul 18 '05 #296
"Andrew Dalke" <ad****@mindspring.com> writes:
The tricky thing about using McConnell's book is the implications
of table 31-2 in the section "Using Rapid Development Languages",


This thing has been debunked for years. No one with a clue takes it
seriously. Even the author(s) indicate that much of it is based on
subjective guesses.

/Jon

Jul 18 '05 #297
"Carlo v. Dango" <oe**@soetu.eu> writes:
yes this mail is provocative.. please count slowly to 10 before


Nah, it's not provocative - it's simply stupid.

/Jon
Jul 18 '05 #298
Andrew Dalke wrote:
Doug Tolton:
Yes and I have repeatedly stated that I disagree with it. I simply do
not buy that allowing expressiveness via high level constructs detracts
from the effectiveness of the group. That argument is plainly
ridiculous, if it were true then Python would be worse than Java,
because Python is *far* more expressive.

I disagree with your summary. Compare:

The argument is that expressive power for a single developer can, for
a group of developers and especially those comprised of people with
different skill sets and mixed expertise, reduce the overall effectiveness
of the group.

Notice the "can". Now your summary is:

...allowing expressiveness via high level constructs detracts
from the effectiveness of the group

That implies that at least I assert that *all* high level constructs
detract from group effectiveness, when clearly I am not saying
that.
If this is indeed the crux, then any justification which says "my brain"
and "I" is suspect, because that explicitly ignores the argument.


Apparently you can't read very well. I simply stated that I believe our
point of contention to be that issue, I never stated I believe that
because it's some vague theory inside my head.

Nor can you, because I did not say that. I said that the arguments you
use to justify your assertions could be stronger if you were to include
cases in your history and experience which show that you understand
the impacts of a language feature on both improving and detracting from
a group effort. Since you do have that experience, bring it up. But
since your arguments are usually along the lines of "taking tools out
of your hands", they carry less weight for this topic.

(Ambiguity clarification: "your hands" is meant as 2nd person singular
possessive and not 2nd person plural. :)

No, what I was referring to wasn't estimation. Rather I was referring
to the study that found that programmers on average write the same
number of lines of code per year regardless of the language they write
in.

McConnell's book has the same study, with outliers for assembly
and APL. Indeed, I mentioned this in my reply:
... and that LOC has a
good correlation to development time, excluding extremes like APL
and assembly.


Therefore the only way to increase productivity is to write
software in a language that uses less lines to accomplish something
productive. See Paul Grahams site for a discussion.

I assume you refer to "Succinctness is Power" at
http://www.paulgraham.com/power.html

It does not make as strong a case as you state here. It argues
that "succinctness == power" but doesn't make any statement
about how much more succinct Lisp is over Python. He doesn't
like Paul Prescod's statement, but there's nothing to say that
Python can't be both easier to read and more succinct. (I am
not making that claim, only pointing out that that essay is pure
commentary.)

Note also that it says nothing about group productivity.
If it takes me 5% longer to write a program in language X
then language Y, but where I can more easily use code and
libraries developed by others then it might be a good choice
for me to use a slightly less succinct language.

Why don't people use APL/J/K with its succinctness?

I also disagree with Graham's statement:
the most accurate measure of the relative power of
programming languages might be the percentage of
people who know the language who will take any job
where they get to use that language, regardless of the
application domain.

I develop software for computational life sciences. I would
do so in Perl, C++, Java, even Javascript because I find
the domain to be very interesting. I would need to be very
low on money to work in, say, accounting software, even if
I had the choice of using Python.
You are saying that Python and Perl are similarly compact?!?
You have got to be kidding right?
Perl is *far* more compact than Python is. That is just ludicrous.

Yes. In this I have a large body of expertise by which to compare
things. Perl dominates bioinformatics sofware development, and the
equivalent Python code is quite comparable in size -- I argue that
Python is easier to understand, but it's still about the same size.

It's always nice just to chuck some arbitrary table into the
conversation which conveniently backs some point you were trying to
make, and also conveniently can't be located for anyone to check the
methodology.

"Can't be located"!?!?! I gave a full reference to the secondary material,
included the full quote (with no trimming to bias the table more my way),
gave the context to describe the headings, and gave you a reference
to the primary source! And I made every reasonable effort to find both
sources online.

Since you can't be suggesting that I tracked down and destroyed
every copy of McConnell's book and of the primary literature (to make
it truly unlocatable) then what's your real complaint? That things exist
in the world which aren't accessible via the web? And how is that my
fault?

If you want some real world numbers on program length check here:
http://www.bagley.org/~doug/shootout/

If I want some real world numbers on program length, I do it myself:
http://pleac.sourceforge.net/
I wrote most of the Python code there

Still, since you insist, I went to the scorecard page and changed
the weights to give LOC a multipler of 1 and the others a multiplier
of 0. This is your definition of succinctness, yes? This table
is sorted (I think) by least LOC to most.

SCORES
Language Implementation Score Missing
Ocaml ocaml 584 0
Ocaml ocamlb 584 0
Ruby ruby 582 0
Scheme guile 578 0
Python python 559 0
Pike pike 556 0
Perl perl 556 0
Common Lisp cmucl 514 0
Scheme bigloo 506 1
Lua lua 492 2
Tcl tcl 478 3
Java java 468 0
Awk mawk 457 6
Awk gawk 457 6
Forth gforth 449 2
Icon icon 437 7
C++ g++ 435 0
Lisp rep 427 3
Haskell ghc 413 5
Javascript njs 396 5
Erlang erlang 369 8
PHP php 347 9
Emacs Lisp xemacs 331 9
C gcc 315 0
SML mlton 284 0
Mercury mercury 273 8
Bash bash 264 14
Forth bigforth 264 10
SML smlnj 256 0
Eiffel se 193 4
Scheme stalin 131 17

So:
- Why aren't you using Ocaml?
- Why is Scheme at the top *and* bottom of the list?
- Python is right up there with the Lisp/Scheme languages
- ... and with Perl.

Isn't that conclusion in contradiction to your statements
that 1) "Perl is *far* more compact than Python is" and 2)
the implicit one that Lisp is significantly more succinct than
Python? (As you say, these are small projects .. but you did
point out this site so implied it had some relevance.)

I just don't buy these numbers or the chart from McConnell on faith. I
would have to see his methodology, and understand what his motivation in
conducting the test was.

I invite you to dig up the original paper (which wasn't McConnell)
and enlighten us. Until then, I am as free to agree with McConnell --
more so because his book is quite good and comprehensive with
sound arguments comparing and contrasting the different
approaches and with no strong hidden agenda that I can detect.

It still wasn't relevant to Macros. However, because neither of you
understand Macros, you of course think it is relevant.

My lack of knowledge not withstanding, the question I pose to
you is, in three parts:
- is it possible for a language feature to make a single programmer
more expressive/powerful while hindering group projects?

Yes I believe this to be the case. However in my own experience even
working with language such as Visual Basic and Java (which are far less
expressive than Python), people give me code that is so obfuscated that
it could compete in an obfuscated Perl contest.

In my experience, it hasn't been expressiveness per se that caused the
most problems. It has been lack of familiarity with sound software
engineering concepts, or more specific lack of experience building real
world applications.

So the short answer is that *any* operator or feature used incorrectly
can cause massive confusion. I've seen this with simple constructs such
as loops (ever seen seven nested loops doing different things at
different levels? It can be ugly).

- can you list three examples of situations where that's occurred?

Hmm, does every time I've read someone else's code count? ;)
In seriousness, I have yet to be on any serious project where someone
doesn't do something that I disagree with. Personally though, I haven't
run across a problem where a cleanly implemented abstraction (ie class,
macro, HOF or metaclass) has caused a loss of productivity. In my
experience it has been quite the opposite.

Most of the development teams that I've worked on have gravitated
towards two groups. Those who write utilities and substrates for the
development framework, and those who consume them. This has happened
even if not specified by management, simply because those with the
ability to write reusable abstractions end up doing it a lot. I have
personally seen on numerous occasions development speed up greatly when
the proper high level constructs were in place.
- can you list one example where the increased flexibility was, in
general, a bad idea? That is, was there a language which would
have been better without a language feature.

I don't necessarily believe that to be the case. Certainly I can list
cases where utilizing a certain feature for a certain problem has been a
bad idea. That doesn't generalize to the language being better without
the feature, though. For that to be the case, IMO, there would have to
be *no* redeeming value to the feature, or its use would have to be so
massively problematic that it nearly always causes problems.

I can't think of any feature off hand where I would say "take it out of
the language, that's just stupid". Perhaps there are some, and I'm just
missing them while I'm thinking about it.

One example of mis-use that caused some serious headaches:
Back in 1999 I was lead on a team building a heavy duty enterprise web
application. Management decided that our best choice was to use Visual
Basic and MTS. The system had to be scalable, it had to be extremely
fault tolerant and it had to be very flexible. The architecture
initially decided upon was to have three web servers, two application
servers and a fully fault tolerant sql server.

Based on the initial reports from MS we decided to test DCOM from the
web servers to the application servers (remember when that was the big
fad?). We quickly found out that our performance was terrible, and
couldn't scale to support our minimum required users. Switching things
around we re-configured and went with five web servers each running the
MTS components locally.

Another problem we ran into was that we decided to test out the XML
hype. All of our messaging between objects and between systems was sent
via XML payloads. This turned out to be extremely slow, and we ended up
ripping out most of the XML messaging guts in order to speed up the system.

We also encountered serious problems with people not knowing how to
efficiently utilize a SQL Server. For instance they would get a
recordset from each table (rather than joining) and then loop through
each recordset comparing the values and constructing their resultset.
Rewriting the queries to properly utilize joins and where clauses
yielded several orders of magnitude performance increases.
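The rewrite described in that anecdote can be sketched with `sqlite3` from the Python standard library. The table and column names here are invented for illustration; the point is the contrast between fetching one recordset per table and looping in application code versus letting the database do the join.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, name TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO orders VALUES (10, 1), (11, 2), (12, 1);
""")

# Anti-pattern from the anecdote: one recordset per table,
# matched up by nested loops in application code.
orders = conn.execute("SELECT id, customer_id FROM orders").fetchall()
customers = conn.execute("SELECT id, name FROM customers").fetchall()
slow = [(o_id, name)
        for (o_id, cid) in orders
        for (c_id, name) in customers
        if cid == c_id]

# Letting the database join produces the same result in one query,
# and lets the engine use its indexes and query planner.
fast = conn.execute("""
    SELECT o.id, c.name
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
""").fetchall()

assert sorted(slow) == sorted(fast)
```

On an in-memory toy like this the difference is invisible; on the real tables described above, pushing the join into the server was worth several orders of magnitude.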

Note that I did not at all make reference to macros. Your statements
to date suggest that your answer to the first is "no."

That's not exactly my position, rather my position is that just about
anything can and will be abused in some way shape or fashion. It's a
simple fact of working in teams. However I would rather err on the side
of abstractability and re-usability than on the side of forced restrictions.

Jul 18 '05 #299
In article <bm**********@bob.news.rcn.net>, "Vis Mike"
<visionary25@_nospam_hotmail.com> wrote:
"Erann Gat" <my************************@jpl.nasa.gov> wrote in message
news:my*****************************************@k-137-79-50-101.jpl.nasa.gov...
In article <xc*************@famine.OCF.Berkeley.EDU>,
tf*@famine.OCF.Berkeley.EDU (Thomas F. Burdick) wrote:
> method overloading,

How could you have both noncongruent argument lists, and multiple
dispatch?


C++ seems to manage it somehow.

#include <stdio.h>

void foo(int x, int y) { printf("1\n"); }
void foo(double x, int y) { printf("2\n"); }
void foo(char* x) { printf("3\n"); }

int main() {
foo(1,2);
foo(1.2,2);
foo("foo");
}

compiles and runs without complaint.

E.


Ahh, but overloading only works at compile time:


That's irrelevant. When it happens doesn't change the fact that this
proves it (multiple dispatch with non-congruent arglists) is possible.
Nothing prevents you from using the same algorithm at run time.

E.
Jul 18 '05 #300
