
Python vs. Lisp -- please explain

Hi, I've been thinking about Python vs. Lisp. I've been learning
Python the past few months and like it very much. A few years ago I
had an AI class where we had to use Lisp, and I absolutely hated it,
having learned C++ a few years prior. They didn't teach Lisp at all
and instead expected us to learn on our own. I wasn't aware I had to
uproot my thought process to "get" it and wound up feeling like a
moron.

In learning Python I've read more about Lisp than when I was actually
trying to learn it, and it seems that the two languages have lots of
similarities:

http://www.norvig.com/python-lisp.html

I'm wondering if someone can explain to me please what it is about
Python that is so different from Lisp that it can't be compiled into
something as fast as compiled Lisp? From this above website and
others, I've learned that compiled Lisp can be nearly as fast as C/C++,
so I don't understand why Python can't also eventually be as efficient?
Is there some *specific* basic reason it's tough? Or is it that this
type of problem in general is tough, and Lisp has 40+ years vs Python's
~15 years?
Thanks
Michael

Feb 19 '06
Hi there!

Carl Friedrich Bolz <cf****@gmx.de> writes:
Torsten Bronger wrote:
[...]

My definition would be that an interpreted language has in its
typical implementation an interpreting layer necessary for
typical hardware. Of course, now we could discuss what is
"typical", however, in practice one would know it, I think. In
case of Python: CPython and all important modern processors.
Well, if we take any modern Intel/AMD chip (which could be
described as "typical"), a C++ program would fit the "interpreted"
definition, since the processor does not execute the machine code
directly but rather breaks it down into smaller microcode
instructions -- a process that could be described as interpretation.


This is an interpreting layer within the hardware, not necessary for
it.
Another problem with the definition: what would you call a C++
program that is running on top of an emulator?
Compiled. I said "necessary for typical hardware".
[...] I think that the distinction between "interpreted" and
"compiled" (whatever both mean) is really just not sensible at
all.


The question is whether such features have to be considered when
choosing the right tool for a task. I think so, yes. Whereas C is
very close to the fastest code you can get because it works very
closely to how the machine itself works, Python can well be one or
one and a half orders of magnitude farther away. No problem, since
you can get the best of both worlds, but you must be aware of it.

Bye,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus ICQ 264-296-646
Feb 20 '06 #51
"Fredrik Lundh" <fr*****@pythonware.com> writes:
Alexander Schmolck wrote:
My point was that Guido probably (and fortunately!) was unaware of the extent
to which you can have both dynamism and speed

For the convenience of other readers, allow me to restore the snipped second
half of that sentence: "... and the extent to which very
dynamic languages are suitable for writing robust software."
and my point was that choosing to ignore something doesn't mean
that you're ignorant.
Interviewer: "You said originally you thought 500 lines would be a big
Python program."
Guido van Rossum: "That was just my lack of imagination."

<http://www.artima.com/intv/pyscale3.html>
Guido van Rossum: "Another thing, much farther in the future, is
compilation to C or machine code. I used to think that this was impossible
and (perhaps because of that) uninteresting, but recent experiments (like
Armin Rigo's Psyco and Greg Ewing's Pyrex) suggest that this will
eventually be possible. It should provide Python with an incredible
performance boost and remove many of the reasons why many people are still
reluctant to switch to Python."

<http://www.onlamp.com/pub/a/python/2002/06/04/guido.html?page=1>

(but since you keep repeating this nonsense, it's clear that you're
pretty ignorant wrt. software design. too much CS exposure?).


Indeed. Your amazing reading comprehension and lucid argumentation would
obviously be lost on my posts.

'as
Feb 20 '06 #52
Quoth Carl Friedrich Bolz <cf****@gmx.de>:
| Torsten Bronger wrote:
|> Well, I think that it's fair to say that there are in principle deep
|> run time differences between CPython and, say, a typical
|> C++-compiled program. Your definition would not reproduce that. I
|> think it's also fair to say that these differences should be known
|> if somebody tries to find the best tool for a job. After all, they
|> include advantages, too.
|>
|> My definition would be that an interpreted language has in its
|> typical implementation an interpreting layer necessary for typical
|> hardware. Of course, now we could discuss what is "typical",
|> however, in practice one would know it, I think. In case of Python:
|> CPython and all important modern processors.
|
| Well, if we take any modern Intel/AMD chip (which could be described as
| "typical"), a C++ program would fit the "interpreted" definition, since
| the processor does not execute the machine code directly but rather
| breaks it down into smaller microcode instructions -- a process that
| could be described as interpretation.

That's irrelevant, though. Note, "has in its typical implementation".
Your processor didn't come along with your C++ compiler, it's not part
of its implementation or even necessarily relevant to it - maybe some
potential for optimization, but the only hard and fast requirement is
that the processor must execute its instruction set as documented.

The reason this isn't just an abstruse philosophical argument where it
makes sense for us to obtusely cling to some indefensible point of view,
is that as the man points out, there are differences that we can't hide
forever from potential Python users. The most obvious to me is that
your Python program essentially includes its interpreter - can't go anywhere
without it, and any change to the interpreter is a change to the program.
There are various strategies to address this, but pretending that Python
isn't interpreted is not one of them.

Donn Cave, do**@drizzle.com
Feb 20 '06 #53
In article <yf*************@oc.ex.ac.uk>,
Alexander Schmolck <a.********@gmail.com> wrote:
However I don't find it at all implausible to assume that had Guido known all
the stuff that say, David Ungar and Guy Steele were aware of at the same time,
python would have come out not necessarily less dynamic but considerably
faster -- to its own detriment.

'as

Alexander, you've lost me. I *think* you're proposing that,
were Guido more knowledgeable, he would have created a Python
language that's roughly as we know now, implemented it with
FASTER software ... and "to its own detriment". Do you truly
believe that fewer people would use Python if its execution
were faster?
Feb 20 '06 #54
Bruno Desthuilliers <bd*****************@free.quelquepart.fr> writes:
Alexander Schmolck a écrit :
Bruno Desthuilliers <bd*****************@free.quelquepart.fr> writes:
DH a écrit :
(snip)

It is by design. Python is dynamically typed. It is essentially an
interpreted scripting language like javascript or ruby or perl,
It's not a "scripting" language, and it's not interpreted.

Of course it is. What do you think happens to the bytecode?

Ok, then what do you think happens to 'machine' code ?


It gets interpreted (by a CPU, for example) -- your point being? If it is that
my definition of interpreted language would be overly inclusive, note the
difference: interpreting bytecode is a task that at least arguably still
belongs to the python language proper (and there even is a public bytecode
interface), whereas interpreting machine code typically falls outside the
tasks and specifications of the language that generated it. In any event,
I didn't make any claims that some particular language is *not* interpreted.

"interpreted" usually means "no compilation, all parsing etc redone at each
execution", which is not the case with a bytecode/vm based implementation.
When people (including the python website maintainers) talk about interpreted
languages they often include all languages that don't compile to machine code.
"Interpreted language" is not a terribly well defined concept and the usage of
words also changes to reflect general trends. Since pretty much no language of
any significance today is still interpreted in the sense you hold to be
canonical, when people talk about interpreted languages nowadays they (IMHO)
typically mean something else. This is just Grice in action.
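
To make the compile-once point concrete, here is a minimal sketch using the
built-in compile() and exec; the compiled statement is just an illustrative
stand-in:

    # Parsing and compilation to bytecode happen exactly once, here:
    code = compile("x = a + b", "<session>", "exec")

    # Re-running the code object interprets only the compiled bytecode;
    # nothing is re-parsed on later executions.
    for a, b in [(1, 2), (3, 4)]:
        ns = {"a": a, "b": b}
        exec(code, ns)
        print(ns["x"])
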
And if python
isn't a scripting language, then what on earth is?

bash is a scripting language for *n*x systems. javascript is a scripting
language for web browsers. VBScript is a scripting language for MS
applications.


Python is also a scripting language for *n*x systems and various applications.
You might want to argue about whether scripting language is a meaningful and
useful concept,


A scripting language is a language whose main purpose is to be embedded in an
application to provide the user a way of programmatically automating some
tedious tasks.


A few lines ago bash was also a scripting language but I can't personally
recall ever seeing bash embedded in some application.
Now you could of course argue about what is an application...
but it's really hard to see how you could talk about "scripting
languages" without including python.
Oh really? How many applications use Python as their scripting language?


I'd guess dozens. Several GNOME and KDE programs, some scientific software, a
number of games and other commercial software such as UML editors.
And how many applications are written in Python?
No idea. My guess would be that there are far fewer high-profile end user
applications written in python than embedding python, though.
Python *can* be used as a scripting language (and is not too bad at it), but
it *is* not a scripting language.

Interviewer: "Why did you create Python in the first place?"

Guido: "We wanted Amoeba to be as useful as Unix for our daily work, but it was
lacking a scripting language. So I set out to design my own."

Summary of my position: it's futile pedantry trying to correct someone who
claims that python is an interpreted scripting language, since neither of
these terms is terribly well defined, your definitions don't even seem to be
internally fully consistent, and they certainly have the appearance of
disagreeing with those of the creator of the language.

If you dislike people referring to python as "interpreted" you can always add
that it compiles to bytecode. But the poster you corrected was already aware
of that and mentioned it himself.

'as
Feb 20 '06 #55
"Donn Cave" <do**@drizzle.com> writes:
Quoth Alexander Schmolck <a.********@gmail.com>:
| "Fredrik Lundh" <fr*****@pythonware.com> writes:
...
|> the only even remotely formal definition I've ever seen is "language
|> designed to script an existing application, with limited support for
|> handling its own state".
|
|> Early Tcl and JavaScript are scripting languages, Python is not.
|
| Right. Which shows that by this definition scripting language is not a
| meaningful and useful concept. No one will understand you correctly when you
| refer to "scripting language" and mean only something like the above -- and
| unless you spend a lot of your time talking about early tcl and early
| javascript I doubt you'd need a word for it, either.

Oddly enough, that's what I understand it to mean, too, so you can't
strictly say "no one".
Really? If someone talked about scripting languages without further
qualifications and much context would you automatically take that to exclude
modern Javascript, modern Tcl and python, perl, ruby etc.?

Or would you just think to yourself 'Ah, probably again someone who uses
"scripting language" imprecisely or incorectly'?

In pretty much any case I can think of, all somewhat prominent languages, even
those that started out purely in the context of scripting one particular
application (such as arguably javascript, although that sense of scripting is
definitely again distinct from the sense "providing application customization
and extension by users"), have by now usurped other tasks and don't fall
strictly under the given definition anymore. So my impression is that since
"scripting languages" as above would only refer to a very obscure set of
programming languages, almost no one uses the term strictly in that way.

This meaning can always be expressed by "application (specific) scripting
language" -- but what would you use to refer to "perl, python, ruby et al"?
On the other hand, I think it's obvious that a language like Python could
be used for scripting, without having been specifically designed for it as
described above.

Interviewer: "Why did you create Python in the first place?"

Guido: "We wanted Amoeba to be as useful as Unix for our daily work, but it was
lacking a scripting language. So I set out to design my own."

So Guido certainly designed it as a "scripting language", but since the term
is so vague, he might well have meant something rather different by it than
you do.

There's an ambiguity in the phrase, out of context - I can say "Python can
serve as a scripting language for some applications", but not "Python is a
scripting language!", since its place in the taxonomy of languages would be
somewhere else.


I definitely agree that scripting language is rather ambiguous.

'as
Feb 20 '06 #56
Quoth Alexander Schmolck <a.********@gmail.com>:
| Bruno Desthuilliers <bd*****************@free.quelquepart.fr> writes:
....
|> bash is a scripting language for *n*x systems. javascript is a scripting
|> language for web browsers. VBScript is a scripting language for MS
|> applications.
|
| Python is also a scripting language for *n*x systems and various applications.
|
|> A scripting language is a language whose main purpose is to be embedded in
|> an application to provide the user a way of programmatically automating
|> some tedious tasks.
|
| A few lines ago bash was also a scripting language but I can't personally
| recall ever seeing bash embedded in some application.

UNIX! The Bourne shell is exactly what M. Desthuilliers describes, for
the UNIX operating system. Python isn't at all - not that you can't
rename a file, change working directory, execute some processes and do
all that stuff from pure Python, but there's a clear difference in focus.
Those things the shell does rather well, at anything else it's pathetic.

Of course, scripting is naturally the domain of interpreted languages,
and most scripts are trivial in size and complexity. I guess this is
why for some people, "scripting language" just means "interpreted and
suited to writing trivial programs." It's hard to believe they're
thinking very hard about what they're saying, but so what's new?

Donn Cave, do**@drizzle.com
Feb 20 '06 #57
In article <eq************@lairds.us>, I wondered:
In article <yf*************@oc.ex.ac.uk>,
Alexander Schmolck <a.********@gmail.com> wrote:
.
.
.
However I don't find it at all implausible to assume that had Guido known all
the stuff that say, David Ungar and Guy Steele were aware of at the same time,
python would have come out not necessarily less dynamic but considerably
faster -- to its own detriment.

'as

Alexander, you've lost me. I *think* you're proposing that,
were Guido more knowledgeable, he would have created a Python
language that's roughly as we know now, implemented it with
FASTER software ... and "to its own detriment". Do you truly
believe that fewer people would use Python if its execution
were faster?


I think I can answer my own question: yes. Since posting, I came
across a different follow-up where Alexander explains that he sees
healthy elements of the Python ethos--focus on a reliable, widely-
used library, willingness to make Python-C partnerships, and so
on--as resulting at least in part from early acceptance of Python as
intrinsically slow. That's too interesting an argument for me to
respond without more thought.
Feb 20 '06 #58
On Mon, 20 Feb 2006 05:18:39 -0800, Kay Schluehr wrote:
What's far more interesting to me, however, is that I think there are good
reasons to suspect python's slowness is more of a feature than a flaw: I'd not
be surprised if on the whole it greatly increases programmer productivity and
results in clearer and more uniform code.


Yes, it's Guido's master plan to lock programmers into a slow language
in order to dominate them for decades. Do you also believe that Al
Qaida is a phantom organization of the CIA, founded by neocons in the
early '90s, who planned to invade Iraq?


Of course not. The alternative, that Osama has been able to lug his
dialysis machine all over the Pakistani and Afghan mountains without being
detected for four years, is *much* more believable. *wink*

I don't think it was the poster's implication that Guido deliberately
created a slow language for the sake of slowness. I think the implication
was more that Guido made certain design choices that increased
productivity and code clarity. (That much is uncontroversial.) Where the
poster has ruffled some feathers is his suggestion that if Guido had only
known more about the cutting edge of language design from CS, Python would
have been much faster, but also much less productive, clear and popular.

I guess the feather ruffling is because of the suggestion that Guido
merely _didn't_know_ about language features that would have increased
Python's speed at the cost of productivity, rather than deliberately
chose to emphasise productivity at the expense of some speed.

--
Steven.

Feb 20 '06 #59
Kay Schluehr wrote:
Yes, it's Guido's master plan to lock programmers into a slow language
in order to dominate them for decades. Do you also believe that Al
Qaida is a phantom organization of the CIA, founded by neocons in the
early '90s, who planned to invade Iraq?


Actually, it was created by Bruce Lee, who is not dead but working
undercover for the Hong Kong police to fight the Chinese triads. At
this point you might wonder what Bruce Lee has to do with Al Qaida.
Well my friend, if you don't understand this, you don't get it at all!
(Hint: he's a C++ hacker who wanted to build a tracking system for
terrorists in the Middle East, but the project got halted when Guido
van Rossum, whose real name is Abdul Al Wazari, convinced him to use
python). Now the system is so, so slow that Bin Laden never gets
caught!

El Loco

Feb 20 '06 #60
Hi there!

cl****@lairds.us (Cameron Laird) writes:
In article <eq************@lairds.us>, I wondered:
[...] Do you truly believe that fewer people would use Python if
its execution were faster?


I think I can answer my own question: yes. Since posting, I came
across a different follow-up where Alexander explains that he sees
healthy elements of the Python ethos--focus on a reliable, widely-
used library, willingness to make Python-C partnerships, and so
on--as resulting at least in part from early acceptance of Python as
intrinsically slow. That's too interesting an argument for me to
respond without more thought.


I was rather stunned, too, when I read his line of thought.
Nevertheless, I think it's not pointless, albeit formulated in an
awkward way. Of course, Python has not been deliberately slowed
down.

I don't know how strong the effect is whereby a language design that
doesn't allow for (easy to implement) fast execution speed makes you
write cleaner code; however, I must say that I feel so, too. When I
came from C++ to Python I had to find my Pythonic style, and part of
it was that I threw away those ubiquitous little optimisations and
concentrated on formulating my idea in a clear and expressive way.

By the way, this is my main concern about optional static typing: It
may change the target group, i.e. it may move Python closer to those
applications where speed really matters, which again would have an
effect on what will be considered Pythonic.

Bye,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus ICQ 264-296-646
Feb 20 '06 #61
Steven D'Aprano wrote:
On Mon, 20 Feb 2006 05:18:39 -0800, Kay Schluehr wrote:
What's far more interesting to me, however, is that I think there are good
reasons to suspect python's slowness is more of a feature than a flaw: I'd not
be surprised if on the whole it greatly increases programmer productivity and
results in clearer and more uniform code.
Yes, it's Guido's master plan to lock programmers into a slow language
in order to dominate them for decades. Do you also believe that Al
Qaida is a phantom organization of the CIA, founded by neocons in the
early '90s, who planned to invade Iraq?


Of course not. The alternative, that Osama has been able to lug his
dialysis machine all over the Pakistani and Afghan mountains without being
detected for four years, is *much* more believable. *wink*


Osama? Who is Osama? A media effect, a CNN invention.
I don't think it was the poster's implication that Guido deliberately
created a slow language for the sake of slowness. I think the implication
was more that Guido made certain design choices that increased
productivity and code clarity. (That much is uncontroversial.) Where the
poster has ruffled some feathers is his suggestion that if Guido had only
known more about the cutting edge of language design from CS, Python would
have been much faster, but also much less productive, clear and popular.

I guess the feather ruffling is because of the suggestion that Guido
merely _didn't_know_ about language features that would have increased
Python's speed at the cost of productivity, rather than deliberately
chose to emphasise productivity at the expense of some speed.

--
Steven.


Alexander's hypothesis is completely absurd. It turned out over the
years that the capabilities of Python optimization are lower than those of
Lisp and Smalltalk. But it's a system effect and epiphenomenon of
certain design decisions. This might change with PyPy - who knows? The
Lisp/Smalltalk design is ingenious, radical and original, and both
languages were considered too slow for many real-world applications
for decades. But no one has ever claimed that Alan Kay intentionally
created a slow language in order to hold the herd together - and it
accidentally turned out to be reasonably fast with JIT technology in
the late 90s.

Smalltalk was killed by Java, while Lisp was killed by the paradigm
shift to OO in the early 90s. The IT world has absolutely no interest
in the hobby horses of computer scientists or language lovers (like
me). It is consolidating around a small set of Algol successors,
namely C, C++, Java and C#, and some dynamically typechecked languages
like Perl, Python and Ruby that play nicely with the bold mainstream
languages as their more flexible complement. A conspiracy-like theory
to explain what's going on is needless.

Kay

Feb 20 '06 #62
On Mon, 20 Feb 2006 16:54:34 +0000, Donn Cave wrote:
The reason this isn't just an abstruse philosophical argument where it
makes sense for us to obtusely cling to some indefensible point of view,
is that as the man points out, there are differences that we can't hide
forever from potential Python users. The most obvious to me is that
your Python program essentially includes its interpreter - can't go anywhere
without it, and any change to the interpreter is a change to the program.
There are various strategies to address this, but pretending that Python
isn't interpreted is not one of them.


Python programs (.py files) don't contain an interpreter. Some of those
files are *really* small, you can't hide a full blown Python interpreter
in just half a dozen lines.

What you mean is that Python programs are only executable on a platform
which contains the Python interpreter, and if it is not already installed,
you have to install it yourself.

So how exactly is that different from the fact that my compiled C program
is only executable on a platform that already contains the correct machine
code interpreter, and if that interpreter is not already installed, I have
to install a machine code interpreter (either the correct CPU or a
software emulator) for it?

Moving compiled C programs from one system to another one with a different
CPU also changes the performance (and sometimes even the semantics!) of
the program. It is easier for the average user to change the version of
Python than it is to change the CPU.

Nobody denies that Python code running with no optimization tricks is
(currently) slower than compiled C code. That's a matter of objective
fact. Nobody denies that Python can be easily run in interactive mode.
Nobody denies that *at some level* Python code has to be interpreted.

But ALL code is interpreted at some level or another. And it is equally
true that at another level Python code is compiled. Why should one take
precedence over the other?

The current state of the art is that the Python virtual machine is slower
than the typical machine code virtual machine built into your CPU. That
speed difference is only going to shrink, possibly disappear completely.

But whatever happens in the future, it doesn't change two essential facts:

- Python code must be compiled to execute;
- machine code must be interpreted to execute.

Far from being indefensible, philosophically there is no difference
between the two. Python and C are both Turing Complete languages, and both
are compiled, and both are interpreted (just at different places).

Of course, the difference between theory and practice is that in theory
there is no difference between theory and practice, but in practice there
is. I've already allowed that in practice Python is slower than machine
code. But Python is faster than purely interpreted languages like bash.

Consider that Forth code can be as fast as (and sometimes faster than) the
equivalent machine code despite being interpreted. I remember an
Apple/Texas Instruments collaborative PC back in the mid to late 1980s
with a Lisp chip. Much to their chagrin, Lisp code interpreted on a
vanilla Macintosh ran faster than compiled Lisp code running on their
expensive Lisp machine. Funnily enough, the Apple/TI Lisp machine sank
like a stone.

So speed in and of itself tells you nothing about whether a language is
interpreted or compiled. A fast interpreter beats a slow interpreter,
that's all, even when the slow interpreter is in hardware.

Describing C (or Lisp) as "compiled" and Python as "interpreted" is to
paint with an extremely broad brush, both ignoring what actually happens
in fact, and giving a false impression about Python. It is absolutely true
to say that Python does not compile to machine code. (At least not yet.)
But it is also absolutely true that Python is compiled. Why emphasise the
interpreter, and therefore Python's similarity to bash, rather than the
compiler and Python's similarity to (say) Java or Lisp?
--
Steven.

Feb 20 '06 #63
Torsten Bronger <br*****@physik.rwth-aachen.de> writes:
I was rather stunned, too, when I read his line of thought.
Nevertheless, I think it's not pointless, albeit formulated in an
awkward way. Of course, Python has not been deliberately slowed
down.


Indeed -- and I'm really not sure what defect in someone's reading skills or
my writing skills would make anyone think I suggested this.

'as
Feb 20 '06 #64
Torsten Bronger wrote:

By the way, this is my main concern about optional static typing: It
may change the target group, i.e. it may move Python closer to those
applications where speed really matters, which again would have an
effect on what will be considered Pythonic.


Yes, I think that with optional static typing, it's quite likely that
we would see lots of unnecessary declarations and less reusable code
("ints everywhere, everyone!"), so I think the point about not
providing people with certain features is a very interesting one, since
people have had to make additional and not insignificant effort to
optimise for speed. One potential benefit is that should better tools
than optional static typing be considered and evaluated, the "ints
everywhere!" line of thinking could prove to be something of a dead end
in all but the most specialised applications. Consequently, the Python
platform could end up better off, providing superior tools for
optimising performance whilst not compromising the feel of the language
and environment.

Paul

Feb 20 '06 #65
"Kay Schluehr" <ka**********@gmx.net> writes:
Alexander's hypothesis is completely absurd.
You're currently not in the best position to make this claim, since you
evidently misunderstood what I wrote (I certainly did not mean to suggest that
Guido *deliberately* chose to make python slow; quite the opposite in fact).

Maybe I wasn't sufficiently clear, so if rereading my original post doesn't
bring about enlightenment, I'll try a restatement.
It turned out over the years that the capabilities of Python optimization are
lower than those of Lisp and Smalltalk. But it's a system effect and
epiphenomenon of certain design decisions.
The point is that the design decisions, certainly for Common Lisp, Scheme and
particularly for Dylan, were also informed by what could be done
*efficiently*, because the people who designed these languages knew a lot
about advanced compiler implementation strategies for dynamic languages and
thought that they could achieve high levels of expressiveness whilst retaining
the possibility of very fast implementations (IIRC Dylan specifically was
meant to be something like within 90% of C performance). CL and Dylan were
also specifically designed for building very large and sophisticated systems,
whereas it seems Guido originally thought that python would scale to about 500
LOC.
This might change with PyPy - who knows? The Lisp/Smalltalk design is
ingenious, radical and original, and both languages were considered too slow
for many real-world applications for decades. But no one has ever claimed
that Alan Kay intentionally created a slow language in order to hold the
herd together - and it accidentally turned out to be reasonably fast with
JIT technology in the late 90s.
I'm pretty sure this is wrong. Smalltalk and Lisp were both quite fast and
capable before JIT technology came along in the 90s -- just not necessarily
on hardware optimized for C-like languages, partly because no one anticipated
that the x86 and co. would become so dominant (I also roughly remember Alan
Kay expressing his frustration not too long ago over the fact that despite a
50,000-fold increase in processing speed, current hardware would only run
early Smalltalk 100x faster than the Lisa -- I almost certainly misremember
the details but you get the picture).
A conspiracy like theory used to explain what's going on is needless.


Indeed.

'as
Feb 20 '06 #66
63*******@sneakemail.com writes:
I'm wondering if someone can explain to me please what it is about
Python that is so different from Lisp that it can't be compiled into
something as fast as compiled Lisp? From this above website and
others, I've learned that compiled Lisp can be nearly as fast as C/C++,
so I don't understand why Python can't also eventually be as efficient?
Is there some *specific* basic reason it's tough?


The issues of compiling Python and compiling Lisp are similar. Lisp
implementers tend to care about performance more, so Lisp tends to be
compiled. There's a Python compiler called Psyco which can be used
with CPython and which will be part of PyPy. I'd expect its output
code to be comparable to compiled Lisp code.
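
For what it's worth, classic Psyco usage amounts to a couple of lines; here
is a minimal sketch, assuming Psyco is installed (it only ran on 32-bit x86
builds of CPython, and hot_loop is a made-up example function):

    import psyco

    # Either let Psyco profile and specialize whatever it finds worthwhile...
    psyco.full()

    # ...or bind just the function you know is hot:
    def hot_loop(n):
        total = 0
        for i in range(n):
            total += i * i
        return total

    psyco.bind(hot_loop)
    print(hot_loop(10 ** 6))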
Feb 21 '06 #67
Steven D'Aprano <st***@REMOVETHIScyber.com.au> writes:
efficient? Is there some *specific* basic reason it's tough? Or is it
that this type of problem in general is tough, and Lisp has 40+ years
vs Python's ~15 years?
It is by design.

Python is not slow by design. Python is dynamically typed by design, and
relative slowness is the trade-off that has to be made to give dynamic
types.


I think both of you are missing the point of the question, which is
that Lisp is dynamically typed exactly the way Python is and maps to
Python almost directly; yet good Lisp implementations are much faster
than CPython.
The Python developers have also done marvels at speeding up Python since
the early days, with the ultimate aim of the PyPy project to make Python
as fast as C, if not faster. In the meantime, the question people should
be asking isn't "Is Python fast?" but "Is Python fast enough?".


That is the real answer: CPython doesn't reach performance parity with
good Lisp implementations, but is still fast enough for lots of
purposes. Psyco and PyPy are ongoing efforts to close the performance
gap and which are showing promise of success.
Feb 21 '06 #68
"Michele Simionato" <mi***************@gmail.com> writes:
Alexander Schmolck wrote:
As common lisp and scheme demonstrate you can have a high level of dynamism (and
in a number of things both are more dynamic than python) and still get very
good performance (in some cases close to or better than C).


Just for personal enlightenment, in what ways do you think Lisp is more
dynamic than Python?
Can you name a few features?


Sure (I'm assuming that by lisp you mean common lisp):

- development model

- by default the development model is highly interactive and you can
redefine all sorts of stuff (functions, methods, classes even the syntax)
without having to start over from scratch (you can also save the current
state of affairs as an image). By contrast changing modules or classes in
interactive python sessions is rather tricky to do without screwing things
up, and it doesn't support images, so sessions tend to be much more
short-lived.

Emacs+slime also offers more powerful functionality than emacs+python mode
(autocompletion, jumping to definitions, various parallel sessions or
remotely connecting to running applications via sockets etc.)

- hot-patching

- partly as a consequence of the above, you can also arrange to hot-patch
some running application without too much trouble (IIRC Graham claims in
one of his essays that one of his favorite pranks was fixing a bug in the
application whilst on the phone to the customer as she reported it and
then telling her that everything in fact seemed to work fine and to try
again)

- error handling:

- by default you end up in the debugger if something goes wrong and in many
cases you can correct it in place and just continue execution (if you've
ever had some long numerical computation die on plotting the results
because of an underflow as you exponentiate to plot in transformed
coordinates, you'd appreciate being able to just return 0 and continue)

- Apart from "usual" exception handling CL has much more powerful resumable
exceptions that offer far more fine-grained error handling possibilities.

For more info let me recommend the chapter from Peter Seibel's excellent
"practical common lisp" (available in print, and also freely online):

<http://www.gigamonkeys.com/book/beyond-exception-handling-conditions-and-restarts.html>

(The whole book is a great resource for people who want to have a quick
play with Common Lisp and see how its features can be leveraged for real
applications (such as html creation, or writing an mp3 server or id3
parser). Peter also makes an easy to install, preconfigured "lispbox"
bundle of Emacs + a free lisp implementation available on his web page).

- OO:

- You know this already, but I'd argue that multimethods and method
combinations give you more dynamism than class-centric OO.

- Also, if you change the definition of a class all existing instances will
be updated automatically. You can get a similar effect in python by
laboriously mutating the class (see the sketch after this list), provided
it doesn't use __slots__ etc., but that is more brittle and also more
limited -- for example, in CL you can specify what happens to instances
when the class is updated.

- chameleon like nature:

It's much easier to make CL look like something completely different (say prolog
or some indentation based, class-centric language like python) than it would
be with python. In particular:

- there are no reserved keywords

- you can e.g. implement new syntax like python style """-strings easily
(I've done so in fact) with reader macros. Indentation based syntax should
also be possible along the same lines, although I haven't tried.

- you can introduce pretty much any sort of control structure you might
fancy and you can carry out very sophisticated code transformations behind
the scenes.

- finally, with Lisp machines there at least used to be whole operating
systems written all the way down in Lisp, and according to all the testimony
you could interactively modify quite basic system behaviour. The only extant
systems that come even remotely close would be Smalltalks, but even Squeak,
which is incredibly malleable and a cross-platform mini-OS in its own right,
uses the host OS for many basic tasks.
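
A minimal sketch of the python-side class mutation mentioned above (names
made up for illustration); note it only swaps behaviour, whereas CL can also
migrate instance state:

    class Point:
        def __init__(self, x, y):
            self.x, self.y = x, y

    p = Point(1, 2)

    # Graft a new method onto the existing class object.  Instances created
    # *before* the change pick it up immediately, because attribute lookup
    # goes through the class at call time.
    def norm(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5

    Point.norm = norm
    print(p.norm())  # 2.2360679...
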
'as
Feb 21 '06 #69
Quoth Steven D'Aprano <st***@REMOVETHIScyber.com.au>:
....
| Nobody denies that Python code running with no optimization tricks is
| (currently) slower than compiled C code. That's a matter of objective
| fact. Nobody denies that Python can be easily run in interactive mode.
| Nobody denies that *at some level* Python code has to be interpreted.
|
| But ALL code is interpreted at some level or another. And it is equally
| true that at another level Python code is compiled. Why should one take
| precedence over the other?

I have no idea, what precedence? All I'm saying is that Python matches
what people think of as an interpreted language. You can deny it, but
but it's going to look like you're playing games with words, and to no
real end, since no one could possibly be deceived for very long. If you
give me a Python program, you have 3 choices: cross your fingers and
hope that I have the required Python interpreter version, slip in a
25Mb Python interpreter install and hope I won't notice, or come clean
and tell me that your program needs an interpreter and I should check to
see that I have it.

Donn Cave, do**@drizzle.com
Feb 21 '06 #70
Donn Cave wrote:
Quoth Steven D'Aprano <st***@REMOVETHIScyber.com.au>:
...
| Nobody denies that Python code running with no optimization tricks is
| (currently) slower than compiled C code. That's a matter of objective
| fact. Nobody denies that Python can be easily run in interactive mode.
| Nobody denies that *at some level* Python code has to be interpreted.
|
| But ALL code is interpreted at some level or another. And it is equally
| true that at another level Python code is compiled. Why should one take
| precedence over the other?

I have no idea, what precedence?
There seem to be two positions in this argument:

The "Python is interpreted and not compiled" camp, who
appear to my eyes to dismiss Python's compilation stage
as a meaningless technicality.

The "Python is both interpreted and compiled" camp, who
believe that both steps are equally important, and to
raise one over the other in importance is misleading.

All I'm saying is that Python matches
what people think of as an interpreted language.
Most people in IT I know of still think of
"interpreted" as meaning that every line of source code
is parsed repeatedly every time the code is executed.
Even when they intellectually know this isn't the case,
old habits die hard -- they still think of
"interpreted" as second class.

If you think that Python has to parse the line "print
x" one hundred times in "for i in range(100): print x"
then you are deeply, deeply mistaken.
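
A quick look with the standard dis module makes this visible; f(x) stands in
for the original "print x" so the snippet compiles on any Python version:

    import dis

    # The loop body is translated to bytecode a single time; each iteration
    # re-executes those already-compiled instructions and never re-parses
    # the source line.
    code = compile("for i in range(100): f(x)", "<example>", "exec")
    dis.dis(code)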

That's why Sun doesn't describe Java as interpreted,
but as byte-code compiled. They did that before they
had JIT compilers to compile to machine code.
Consequently nobody thinks of Java source having to be
parsed, and parsed, and parsed, and parsed again.

You can deny it, but
it's going to look like you're playing games with words, and to no
real end, since no one could possibly be deceived for very long.
Pot, meet kettle.

A simple question for you: does Python compile your
source code before executing it? If you need a hint,
perhaps you should reflect on what the "c" stands for
in .pyc files.
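
Here is that compilation step done explicitly, which is what CPython
otherwise does behind the scenes on import (the module name is hypothetical):

    import py_compile

    # Translate mymodule.py to bytecode and cache it on disk as mymodule.pyc.
    py_compile.compile("mymodule.py")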

If you
give me a Python program, you have 3 choices: cross your fingers and
hope that I have the required Python interpreter version, slip in a
25Mb Python interpreter install and hope I won't notice, or come clean
and tell me that your program needs an interpreter and I should check to
see that I have it.


Hey Donn, here is a compiled program for the PowerPC,
or an ARM processor, or one of IBM's Big Iron
mainframes. Or even a Commodore 64. What do you think
the chances are that you can execute it on your
x86-compatible PC? It's compiled, it should just
work!!! Right?

No of course not. If your CPU can't interpret the
machine code correctly, the fact that the code is
compiled makes NO difference at all.

In other words, I have three choices:

- cross my fingers and hope that you have the required
interpreter (CPU);

- slip in an interpreter install (perhaps an emulator)
and hope you won't notice;

- or come clean and tell you that my program needs an
interpreter ("Hey Donn, do you have a Mac you can run
this on?") and you should check to see that you have it.

--
Steven.

Feb 21 '06 #71
Steven D'Aprano wrote:
The "Python is both interpreted and compiled" camp, who
believe that both steps are equally important, and to
raise one over the other in importance is misleading.
That's why Sun doesn't describe Java as interpreted,
but as byte-code compiled. They did that before they
had JIT compilers to compile to machine code.
Consequently nobody thinks of Java source having to be
parsed, and parsed, and parsed, and parsed again.


They also described it that way to help marketing, and I don't think
that should be overlooked. They would have known full well that calling
their language "interpreted" would have affected public perceptions.

It's interesting to see how culture affects things. You talked of 'IT
people' in your post and hold up Java as an example of how byte-code
doesn't mean slow, the implication being that Python uses the same
mechanisms as Java and therefore is good enough. In the general IT
world, Java is quite popular (to make a bit of an understatement) and
it would often be used as some sort of benchmark.

On the other hand, the backgrounds I have familiarity with are computer
game development and embedded development. In these areas, we would
point to Java as evidence that 'interpreted' bytecode is too slow and
that anything using a similar technology is likely to be a problem.

I'm not saying you're wrong, just highlighting that comparisons
themselves always sit in some wider context which can make the
comparison unimportant.

I think it's also important to note that 'interpreted' doesn't
necessarily mean 'parsed repeatedly'. Many older machines which came
with BASIC installed would store their statements in a tokenised form -
arguably bytecode with a large instruction set, if you look at it a
certain way. This isn't necessarily so far from what Python does, yet
few people would argue that those old forms of BASIC weren't
interpreted.

--
Ben Sizer

Feb 21 '06 #72

Alexander Schmolck wrote:
"Kay Schluehr" <ka**********@gmx.net> writes:
Alexander's hypothesis is completely absurd.
You're currently not in the best position to make this claim, since you
evidently misunderstood what I wrote (I certainly did not mean to suggest that
Guido *deliberately* chose to make python slow; quite the opposite in fact).


Like everyone else. It's sometimes hard to extract the intended meaning, in
particular if it's opposed to the published one. I apologize if I
overreacted.

Maybe I wasn't sufficiently clear, so if rereading my original post doesn't
bring about enlightenment, I'll try a restatement.
It turned out over the years that the capabilities of Python optimization are
lower than those of Lisp and Smalltalk. But it's a system effect and
epiphenomenon of certain design decisions.


The point is that the design decisions, certainly for Common Lisp, Scheme and
particularly for Dylan, were also informed by what could be done
*efficiently*, because the people who designed these languages knew a lot
about advanced compiler implementation strategies for dynamic languages and
thought that they could achieve high levels of expressiveness whilst retaining
the possibility of very fast implementations (IIRC Dylan specifically was
meant to be something like within 90% of C performance). CL and Dylan were
also specifically designed for building very large and sophisticated systems,
whereas it seems Guido originally thought that python would scale to about 500
LOC.


O.K. To repeat it in an accurate manner: Python was originally designed
by Guido to be a scripting language for a new OS, as a more complete
version of a shell scripting language. Unlike those, its design was
strongly influenced by the usability ideas of the ABC development team.
Therefore speed considerations were not the primary concern, but rather
an open model that was easily extendable both on the C-API level and the
language level. So a VM architecture was chosen to achieve this. Adding
new opcodes should have been as simple as interfacing with the C-API.
After Python grew strongly in the late 90s, large scale projects emerged,
such as Zope, and many users started to request more Python performance
since they wanted to escape from the dual-language model. Writing
C code was not self-evident for a new generation of programmers who grew
up with Java, and the FFI turned out to be a hurdle. After remodeling the
object core ("new style classes"), progressive optimizations took hold.
In 2002 a new genius programmer entered the scene, namely Armin
Rigo, who came up with Psyco and launched the PyPy project together
with a few other Python hackers in order to aggressively optimize
Python using Python's introspective capabilities. That's where we still
are.

Remembering the historical context, we might draw some parallels to
other contexts and language design intentions. We might also figure out
parallels and differences between the motives of language designers and
the leading persons who drive language evolution. Python is not just
Guido, although his signature is quite pervasive. In his latest musings
he comes back to his central idea of language design as a kind of user
interface design. It's probably this shift in perspective that can be
attributed as original to him and which goes beyond making things just
"simple" or "powerful" or "efficient" (at least he made this shift
public and visible). It is also the most controversial aspect of the
language because it is still inseparable from technical decisions
(non-true closures, explicit self, the statement/expression distinction,
anonymous functions as expressions with limited abilities, etc.).

Kay

Feb 21 '06 #73
On 2/20/06, Donn Cave <do**@drizzle.com> wrote:
Quoth Steven D'Aprano <st***@REMOVETHIScyber.com.au>:
...
| Nobody denies that Python code running with no optimization tricks is
| (currently) slower than compiled C code. That's a matter of objective
| fact. Nobody denies that Python can be easily run in interactive mode.
| Nobody denies that *at some level* Python code has to be interpreted.
|
| But ALL code is interpreted at some level or another. And it is equally
| true that at another level Python code is compiled. Why should one take
| precedence over the other?

I have no idea, what precedence? All I'm saying is that Python matches
what people think of as an interpreted language. You can deny it, but
it's going to look like you're playing games with words, and to no
real end, since no one could possibly be deceived for very long. If you
give me a Python program, you have 3 choices: cross your fingers and
hope that I have the required Python interpreter version, slip in a
25Mb Python interpreter install and hope I won't notice, or come clean
and tell me that your program needs an interpreter and I should check to
see that I have it.

You're correct as far as it goes, but can you provide a reasonable
definition for "interpreted" that matches the common usage? Most
people can't.

When asked to name some interpreted (or scripting) languages, they'll
name some off - perl, python, ruby, javascript, basic...

They won't say Java. Ask them why Python is interpreted and Java isn't
and you'll have a hard time getting a decent technical answer, because
Python isn't all that different from Java in that regard, especially
pre-JIT versions of Java.

Probably the most accurate definition of "interpreted" as it is used
in the wild is "one of these languages: perl, python, ruby,
etc". That is, you're essentially claiming that Python is interpreted
because everyone thinks of it that way, technical correctness be
damned.

There is an obvious difference between Python and C. Nobody would deny
that. But it's a fairly hard thing to *quantify*, which is why people
make sloppy categorizations. That's not a problem as long as there
isn't prejudice associated with the categorization, which there is.

I wonder how "interpreted" people would think Python is if the
automagic compilation to .pyc was removed and you had to call
"pythonc" first.


Feb 21 '06 #74
Chris Mellon wrote:

You're correct as far as it goes, but can you provide a reasonable
definition for "interpreted" that matches the common usage? Most
people can't.
I thought Torsten's definition was good enough: if the instructions
typically produced when preparing your programs for execution can be
handled directly by the CPU then let's call it a "compiled language";
otherwise, let's call it an "interpreted language". I think we all know
about the subtleties of different levels of virtual machines, but if
you want an arbitrary definition that lots of people feel is intuitive
then that's the one to go for.
When asked to name some interpreted (or scripting) languages, they'll
name some off - perl, python, ruby, javascript, basic...
Right: compiled Perl and Python instructions typically aren't executed
directly by the hardware; Ruby visits the parse tree when executing
programs (see [1] for some casual usage of "interpreted" and "compiled"
terms in this context), although other virtual machines exist [2];
JavaScript varies substantially, but I'd imagine that a lot of the
implementations also do some kind of parse tree walking (or that the
developers don't feel like documenting their bytecodes), although you
can also compile JavaScript to Java class files [3]; BASIC varies too
much for any kind of useful summary here, but I'd imagine that early
implementations have tainted the language's "compiled" reputation
substantially.
They won't say Java. Ask them why Python is interpreted and Java isn't
and you'll have a hard time getting a decent technical answer, because
Python isn't all that different from Java in that regard, especially
pre-JIT versions of Java.
That's why I put Java and Python in the same category elsewhere in this
thread. Bear in mind, though, that Java's just-in-time compilation
features were hyped extensively, and I imagine that many or most
implementations have some kind of native code generation support,
either just-in-time or ahead-of-time.
Probably the most accurate definition of "interpreted" as it is used
in the wild is "one of these languages: perl, python, ruby,
etc". That is, you're essentially claiming that Python is interpreted
because everyone thinks of it that way, technical correctness be
damned.
Well, I think Torsten's definition was more objective and yet arrives
at the same result. Whether we're happy with that result, I have my
doubts. ;-)
There is an obvious difference between Python and C. Nobody would deny
that. But it's a fairly hard thing to *quantify*, which is why people
make sloppy categorizations. That's not a problem as long as there
isn't prejudice associated with the categorization, which there is.
I refer you again to Torsten's definition.
I wonder how "interpreted" people would think Python is if the
automagic compilation to .pyc was removed and you had to call
"pythonc" first.


Well, such things might have a psychological impact, but consider
removing Python's interactive mode in order to enhance Python's
non-interpreted reputation, and then consider Perl (an interpreted
language according to the now-overly-referenced definition) which
doesn't have an interactive mode (according to [4] - I don't keep up
with Perl matters, myself), but which allows expression evaluation at
run-time. No-one would put Perl together with C in a compiled vs.
interpreted categorisation. Removing the automatic compilation support
might strengthen the compiled feel of both languages further, but
with knowledge of the technologies employed, both languages (at least
in their mainstream forms) are still on the other side of the fence
from C.

Paul

[1] http://www.rubygarden.org/faq/entry/show/126
[2] http://www.atdot.net/yarv/
[3] http://www.mozilla.org/rhino/doc.html
[4] http://dev.perl.org/perl6/rfc/184.html

Feb 21 '06 #75
On Tue, 21 Feb 2006 08:36:50 -0600 in comp.lang.python, "Chris Mellon"
<ar*****@gmail.com> wrote:

[...]

When asked to name some interpreted (or scripting) languages, they'll
name some off - perl, python, ruby, javascript, basic...

They won't say Java. Ask them why Python is interpreted and Java isn't
and you'll have a hard time getting a decent technical answer, because
Python isn't all that different from Java in that regard, especially
pre-JIT versions of Java.
IMHO, it's marketing. Soon after (as soon as?) Sun introduced Java,
they announced microprocessors that would implement the JVM natively.
Thus on those machines, Java would not be "interpreted."

AIUI, the reason native Java chips never took off is 1) limited
utility (who wants a chip that can only run Java programs?), and 2)
performance on native chips wasn't even better than JVMs running on
commodity microprocessors, so what's the point?

Probably the most accurate definition of "interpreted" as it is used
in the wild is "one of these languages: perl, python, ruby,
etc". That is, you're essentially claiming that Python is interpreted
because everyone thinks of it that way, technical correctness be
damned.


I think another reason "perl, Python etc." are known to be interpreted
and Java is not is the interactivity afforded by the former group. This
is also why, e.g., Lisp and Forth are thought of as interpreted (at
least by those with only a passing familiarity with the languages),
though native compilers for both languages are readily available.

Regards,
-=Dave

--
Change is inevitable, progress is not.
Feb 21 '06 #76
D H
Donn Cave wrote:
I can say "Python can serve as a scripting language for some applications",
but not "Python is a scripting language!"
bruno at modulix wrote: as soon as you say "interpreted, scripting", people
think "not serious".
Cameron Laird wrote: I *think* you're proposing that,
were Guido more knowledgeable, he would have created a Python
language that's roughly as we know now, implemented it with
FASTER software ... and "to its own detriment".
Fredrik Lundh wrote: define "scripting language".

the only even remotely formal definition I've ever seen is "language
designed to script an existing application, with limited support for
handling its own state". Early Tcl and JavaScript are scripting languages,
Python is not.
Kay Schluehr wrote: Yes, it's Guido's master plan to lock programmers into a
slow language in order to dominate them for decades.
Donn Cave wrote: All I'm saying is that Python matches
what people think of as an interpreted language. You can deny it, but
it's going to look like you're playing games with words, and to no
real end, since no one could possibly be deceived for very long.
Steven D'Aprano wrote: Describing C (or Lisp) as "compiled" and Python as "interpreted" is to
paint with an extremely broad brush, both ignoring what actually
happens in fact, and giving a false impression about Python. It is
absolutely true to say that Python does not compile to machine code.
(At least not yet.) But it is also absolutely true that Python is
compiled. Why emphasise the interpreter, and therefore Python's
similarity to bash, rather than the compiler and Python's similarity
to (say) Java or Lisp?
Paul Boddie wrote: Yes, I think that with optional static typing, it's quite
likely that we would see lots of unnecessary declarations and less reusable
code ("ints everywhere, everyone!"), so I think the point about not providing
people with certain features is a very interesting one, since people have had
to make additional and not insignificant effort to optimise for speed. One
potential benefit is that should better tools than optional static typing be
considered and evaluated, the "ints everywhere!" line of thinking could prove
to be something of a dead end in all but the most specialised applications.
Consequently, the Python platform could end up better off, providing superior
tools for optimising performance whilst not compromising the feel of the
language and environment.
Torsten Bronger wrote: By the way, this is my main concern about optional static typing: It
may change the target group, i.e. it may move Python closer to those
applications where speed really matters, which again would have an
effect on what will be considered Pythonic.
Steven D'Aprano wrote: The "Python is both interpreted and compiled" camp, who
believe that both steps are equally important, and to
raise one over the other in importance is misleading.
That's why Sun doesn't describe Java as interpreted,
but as byte-code compiled. They did that before they
had JIT compilers to compile to machine code.
Bruno Desthuilliers wrote: It's not a "scripting" language, and it's not interpreted.

It will all be sorted out once and for all in Python 3000: The Reckoning
Feb 21 '06 #77
In article <43**************@REMOVEMEcyber.com.au>,
Steven D'Aprano <st***@REMOVEMEcyber.com.au> wrote:
....
Hey Donn, here is a compiled program for the PowerPC,
or an ARM processor, or one of IBM's Big Iron
mainframes. Or even a Commodore 64. What do you think
the chances are that you can execute it on your
x86-compatible PC? It's compiled, it should just
work!!! Right?

No of course not. If your CPU can't interpret the
machine code correctly, the fact that the code is
compiled makes NO difference at all.

In other words, I have three choices:

- cross my fingers and hope that you have the required
interpreter (CPU);

- slip in an interpreter install (perhaps an emulator)
and hope you won't notice;

- or come clean and tell you that my program needs an
interpreter ("Hey Donn, do you have a Mac you can run
this on?") and you should check to see that you have it.


Sure, all this is true, except for the term "interpreter."
You would surely not use the word that way, unless you
just didn't want to communicate.

Your paragraph above that starts with "No of course not",
even omits a point that everyone understands, you can in
fact expect a .py file will work independent of machine
architecture - like any interpreted language. We all know
what native code compilation buys you and what it doesn't.

Donn Cave, do**@u.washington.edu
Feb 21 '06 #78
On 21 Feb 2006 08:30:04 -0800, Paul Boddie <pa**@boddie.org.uk> wrote:
Chris Mellon wrote:

You're correct as far as it goes, but can you provide a reasonable
definition for "interpreted" that matches the common usage? Most
people can't.
I thought Torsten's definition was good enough: if the instructions
typically produced when preparing your programs for execution can be
handled directly by the CPU then let's call it a "compiled language";
otherwise, let's call it an "interpreted language". I think we all know
about the subtleties of different levels of virtual machines, but if
you want an arbitrary definition that lots of people feel is intuitive
then that's the one to go for.
When asked to name some interpreted (or scripting) languages, they'll
name some off - perl, python, ruby, javascript, basic...


Right: compiled Perl and Python instructions typically aren't executed
directly by the hardware; Ruby visits the parse tree when executing
programs (see [1] for some casual usage of "interpreted" and "compiled"
terms in this context), although other virtual machines exist [2];
JavaScript varies substantially, but I'd imagine that a lot of the
implementations also do some kind of parse tree walking (or that the
developers don't feel like documenting their bytecodes), although you
can also compile JavaScript to Java class files [3]; BASIC varies too
much for any kind of useful summary here, but I'd imagine that early
implementations have tainted the language's "compiled" reputation
substantially.
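
(To make the Python part of that concrete, here is a minimal sketch
using the standard dis module -- the exact opcodes shown will vary
between CPython versions:)

    import dis

    def add(x, y):
        return x + y

    # Disassemble the compiled function: what you see is bytecode for
    # the CPython virtual machine, not instructions any CPU executes
    # directly.
    dis.dis(add)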
They won't say Java. Ask them why Python is interpreted and Java isn't
and you'll have a hard time getting a decent technical answer, because
Python isn't all that different from Java in that regard, especially
pre-JIT versions of Java.


That's why I put Java and Python in the same category elsewhere in this
thread. Bear in mind, though, that Java's just-in-time compilation
features were hyped extensively, and I imagine that many or most
implementations have some kind of native code generation support,
either just-in-time or ahead-of-time.


Early Java versions did not, and many versions still don't, at least
in any meaningful way. There are ways of compiling "native" Java, but
they work more like py2exe than GCC. "Native code generation" is a
fairly imprecise term in and of itself - Psyco works almost exactly
the same way as Java JIT does.
Probably the most accurate definition of "interpreted" as it is used
in the wild is "one of these languages: perl, python, perl, ruby,
etc". That is, you're essentially claiming that Python is interpreted
because everyone thinks of it that way, technical correctness be
damned.


Well, I think Torsten's definition was more objective and yet arrives
at the same result. Whether we're happy with that result, I have my
doubts. ;-)


I don't think it does, though. Firstly, as a definition it relies on
the environment the application will be running under and therefore
can't be considered to describe just a language. Secondly, by that
definition Java is an interpreted language which is at odds with the
common definition.

I've encountered a C scripting environment that works by using GCC to
compile each line as it is encountered, doing some magic to keep a
working compilation environment around.

Interpreted? Compiled?
There is an obvious difference between Python and C. Nobody would deny
that. But it's a fairly hard thing to *quantify*, which is why people
make sloppy categorizations. That's not a problem as long as there
isn't prejudice associated with the categorization, which there is.


I refer you again to Torsten's definition.


Torsten's definition isn't useful for quantifying a difference between
interpreted and compiled - it's a rough sort of feel-test. It's like
how much of a naked body you can expose before it changes from
art to pornography - it's not something that is easily quantified.
I wonder how "interpreted" people would think Python is if the
automagic compilation to .pyc was removed and you had to call
"pythonc" first.


Well, such things might have a psychological impact, but consider
removing Python's interactive mode in order to enhance Python's
non-interpreted reputation, and then consider Perl (an interpreted
language according to the now-overly-referenced definition) which
doesn't have an interactive mode (according to [4] - I don't keep up
with Perl matters, myself), but which allows expression evaluation at
run-time. No-one would put Perl together with C in a compiled vs.
interpreted categorisation. Removing the automatic compilation support
might strengthen the compiled feel of both languages further, but
with knowledge of the technologies employed, both languages (at least
in their mainstream forms) are still on the other side of the fence
from C.
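
(Worth noting in passing: the explicit "pythonc" step Chris imagines is
more or less already available in the standard library. A minimal
sketch, where "myscript.py" is just a placeholder file name:)

    import py_compile

    # Compile a source file to a .pyc without importing or running it.
    py_compile.compile("myscript.py")

    # The same thing for a whole directory tree, from the shell:
    #     python -m compileall .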

Paul

[1] http://www.rubygarden.org/faq/entry/show/126
[2] http://www.atdot.net/yarv/
[3] http://www.mozilla.org/rhino/doc.html
[4] http://dev.perl.org/perl6/rfc/184.html

--
http://mail.python.org/mailman/listinfo/python-list

Feb 21 '06 #79
In article <ma***************************************@python. org>,
"Chris Mellon" <ar*****@gmail.com> wrote:
....
They won't say Java. Ask them why Python is interpreted and Java isn't
and you'll have a hard time getting a decent technical answer, because
Python isn't all that different from Java in that regard, especially
pre-JIT versions of Java.


For me that would be partly because I don't know that
much about Java, honestly. Just searching at random
for something about the subject, I came across this -
http://www-128.ibm.com/developerwork...ive.html?loc=j
- which seems like it might be of some interest here.

My impression from reading this is that Java actually
can be compiled to native code, though in 2002 this
was relatively new.

Donn Cave, do**@u.washington.edu
Feb 21 '06 #80
Chris Mellon wrote:
[...]
Torsten's definition isn't useful for quantifying a difference between
interpreted and compiled - it's a rough sort of feel-test. It's like
how much of a naked body you can expose before it changes from
art to pornography - it's not something that is easily quantified.

[...]

Possibly, but if your aim is exposing as much flesh as possible without
being labeled pornography I think I'd conclude you were in the
pornography business from the start, albeit masquerading as an "art dealer".

regards
Steve
--
Steve Holden +44 150 684 7255 +1 800 494 3119
Holden Web LLC www.holdenweb.com
PyCon TX 2006 www.python.org/pycon/

Feb 21 '06 #81
> As they say, case is the difference between "I helped my
Uncle Jack off a horse" and "I helped my uncle jack off a horse."


Hahaha!... never heard of that though

Feb 21 '06 #82
On Tue, 21 Feb 2006 09:46:27 -0800, Donn Cave wrote:
In article <43**************@REMOVEMEcyber.com.au>,
Steven D'Aprano <st***@REMOVEMEcyber.com.au> wrote:
...
Hey Donn, here is a compiled program for the PowerPC,
or an ARM processor, or one of IBM's Big Iron
mainframes. Or even a Commodore 64. What do you think
the chances are that you can execute it on your
x86-compatible PC? It's compiled, it should just
work!!! Right?

No of course not. If your CPU can't interpret the
machine code correctly, the fact that the code is
compiled makes NO difference at all.

[snip for brevity]
Sure, all this is true, except for the term "interpreter."
You would surely not use the word that way, unless you
just didn't want to communicate.
Do you honestly believe that the CPU doesn't have to interpret the machine
code, or are you just deliberately playing silly buggers with language?

In modern CPUs, there is an intermediate layer of micro-code between the
machine code your C compiler generates and the actual instructions
executed in hardware. But even if we limit ourselves to obsolete hardware
without micro-code, I ask you think about what an interpreter does, and
what the CPU does, in the most general way possible.

Both take a stream of instructions. Both have to take each instruction,
and execute it. In both cases the link between the instruction and the
effect is indirect: for example, the machine code 00000101 on the
Zilog Z80 processor causes the CPU to decrement the B processor register.
In assembly language this would be written as DEC B. There is absolutely
nothing fundamental about the byte value 5 that inherently means
"decrement B processor register".

In other words, machine language is a language, just like it says, and
like all languages, it must be interpreted.
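
The point is easy to demonstrate in a few lines of Python. Here is a
toy fetch-decode-execute loop for an imaginary one-register machine --
not a real Z80 emulator -- where byte 5 means "decrement B" only
because the dispatch code says so:

    def run(code):
        # Our entire "CPU": one register and a program counter.
        regs = {"B": 10}
        pc = 0
        while pc < len(code):
            op = code[pc]
            if op == 0x05:      # DEC B -- purely by convention
                regs["B"] -= 1
            elif op == 0x00:    # NOP
                pass
            else:
                raise ValueError("illegal opcode: %r" % op)
            pc += 1
        return regs

    print run([0x05, 0x05, 0x00])   # -> {'B': 8}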
Your paragraph above that starts with "No of course not",
even omits a point that everyone understands, you can in
fact expect a .py file will work independent of machine
architecture - like any interpreted language.
Amazing. In your previous post you were telling everybody how the
*disadvantage* of interpreted programs is that they won't run unless the
interpreter is present, and in this post you are telling us that
interpreted languages will just work. What happened to the requirement for
an interpreter?

Let's see you run that Python program on a Zilog Z80 without a Python
interpreter. Can't be done. No interpreter, whether in hardware or
software, and the program won't run, whether in source code or byte code
or machine code.

If I allow that the machines have an interpreter, perhaps you'll return
the favour and install an interpreter for machine language (often called
an emulator). Now your compiled C or Lisp code also will run independent
of machine architecture.

In order to force "interpreted language" and "compiled language" into two
distinct categories, rather than just two overlapping extremes of a single
unified category, you have to ignore reality. You ignore interpreted
languages that are compiled, you ignore the reality of how machine code is
used in the CPU, you ignore the existence of emulators, and you ignore
virtual machines.

We all know
what native code compilation buys you and what it doesn't.


Did you fail to learn *anything* from my parable of interpreted Lisp on a
Macintosh II running faster than compiled Lisp running on a Mac II fitted
with a Lisp processor?
--
Steven

Feb 21 '06 #83
Chris Mellon wrote:
[snip]
I don't think it does, though. Firstly, as a definition it relies on
the environment the application will be running under and therefore
can't be considered to describe just a language. Secondly, by that
definition Java is an interpreted language which is at odds with the
common definition.

I've encountered a C scripting environment that works by using GCC to
compile each line as it is encountered, doing some magic to keep a
working compilation environment around.

Interpreted? Compiled?


There is also the wonderful C interpreter cint:

http://root.cern.ch/root/Cint.html

so obviously C must be an interpreted language :-)

Cheers,

Carl Friedrich Bolz

Feb 22 '06 #84
Torsten Bronger wrote:

My definition would be that an interpreted language has in its
typical implementation an interpreting layer necessary for typical
hardware. Of course, now we could discuss what is "typical",
however, in practice one would know it, I think. In case of Python:
CPython and all important modern processors.


In a previous century, I used something called UCSD Pascal, which at the
time was a typical implementation of Pascal. It ran on (amongst other
things) an Apple ][, which at the time was typical hardware. It worked
by compiling Pascal source to bytecode (called p-code), and interpreting
the p-code. So, in practice, one would know that Pascal was an
interpreted language.

Later on, I used a typical implementation called VAX Pascal: a compiler
reduced Pascal source to VAX object code. In practice, Pascal was not an
interpreted language. Of course, more than one of the VAXen we had did
not implement the entire VAX instruction set, and some instructions were
emulated, or interpreted, if you will, by other VAX instructions. So, in
practice, some of the Pascal was interpreted.

And, as someone in this thread has pointed out, it is likely that your
important modern (x86) processor is not natively executing your x86
code, and indeed meets your definition of having "in its typical
implementation an interpreting layer necessary for typical hardware".

Another example: is Java the bytecode, which is compiled from Java the
language, interpreted or not? Even when the HotSpot JIT cuts in? Or when
a native Java processor is used? Or when your Java program is compiled
with GCJ (if GCJ does what I think it does)? Does this make Java an
interpreted language or not?

Personally, in practice I don't care, so don't ask me. Ponder on getting
angels to dance on the head of a pin before you worry about whether the
dance can be interpreted or not.

PJDM
Feb 22 '06 #85
Quoth Steven D'Aprano <st***@REMOVETHIScyber.com.au>:
....
| Do you honestly believe that the CPU doesn't have to interpret the machine
| code, or are you just deliberately playing silly buggers with language?

I don't care whether the CPU has to interpret machine code. Are
you suggesting that we might in normal conversation wish to use
the term interpreter to mean CPU, like "what kind of interpreter
does your computer have?", that kind of thing?

| > Your paragraph above that starts with "No of course not",
| > even omits a point that everyone understands, you can in
| > fact expect a .py file will work independent of machine
| > architecture - like any interpreted language.
|
| Amazing. In your previous post you were telling everybody how the
| *disadvantage* of interpreted programs is that they won't run unless the
| interpreter is present, and in this post you are telling us that
| interpreted languages will just work. What happened to the requirement for
| an interpreter?

Look, this is my last post on this matter, because you have evidently
reached a point where every statement has to be spelled out in excruciating
detail to avoid absurd interpretations. "will work independent of machine
architecture" does not declare that it is absolutely guaranteed to work -
after all, it may have some other flaw that will prevent it from working
anywhere. It just says that if it doesn't work, it isn't because it
tried to execute on the wrong machine architecture - the file is machine
architecture independent. You know that, you know I know that. What
is the fucking problem?

| In order to force "interpreted language" and "compiled language" into two
| distinct categories, rather than just two overlapping extremes of a single
| unified category, you have to ignore reality. You ignore interpreted
| languages that are compiled, you ignore the reality of how machine code is
| used in the CPU, you ignore the existence of emulators, and you ignore
| virtual machines.

Anyone with an interest in computer programming is likely to know what
microcode means, that there are emulators, virtual machines, etc. You
might find the UCSD Pascal system interesting, to harken back to the
early days of my experience with computers, a fascinating twist on the
interpreted/compiled story. Interesting as perspective, but it wouldn't
change the way we apply these words to Python.

Donn Cave, do**@drizzle.com
Feb 22 '06 #86
Hallöchen!

Peter Mayne <Pe*********@hp.com> writes:
Torsten Bronger wrote:
My definition would be that an interpreted language has in its
typical implementation an interpreting layer necessary for typical
hardware. Of course, now we could discuss what is "typical",
however, in practice one would know it, I think. In case of Python:
CPython and all important modern processors.
In a previous century, I used something called UCSD Pascal, which at
the time was a typical implementation of Pascal.


Not "a" typical implementation but "its".
[...]

And, as someone in this thread has pointed out, it is likely that
your important modern (x86) processor is not natively executing
your x86 code, and indeed meets your definition of having "in its
typical implementation an interpreting layer necessary for typical
hardware".
Only if you deliberately misunderstand me.
Another example: is Java the bytecode, which is compiled from Java
the language, interpreted or not? Even when the HotSpot JIT cuts
in?
It is partly interpreted and partly compiled. That's why it's
faster than Python.
[...]

Personally, in practice I don't care, so don't ask me. Ponder on
getting angels to dance on the head of a pin before you worry
about whether the dance can be interpreted or not.


I agree that the term "interpreted" is bad Python advocacy because
its implications are often misunderstood. However, I think that
it's fair to make a distinction between compiled and interpreted
languages because it may affect one's decision for one or the other.
Although I'm surely not ingenious, I can make this distinction.

The reason why Python is slower than C is because there is an
interpreting layer that C doesn't have. And the reason for this is
that Python's nature is incompatible with today's CPUs (which was a
deliberate and advantageous design decision). I'm sure that a
willing CS person could define this more clearly.

Anyway, if we drop "scripting" and drop "interpreted", what do you
want to tell people asking why Python is so slow? Because it is
dynamic? ;-)

Tschö,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus ICQ 264-296-646
Feb 22 '06 #87
On Wed, 22 Feb 2006 10:15:21 +0100, Torsten Bronger wrote:
And, as someone in this thread has pointed out, it is likely that
your important modern (x86) processor is not natively executing
your x86 code, and indeed meets your definition of having "in its
typical implementation an interpreting layer necessary for typical
hardware".
Only if you deliberately misunderstand me.


If the words you choose to use have implications which you failed to
realise before saying them, don't blame the reader for spotting those
implications.
Another example: is Java the bytecode, which is compiled from Java the
language, interpreted or not? Even when the HotSpot JIT cuts in?


It is partly interpreted and partly compiled. That's why it's faster
than Python.


But Python is partly interpreted and partly compiled too, so that can't be
the answer.

I think we all know what the answer is. The Python interpreter isn't as
fast as the Java interpreter, or most machine code interpreters built into
hardware.

On the other hand, I'm pretty sure that interpreted Python runs faster on
my current PC than compiled code runs on the 20 year old Macintosh in my
cupboard. So "compiled" isn't a magic wand that makes code run faster.
[...]

Personally, in practice I don't care, so don't ask me. Ponder on
getting angels to dance on the head of a pin before you worry about
whether the dance can be interpreted or not.


I agree that the term "interpreted" is bad Python advocacy because its
implications are often misunderstood. However, I think that it's fair
to make a distinction between compiled and interpreted languages because
it may affect one's decision for one or the other. Although I'm surely
not ingenious, I can make this distinction.


Would you rather use a blindingly fast interpreted language, or a
slow-as-continental drift compiled one?

This isn't a rhetorical question. In the mid-80s, Apple and Texas
Instruments collaborated on a Macintosh II computer with a Lisp
coprocessor. The problem was, according to benchmarks at the time, Lisp
compiled and run natively on the coprocessor was actually slower than Lisp
interpreted on a standard Macintosh II.

I'm sure that's hardly the only example of a speedy interpreted language
beating a glacial compiled one.

The reason why Python is slower than C is because there is an
interpreting layer that C doesn't have.
The primary reason Python is slower than C is because C compilers have
been optimized to create fast code, while Python has been created to
optimize programmer productivity instead. That means that a simple
instruction like x + y does a lot more work in Python than it does in C.
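
(A rough sketch of what "x + y" obliges the interpreter to do --
simplified from the real C-level protocol, which also handles
subclasses, coercion and error cases:)

    def simulated_add(x, y):
        # Look up the operation on the *type* at run time -- nothing
        # about x or y is known until the moment of the call.
        result = type(x).__add__(x, y)
        if result is NotImplemented:
            # Fall back to the reflected method on the right operand.
            result = type(y).__radd__(y, x)
        return result

    print simulated_add(3, 4)       # -> 7
    print simulated_add("a", "b")   # -> 'ab'

In C, by contrast, the compiler typically resolves the whole thing to a
single machine instruction at compile time.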

There are other languages that, like Python, are dynamic, interpreted,
interactive and the rest, and they execute faster than Python. (By the
same token, there are also some that execute slower than Python.) Let's be
honest here: it isn't that Python can't be as fast as C, it is that the
Python Dev team have had other priorities.

But over time, as PyPy, Psyco, and other technologies bear fruit, Python
will speed up, even though it will remain interpreted.

And the reason for this is that
Python's nature is incompatible with today's CPUs (which was a
deliberate and advantageous design decision). I'm sure that a willing
CS person could define this more clearly.

Anyway, if we drop "scripting" and drop "interpreted", what do you want
to tell people asking why Python is so slow? Because it is dynamic? ;-)


Who says Python is so slow? I've just got Python to count from 0 up to
100,000, and it only took 7 milliseconds. That's at least 12 milliseconds
faster than I can count on my fingers.
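
(For anyone who wants to repeat the measurement, a quick sketch with
the standard timeit module -- absolute numbers will of course depend on
your machine:)

    import timeit

    # Time a bare loop over 100,000 values, best of three runs.
    t = timeit.Timer("for i in xrange(100000): pass")
    print min(t.repeat(repeat=3, number=1)), "seconds"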

But seriously... why not tell them the truth? Python is slower than some
other languages because optimization for execution speed has not been the
primary focus of Python's development. If you tell them that Python is
slow because it is interpreted, they will believe that Python will always
be slow. If you tell them that Python is slow because speed has not been
the priority, they will believe that some day it will become the priority,
and then Python will get faster. And they will be right. That is the aim
of PyPy after all.


--
Steven.

Feb 22 '06 #88

Steven D'Aprano wrote:
But over time, as PyPy, Psyco, and other technologies bear fruit, Python
will speed up, even though it will remain interpreted.


I talked to Richard Emslie recently and he told me that the PyPy team
works on a mechanism to create CPython-extension modules written in
RPython, i.e. a statically translatable subset of Python. So even
without dynamic code specialization there will be an optimization path
based on the PyPy toolchain that is amazing.

Kay

Feb 22 '06 #89
"Kay Schluehr" <ka**********@gmx.net> writes:
I talked to Richard Emslie recently and he told me that the PyPy team
works on a mechanism to create CPython-extension modules written in
RPython, i.e. a statically translatable subset of Python. So even
without dynamic code specialization there will be an optimization path
based on the PyPy toolchain that is amazing.


Sounds great but is that a whole lot different from pyrex?
Feb 22 '06 #90

Paul Rubin wrote:
"Kay Schluehr" <ka**********@gmx.net> writes:
I talked to Richard Emslie recently and he told me that the PyPy team
works on a mechanism to create CPython-extension modules written in
RPython, i.e. a statically translatable subset of Python. So even
without dynamic code specialization there will be an optimization path
based on the PyPy toolchain that is amazing.


Sounds great but is that a whole lot different from pyrex?


RPython is Python code, not a different language. In a sense, RPython
consists of a set of rules that usual Python has to conform to in order
to make complete type-inference feasible. Here is an overview of those
rules.

http://codespeak.net/pypy/dist/pypy/...tricted-python

Kay

Feb 22 '06 #91
Hallöchen!

Steven D'Aprano <st***@REMOVETHIScyber.com.au> writes:
On Wed, 22 Feb 2006 10:15:21 +0100, Torsten Bronger wrote:
And, as someone in this thread has pointed out, it is likely
that your important modern (x86) processor is not natively
executing your x86 code, and indeed meets your definition of
having "in its typical implementation an interpreting layer
necessary for typical hardware".
Only if you deliberately misunderstand me.


If the words you choose to use have implications which you failed
to realise before saying them, don't blame the reader for spotting
those implications.


To me it sounds like "Some hardware is harder than other".
Another example: is Java the bytecode, which is compiled from
Java the language, interpreted or not? Even when the HotSpot JIT
cuts in?


It is partly interpreted and partly compiled. That's why it's
faster than Python.


But Python is partly interpreted and partly compiled too


It's byte-compiled for a VM, that's not the same, and you know it.
Sorry but I think we've exchanged all arguments that are important.
Any further comment from me would be redundant, so I leave it.
[...]
[...] However, I think that it's fair to make a distinction
between compiled and interpreted languages because it may affect
one's decision for one or the other. [...]
Would you rather use a blindingly fast interpreted language, or a
slow-as-continental drift compiled one?

This isn't a rhetorical question. [example]

I'm sure that's hardly the only example of a speedy interpreted
language beating a glacial compiled one.


I agree that the distinction between interpreted and compiled
languages is not as clear as between positive and negative numbers;
however, nobody has claimed that so far, nor is it necessary. It must
be *practical*, i.e. a useful rule of thumb for decision making. If
you really know all implications (pros and cons) of interpreted
languages, it's a very useful rule in my opinion.
[...]

But seriously... why not tell them the truth? Python is slower
than some other languages because optimization for execution speed
has not been the primary focus of Python's development. If you
tell them that Python is slow because it is interpreted, they will
believe that Python will always be slow.


I don't think that "Python's developers don't focus primarily on
speed" sounds better than "Python is interpreted". Both suggests
that you must circumvent problems with Python's execution speed (we
all know that this works and how) because you can't count on
language improvements. Even worse, evading "interpreted" may sound
like an euphemism and as if you want to hide unloved implementation
features, depending on your audience.

Tschö,
Torsten.

--
Torsten Bronger, aquisgrana, europa vetus ICQ 264-296-646
Feb 22 '06 #92
Kay Schluehr wrote:
Paul Rubin wrote:
"Kay Schluehr" <ka**********@gmx.net> writes:
I talked to Richard Emslie recently and he told me that the PyPy team
works on a mechanism to create CPython-extension modules written in
RPython, i.e. a statically translatable subset of Python.
Sounds great but is that a whole lot different from pyrex?


I've wondered that as well.
RPython is Python code, not a different language. In a sense, RPython
consists of a set of rules that usual Python has to conform to in order
to make complete type-inference feasible. Here is an overview of those
rules.

http://codespeak.net/pypy/dist/pypy/...tricted-python


But does that make it "proper Python"? Having, for example, only one
type associated with a name (they use the term "variable", though) at
any given time makes it more like various statically typed or
functional languages, although I can fully understand why you'd want
this restriction.
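
A trivial illustration of that restriction, as I read the page:

    # Ordinary Python is happy to rebind a name to a different type,
    # but (as I understand the rules) this would not be accepted as
    # RPython, since "x" cannot be given a single inferred type:
    x = 3
    x = "three"

    # The RPython-friendly spelling keeps one type per name:
    n = 3
    s = "three"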

Paul

Feb 22 '06 #94
Paul Rubin wrote:
"Kay Schluehr" <ka**********@gmx.net> writes:
I talked to Richard Emslie recently and he told me that the PyPy team
works on a mechanism to create CPython-extension modules written in
RPython, i.e. a statically translatable subset of Python. So even
without dynamic code specialization there will be an optimization path
based on the PyPy toolchain that is amazing.

Well. "... the PyPy team works on ..." is definitively much too strong.
It is more like "... the PyPy team is thinking about ...". It is very
unclear whether it will work on a technical level and whether the EU
will allow us to allocate resources accordingly.
Sounds great but is that a whole lot different from pyrex?


Indeed, there are similarities to pyrex. Of course in pyrex you have to
give the types yourself, but since the type inference engine of PyPy can
sometimes be hard to understand, this is maybe not the worst trade-off. A
nice advantage of the PyPy approach would be that you can test your
RPython code by running it on top of CPython until it works and only
then translating it into C. Plus it would be possible to use the same
extension module for PyPy, CPython and potentially even Stackless or
Jython (if somebody writes a Java backend).

But as I said, this is all pretty unclear at the moment (and getting
really quite off-topic for this thread).

Cheers,

Carl Friedrich

Feb 22 '06 #95
Carl Friedrich Bolz:
Indeed, there are similarities to pyrex. Of course in pyrex you have to
give the types yourself, but since the type inference engine of PyPy can
sometimes be hard to understand this is maybe not the worst trade-off.
A nice advantage of the PyPy approach would be that you can test your
RPython code by running it on top of CPython until it works and only
then translating it into C. [...]
But as I said, this is all pretty unclear at the moment


Maybe PyPy can be used to allow better linking between interpreted
Python code and code compiled by ShedSkin.
SS contains a good type inferencer that might also be useful for
compiling RPython; aren't the PyPy people interested in SS and its
capabilities?

Bye,
bearophile

Feb 22 '06 #96
I replied to this message yesterday, but it did not appear, so let's
try again.

I agree with your points, but I would not say that Lisp is
intrinsically more dynamic than Python as a language; it is just more
interactive and has more features (and more complexities too).
BTW, regarding your first point on interactive sessions, are you aware
of Michael Hudson's recipe "automatically upgrade class instances on
reload()"?
http://aspn.activestate.com/ASPN/Coo.../Recipe/160164
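
The heart of the trick, in miniature -- this is my own toy restatement,
not Hudson's actual recipe, and "mymodule"/"Thing" are made-up names:

    import mymodule                  # hypothetical module under development

    obj = mymodule.Thing()
    # ... edit mymodule.py, then:
    reload(mymodule)                 # re-executes the module's code
    obj.__class__ = mymodule.Thing   # old instance now sees the new methods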

Michele Simionato

Feb 22 '06 #97
Steven D'Aprano wrote:
Who says Python is so slow? I've just got Python to count from 0 up to
100,000, and it only took 7 milliseconds. That's at least 12 milliseconds
faster than I can count on my fingers.


+1 QOTW

Feb 22 '06 #98
"Michele Simionato" <mi***************@gmail.com> writes:
I replied to this message yesterday, but it did not appear, so let's
try again.

I agree with your points, but I would not say that Lisp is
intrinsically more dynamic than Python as a language;
Neither would I -- I don't think either is obviously more dynamic than the
other.

But since it has been implied that python's comparatively poor performance is
simply due to it being more dynamic than other languages, I wanted to point
out that one could with just as much justification claim CL to be more dynamic
than python (it is in some regards, but not in others -- how to weight them to
achieve some overall "score" is not obvious). I really doubt that python will
ever come remotely close to the level of dynamism that the now-defunct Lisp
machine technology achieved, though.
it is just more interactive and has more features (and more complexities
too).
Indeed -- CL is much more complex than python and has many, many more warts.

As for common lisp being "just more interactive" -- all things being equal I
fail to see how "more interactive" cannot imply more dynamic -- IMO it doesn't
get much more dynamic than changing and inspecting things interactively. Also
not all the ways in which CL is more dynamic represent features that increase
complexity. For example in CL you could just write

def foo(x, l=[], N=len(l)): [...]

and have it work as expected because defaults are evaluated on call (this is
one of the very rare occasions of an obvious design wart in python, IMO).
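
For anyone who hasn't run into the wart: in Python the default is
evaluated once, at definition time, which bites exactly here. A minimal
demonstration, plus the usual workaround:

    def foo(x, l=[]):
        # The [] above was created once, when "def" ran, and is shared
        # between every call that omits the argument.
        l.append(x)
        return l

    print foo(1)    # -> [1]
    print foo(2)    # -> [1, 2], not [2]!

    def bar(x, l=None):
        # Evaluating the default by hand on each call restores the
        # behaviour CL gives you for free.
        if l is None:
            l = []
        l.append(x)
        return l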

In other cases, of course, more dynamism seems to involve added complexity.
For example CL has sane (dynamically scoped) global variables (and local ones,
if you declare them "special"). I think python is somewhat broken in this
regard, but I must admit I have no idea how to implement dynamically scoped
variables simply and "pythonically", so I wouldn't call it an obvious design
flaw.
BTW, regarding your first point on interactive sessions, are you aware
of Michael Hudson's recipe
"automatically upgrade class instances on reload()"
http://aspn.activestate.com/ASPN/Coo.../Recipe/160164 ?


Thanks, it's nice to be aware of other solutions, I'll have a closer look at
some point. I've of course also written my own code for that purpose -- apart
from ipython.el and a couple of private utilities I even got sufficiently
pissed off by bugs introduced by python's poor support for serious interactive
work that I started writing some sort of versioned module system that kept
track of what was being imported from where to where, but I ran out of time
and ultimately for this and unrelated reasons switched to matlab in this
particular case.

Matlab sucks in countless ways, but it gives a superior interactive
environment. If you do experimental work where state is expensive to recompute
from scratch but where you need to tune various parameters to obtain the
desired results, problems introduced by changes not properly propagating are
very, very irritating -- especially if you want to keep a record of what changes
affected what, so that your experiments are repeatable.

'as
Feb 22 '06 #99
Carl Friedrich Bolz wrote:
Well. "... the PyPy team works on ..." is definitely much too strong.
It is more like "... the PyPy team is thinking about ...". It is very
unclear whether it will work on a technical level and whether the EU
will allow us to allocate resources accordingly.


In this thread
http://groups.google.com/group/comp....32f0d48c9e7be9
Christian Tismer said that he would work on this, even if this is not
pursued by the PyPy team, because he has personal reasons for doing it.

So I'm confident that sooner or later, it will be possible to create
extension modules written in RPython.

Feb 22 '06 #100

This thread has been closed and replies have been disabled. Please start a new discussion.

Similar topics

699
by: mike420 | last post by:
I think everyone who used Python will agree that its syntax is the best thing going for it. It is very readable and easy for everyone to learn. But, Python does not a have very good macro...
303
by: mike420 | last post by:
In the context of LATEX, some Pythonista asked what the big successes of Lisp were. I think there were at least three *big* successes. a. orbitz.com web site uses Lisp for algorithms, etc. b....
6
by: Simo Melenius | last post by:
Hi, I'm wondering (after a bit of googling) whether there exists a Python binding to any open source Lisp environment (like librep or some Scheme or Common Lisp implementation) that could be...
0
by: Simo Melenius | last post by:
I'm posting a self-followup to my post in last December about Python and Lisp integration: <URL:http://groups-beta.google.com/group/comp.lang.python/msg/ff6345845045fb47?hl=en> Now, just...
37
by: seberino | last post by:
I've been reading the beloved Paul Graham's "Hackers and Painters". He claims he developed a web app at light speed using Lisp and lots of macros. It got me curious if Lisp is inherently faster...
12
by: Tolga | last post by:
Hi everyone, I am using Common Lisp for a while and nowadays I've heard so much about Python that finally I've decided to give it a try becuase Python is not very far away from Lisp family. I...
267
by: Xah Lee | last post by:
Python, Lambda, and Guido van Rossum Xah Lee, 2006-05-05 In this post, i'd like to deconstruct one of Guido's recent blog about lambda in Python. In Guido's blog written in 2006-02-10 at...
21
by: Alok | last post by:
While posting a comment on http://www.reddit.com I got an error page with the following curious statement on it. "reddit broke (sorry)" "looks like we shouldn't have stopped using lisp..." ...
852
by: Mark Tarver | last post by:
How do you compare Python to Lisp? What specific advantages do you think that one has over the other? Note I'm not a Python person and I have no axes to grind here. This is just a question for...
0
by: Charles Arthur | last post by:
How do i turn on java script on a villaon, callus and itel keypad mobile phone
0
by: ryjfgjl | last post by:
If we have dozens or hundreds of excel to import into the database, if we use the excel import function provided by database editors such as navicat, it will be extremely tedious and time-consuming...
0
by: ryjfgjl | last post by:
In our work, we often receive Excel tables with data in the same format. If we want to analyze these data, it can be difficult to analyze them because the data is spread across multiple Excel files...
0
by: emmanuelkatto | last post by:
Hi All, I am Emmanuel katto from Uganda. I want to ask what challenges you've faced while migrating a website to cloud. Please let me know. Thanks! Emmanuel
0
BarryA
by: BarryA | last post by:
What are the essential steps and strategies outlined in the Data Structures and Algorithms (DSA) roadmap for aspiring data scientists? How can individuals effectively utilize this roadmap to progress...
1
by: nemocccc | last post by:
hello, everyone, I want to develop a software for my android phone for daily needs, any suggestions?
0
marktang
by: marktang | last post by:
ONU (Optical Network Unit) is one of the key components for providing high-speed Internet services. Its primary function is to act as an endpoint device located at the user's premises. However,...
0
by: Hystou | last post by:
Most computers default to English, but sometimes we require a different language, especially when relocating. Forgot to request a specific language before your computer shipped? No problem! You can...
0
Oralloy
by: Oralloy | last post by:
Hello folks, I am unable to find appropriate documentation on the type promotion of bit-fields when using the generalised comparison operator "<=>". The problem is that using the GNU compilers,...

By using Bytes.com and it's services, you agree to our Privacy Policy and Terms of Use.

To disable or enable advertisements and analytics tracking please visit the manage ads & tracking page.