Bytes IT Community

Python's biggest compromises

I have been reading a book about the evolution of the Basic
programming language. The author states that Basic - particularly
Microsoft's version - is full of compromises that crept in over the
language's 30+ year evolution.

What do you think Python's largest compromises are?

The three that come to my mind are significant whitespace, dynamic
typing, and that it is interpreted - not compiled. These three put
python under fire and cause some large projects to move off python or
relegate it to prototyping.

Whitespace is an esthetic preference that makes Andrew Hunt and David
Thomas (of Pragmatic Programmer fame) prefer Ruby. Personally, I love
it - but I can see why some people might not like it (30 years of
braces).

Dynamic typing causes the most fuss. I like Guido's answer to the
question -
"Doesn't dynamic typing cause more errors to creep into the code because you catch them later than compile time?". "No, we use Unit Testing in Zope".
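Guido's point can be made concrete with a small invented example: the same mistake a static compiler would flag at compile time is caught by an ordinary unit test instead (function and test names are illustrative, not from any real project):

```python
def total(prices):
    # A static compiler would reject total("abc") at compile time;
    # under dynamic typing an ordinary test catches the same mistake.
    return sum(prices)

def test_total():
    assert total([1, 2, 3]) == 6
    try:
        total("abc")              # wrong type slips past "compilation"...
    except TypeError:
        return True               # ...but the test run flags it
    return False

assert test_total()
```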


That said, Basic obviously compromised by using things such as "Option
Explicit", thereby allowing both dynamic and more static style
variables. Yahoo Groups moved from Python to C due to dynamic typing.

Non-compiled - obviously there are times when performance matters more
than other things. Google, I believe, uses (or used) Python to
prototype and then turns to C++ for heavy lifting.

What about immutable strings? I'm not sure I understand Guido's
preference for them.

Anthony
http://xminc.com/anthony
Jul 18 '05 #1
65 Replies


On 31 Jul 2003 06:55:52 -0700, an************@hotmail.com
(Anthony_Barker) wrote:
I have been reading a book about the evolution of the Basic
programming language. The author states that Basic - particularly
Microsoft's version - is full of compromises that crept in over the
language's 30+ year evolution.

What do you think Python's largest compromises are?

[snip whitespace, dynamic typing, interpreted]

I don't see those as compromises, but mostly as assets.

Significant whitespace (you probably mean significant indentation -
whitespace isn't more or less significant in Python than in other
modern languages) I have only experienced as a boost in readability,
clarity and, most of all, consistency; and there's no possibility of
'brace style wars'.

Dynamic typing vs. static typing has already long ago reached the
status of holy war, so I'll decline to comment.

That Python is not (yet) compiled is mostly a non-issue (and if PyPy
is a success, it won't even be that). If it were just about
performance, then coding the really performance-intensive parts in C
should suffice, apart from kernel hacking and the like. In my
experience, the decision to convert a (successfully functioning)
project from 'a scripting language' to C/C++/Java has always been a
political one, and not really based on technical considerations.

That said, the only large compromise in Python language design I can
detect, is the decision to be quite strictly backwards-compatible
between versions, which is definitely not a bad thing, as long as the
language doesn't go baroque because of it. And Python 3.0 will
hopefully throw out any accumulated cruft.
--Christopher
Jul 18 '05 #2

> What do you think Python's largest compromises are?

There aren't any.

You want to base significant projects on the highest level, most dynamic
tools available (Python), then use Python as a wrapper to hide static
inflexibilities and inferiorities when descending to lower levels for
whatever (usually spurious) reasons. For example, there is a huge
difference between using wxWindows and wxPython, but the performance
difference between wxWindows and wxPython is insignificant.

Edward
--------------------------------------------------------------------
Edward K. Ream email: ed*******@charter.net
Leo: Literate Editor with Outlines
Leo: http://webpages.charter.net/edreamleo/front.html
--------------------------------------------------------------------
Jul 18 '05 #3

Anthony_Barker wrote:
(snip)

What do you think Python's largest compromises are?

The three that come to my mind are significant whitespace, dynamic
typing, and that it is interpreted - not compiled.


IMHO these are not compromises, but features.

Bruno

Jul 18 '05 #4

In article <vi************@news.supernews.com>, John Roth
<ne********@jhrothjr.com> writes
[...] High performance isn't Python's target. If PyPy ever gets their act
off the ground, then we will have a shot at a good quality JIT
interpreter. Then watch it fly.
Doesn't psyco already attempt to do JIT? It certainly doesn't
speed things up that much. If all the variables are known to be of a
specific type then you could expect C like speeds from a good JIT, but
without global analysis it's hard to see how we infer/guarantee python
types.
--
Robin Becker
Jul 18 '05 #5

"John Roth" <ne********@jhrothjr.com> writes:
[...]
That said, I've come to the conclusion that the editor should take
care of these things for you. If you prefer a brace free notation,
you should be able to tell your editor to present the program to you
that way. If you prefer braces, then it should be able do that for
you as well. That kind of stylistic thing doesn't belong in the
language.
100% agreed: once-and-only-once dictates this. Manually maintaining
both braces and indentation (as in C, with some editors) is Just Plain
Bad for this reason.

In fact, if I didn't have to deal with the braces, I think I'd come
around to the view that the text should have them. Explicit is
better than implicit,
At least in the absence of proper research results, I think this part
of it *is* religious. To some people, it's obvious that whitespace is
better on the eyes. To others, it's obvious that whitespace + braces
is better on the eyes. One group may well be wrong <0.5 wink>.

This argument (that braces are another visual cue which make code
easier to read -- provided that they're automatically maintained in
sync with the indentation), is the only argument against
pure-whitespace that has ever made any sense to me. Irrespective of
whether you pick whitespace or braces as the thing the compiler takes
notice of, though, the optimal solution probably still involves an
editor that supports showing/hiding braces, so the syntax is
(theoretically!) a non-issue. In practice, editors don't seem to
currently support showing and hiding braces automatically. At least
editors do get you out of most mistakes caused by brace-y languages.

Of course, whitespace *is* still superior because the storage format
doesn't duplicate state -- so people with poor editors can't produce
ambiguous code :-) (remember code can be ambiguous to *people* even if
it isn't to a compiler). OTOH, *Python's* scheme is inferior to a
pure-space-character indentation scheme because of the tab-vs.-space
issue :-(

and there are a few things that the
automatic indentation makes rather difficult in the design area.

[...]

What are those things??
John
Jul 18 '05 #6

In article <89*************************@posting.google.com> ,
Anthony_Barker wrote:
I have been reading a book about the evolution of the Basic
programming language. The author states that Basic - particularly
Microsoft's version - is full of compromises that crept in over the
language's 30+ year evolution.

What do you think Python's largest compromises are?

The three that come to my mind are significant whitespace, dynamic
typing, and that it is interpreted - not compiled. These three put
python under fire and cause some large projects to move off python or
relegate it to prototyping.
I don't view any of these as "compromises". That word suggests that
something was conceded, or that an intermediate position between two
extremes was chosen to appease. I don't think that either sense really
applies to these features.

The three items that you listed are merely design choices. While arguments
over them are continuous, two of the design choices (interpreter, dynamic
typing) are consistent with Python's intended use as a language which
excels at rapid prototyping. The third (white space) is merely a stylistic
choice which is designed to encourage readable programs.

"Compromises" in language design occur usually when a committee tries to
standardize a language, and each has differing views about how the language
should be used. While this occurs somewhat in Python, other languages
have suffered more mightily from this particular disorder.

Mark

Jul 18 '05 #7

It seems to me that a big compromise/feature is that all kinds of
namespaces are usually represented by dictionaries, and that Python
exposes this fact to the programmer. This would seem to limit the
possible optimizations that can easily be performed by a compiler.
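Aaron's observation is easy to demonstrate: class, instance, and module namespaces all surface as ordinary (dict-like) mappings that code can read and mutate at runtime, which is exactly what constrains an optimizer:

```python
class C:
    x = 1

c = C()
c.y = 2

# class and instance namespaces are visible as mappings
assert C.__dict__['x'] == 1
assert c.__dict__ == {'y': 2}

# the module namespace is a real, mutable dict
globals()['z'] = 3
assert z == 3    # the new binding is immediately live
```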

BTW, I have only read about Python out of interest, and haven't
actually used it for anything, so I hope my remark isn't ignorant.

Best regards,
Aaron
Jul 18 '05 #8


"Anthony_Barker" <an************@hotmail.com> wrote in message
news:89*************************@posting.google.co m...
What do you think Python's largest compromises are?
A compromise is an in-between position or decision. Example: wife
wants to go to a horse show, husband to an auto race, so they
compromise and go to a horse race.
The three that come to my mind are significant whitespace, dynamic
typing, and that it is interpreted - not compiled.
The first two are end-point positions, not in-between compromises.
The third is a matter of definition and implementation. CPython
compiles to version-dependent but otherwise portable PyCode. PyRex,
Weave, and Psyco all compile to C or machine code.
These three put python under fire
Anything can be put under fire by anyone who wants to shoot.
and cause some large projects to move off python or
This sort of statement remains an opinion or impression until backed
by evidence.
relegate it to prototyping.
This is one of its intended uses.
Whitespace is an esthetic preference that make Andrew Hunt and David
Thomas (of Pragmatic Programmer fame) prefer Ruby.
Evidence? And I mean evidence that whitespace is *the* reason and not
just a convenient summary of an overall esthetic preference.

In any case, so what? Different strokes for different folks. Do they
also use indentation for human readers? If so, they have assigned
themselves the task of keeping brackets and indents in sync so that
human and machine 'see' the same structure.

I see two good uses for brackets:
1. machine-generated code never intended for human eyes
2. redundancy for transmission error detection by a processor that
compares brackets and indents and raises a flag on mismatches.

A compromise in the area of structure indication would be accepting
either brackets or indents or both.
Yahoo groups moved from python to C due to dynamic typing.
Evidence? Evidence as to what exactly happened (it is not common
knowledge that I know of) and that any such change was a reasoned
technical decision and not politics.

If memory serves me right, Guido has tried a couple of compromises to
slightly limit dynamicity that he could not see much use for and has
backed off at least partly when current users presented use cases that
the change would break.
Non-compiled - obviously there are times when performance matters more than other things. Google, I believe, uses (or used) Python to prototype and then turns to C++ for heavy lifting.


This is a simple matter of economic tradeoff. A week of programmer
time costs roughly the same as, say, a year of pc time. A roaring
success like Google has hundreds (thousands?) of servers around the
world running the same relatively stable code. Faster code means
machines not bought, installed, and maintained. But I imagine that
their main production code is pretty far out on the frequency-of-use
curve.

Terry J. Reedy
Jul 18 '05 #9


Anthony> What do you think Python's largest compromises are?

Anthony> The three that come to my mind are significant whitespace,
Anthony> dynamic typing, and that it is interpreted - not compiled.
Anthony> These three put python under fire and cause some large projects
Anthony> to move off python or relegate it to prototyping.

Your message is sure to get the pot boiling. I don't think of any of the
above as compromises. They were all design decisions. Considering the
whitespace issue, calling it a compromise suggests that Guido had to cave in
to some outside forces. He couldn't decide between BEGIN/END or {/} as
block delimiters, so he chose significant whitespace. It doesn't make
sense.

Anthony> What about immutable strings? I'm not sure I understand Guido's
Anthony> preference for them.

Performance is one reason. The Python interpreter creates a huge number of
strings at runtime. Knowing exactly how long the string is going to be and
that it will not grow means that a single malloc can be used to allocate the
object header and the storage for the data. If strings were mutable, the
structure of the string object storage would probably be much different and
you'd need at minimum two mallocs per string, one for the object header and
one for the data itself.
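The user-visible half of that design choice is that every "modification" produces a new string object, which in turn is what makes strings safely hashable; a quick illustration:

```python
s = "immutable"
t = s.upper()              # every "modification" returns a new string
assert s == "immutable"    # the original is untouched
assert t == "IMMUTABLE"

# Because a string can never change, it is safely hashable and
# usable as a dict key with no defensive copying.
d = {s: 1}
assert d["immutable"] == 1
```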

Skip
Jul 18 '05 #10

On Thu, 2003-07-31 at 08:55, Anthony_Barker wrote:
The three that come to my mind are significant whitespace, dynamic
typing, and that it is interpreted - not compiled. These three put
python under fire and cause some large projects to move off python or
relegate it to prototyping.


I think these are valid as "compromises", not to be confused with
flaws. These are all explicit choices, and ones for which the original
justifications remain valid. A lot of stuff in Basic is simply flaws,
or based on justifications that no longer apply to today's programming
world.
Anyway, I might add mine: the nature of modules as executed code is a
compromise. That is, a module isn't a declaration, it's a program to be
executed in its own namespace. When you import a module, you are
executing the module then looking at the namespace.

There are some advantages to this, particularly in the transparency of
the implementation -- things don't always work the way you might want
(e.g., circular imports), but it's usually not that hard to understand
why (and often the way you want things to work has nasty gotchas that
you wouldn't have thought of). It also opens up a lot of possibilities
for dynamism in class and function declaration, like doing this in the
top level:

if something:
    def func(x): ...
else:
    def func(x): ...

But it's a compromise, because it makes things more difficult as well.
It's a *big* problem in any long-running process, where you may want to
modify code underlying the system without rebuilding the entire state.
Classes aren't declared, they are simply constructed, so by reloading a
module all the persistent instances still exist and refer to the defunct
class. You can modify classes at runtime, but this is different from
simply rerunning the class definition. (A clever metaclass *could* make
those equivalent, though... hmmm...)
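The stale-instance problem can be reproduced without an actual reload just by re-executing a class statement, which is essentially what re-running a module does:

```python
class Widget:
    def version(self):
        return 1

w = Widget()

# Re-executing the class statement (as reload() would) binds the name
# to a brand-new class object; the old instance never notices.
class Widget:
    def version(self):
        return 2

assert w.version() == 1            # persistent instance is stuck
assert Widget().version() == 2     # only new instances see the change
assert type(w) is not Widget       # w still refers to the defunct class
```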

A related problem is that Python objects generally can accept any
instance variable names, and names are not declared anywhere. Again,
this makes it difficult to deal with long-lived objects. If you change
the class so that all instances get a new attribute, old objects won't
be updated. I'm thinking about both of these things in terms of
Smalltalk, where they make tools possible that really add to the ease of
developing in its environment.
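A small sketch of that upgrade problem (the class and attribute names are invented for illustration):

```python
class Point:
    def __init__(self):
        self.x = 1

old = Point()

# "Upgrade" the class so new instances carry an extra attribute.
def patched_init(self):
    self.x = 1
    self.z = 2
Point.__init__ = patched_init

new = Point()
assert new.z == 2
assert not hasattr(old, "z")   # the long-lived object was never updated
```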

Not that I don't like the fun tricks Python lets you do. Prototype-like
programming (as in Self) is very accessible in Python, and classes are
only a suggestion not a dominant concept. So, it's a compromise.
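For example, behavior can be hung on a single object rather than on its class, much as in Self (names invented):

```python
class Thing:
    pass

a = Thing()
b = Thing()

# per-object behavior, attached without touching the class
a.describe = lambda: "customized"
assert a.describe() == "customized"
assert not hasattr(b, "describe")   # sibling instances are unaffected
```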

There are lots and lots of compromises in Python -- every aspect has
pluses and minuses to it. Personally I like whitespace sensitivity well
enough, but in the larger sense I think it probably was the wrong choice
-- but that's based on how I weigh various benefits and problems, and
other people will validly weigh them differently.

Ian

Jul 18 '05 #11


"Robin Becker" <ro***@jessikat.fsnet.co.uk> wrote in message
news:TB**************@jessikat.fsnet.co.uk...
In article <vi************@news.supernews.com>, John Roth
<ne********@jhrothjr.com> writes
[...]
High performance isn't Python's target. If PyPy ever gets their act
off the ground, then we will have a shot at a good quality JIT
interpreter. Then watch it fly.

Doesn't psyco already attempt to do JIT? It certainly doesn't
speed things up that much. If all the variables are known to be of a
specific type then you could expect C like speeds from a good JIT, but
without global analysis it's hard to see how we infer/guarantee python
types.


Well, that's certainly a problem, but Bicycle Repair Man seems to
do a pretty good job of type inference, at least for refactoring.

One of the things to consider here is that a decent JIT interpreter
would automatically change the playing field for what is good
practice and what isn't, at least if you expect performance.

John Roth

Jul 18 '05 #12

On Thu, 2003-07-31 at 16:27, John Roth wrote:
"Robin Becker" <ro***@jessikat.fsnet.co.uk> wrote in message
news:TB**************@jessikat.fsnet.co.uk...
In article <vi************@news.supernews.com>, John Roth
<ne********@jhrothjr.com> writes

[...]
High performance isn't Python's target. If PyPy ever gets their act
off the ground, then we will have a shot at a good quality JIT
interpreter. Then watch it fly.

Doesn't psyco already attempt to do JIT? It certainly doesn't
speed things up that much. If all the variables are known to be of a
specific type then you could expect C like speeds from a good JIT, but
without global analysis it's hard to see how we infer/guarantee python
types.


Well, that's certainly a problem, but Bicycle Repair Man seems to
do a pretty good job of type inference, at least for refactoring.


And Java's JIT is based on (at least originally) work done on Self,
which had to do type inference. And actually in many circumstances Java
requires type inference, because you can substitute in an instance of a
subclass.

Anyway, JIT is all about runtime analysis -- if you could infer types
completely before running the program, you would just put in the
optimizations statically (i.e., compiling optimizations). JIT does
those optimizations at runtime by definition.

And Bicycle Repair Man is inspired by the Refactoring Browser, an IDE
tool based on another dynamic language (Smalltalk), not on a tool from a
static language (like Java).

Ian

Jul 18 '05 #13

an************@hotmail.com (Anthony_Barker) wrote in message news:<89*************************@posting.google.c om>...
What do you think Python's largest compromises are?


I think reference counting is. Added to this the fact that
garbage collection is also possible (as in Jython). So we get
the worst of both worlds.

1. Idioms like this won't work portably:

def foo():
    f = file('x')
    return f.read()

Even though f.__del__ closes the file, in garbage
collected environment the f object might not get deleted
in a good while, and would cause problems.

2. And then, we still need to use weakref to ensure
that our crossreferenced stuff works both with and without GC.

Worst of both indeed. Maybe the decision to choose reference
counting was driven by speed considerations. That might've
been reasonable back in early 90's, but GC techniques have
evolved from those days and so GC would be a superior
technique now.
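For completeness, the idiom in point 1 can be made portable across GC strategies by closing the file explicitly instead of relying on refcount-triggered finalization; a minimal sketch:

```python
def foo():
    f = open('x')      # open() rather than the 2.x file() builtin
    try:
        return f.read()
    finally:
        f.close()      # runs under any GC strategy, not just refcounting
```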

Well, since no one else pointed this out yet, maybe there's
some flaw in my reasoning.
Jul 18 '05 #14

Dennis Lee Bieber:
Whereas BASIC started life as an interpreted language wherein every statement had a line number, conditionals (IF) did jumps to line
numbers, and all variables were global (even across subroutine calls).


Not interpreted.
See http://wombat.doc.ic.ac.uk/foldoc/fo...artmouth+BASIC
] Dartmouth BASIC
] <language> The original BASIC language [...] Unlike most later
] BASIC dialects, Dartmouth BASIC was compiled

or a more detailed description at http://www.kbasic.org/1/history.php3
which says the first interpreted BASIC was the Micro-Soft one for
the Altair.

In more seriousness, compiled v. interpreted is an implementation
detail which has very little impact on the language. The other examples
you gave are more relevant.

One thing to consider is - what's the *compromise* in the different
versions of BASIC? That is, how was the support for backwards-
compatible operations in BASIC (which you list as a compromise)
any different than Python's backwards compatibility support?

Andrew
da***@dalkescientific.com
Jul 18 '05 #15

Anthony_Barker wrote:
| I have been reading a book about the evolution of the Basic
| programming language. The author states that Basic - particularly
| Microsoft's version is full of compromises which crept in along the
| language's 30+ year evolution.
|
| What do you think Python's largest compromises are?
|
IMHO it is the lambda expression.
These "functions" are not real functions. You cannot use
statements in them.

What Python really needs here is some means to make an expression
from a list of statements (called a suite in the syntax definition).
That given, PEP 308 (If-then-else expression) and PEP 318 (Function/Method
Decorator Syntax) are mostly pointless.

Let me give an example:

Suppose there were some special braces like (: :) and an exit operator
like a unary ^; one could write a conditional expression like this:

(:
    if f():
        ^trueValue
    else:
        ^falseValue :)

classmethods could be declared as follows:

cm = (:
    def im( arg0, arg1 ):
        return answer
    ^classmethod( im ) :)

or

cm = classmethod( (:
    def im( arg0, arg1 ):
        return answer
    ^im
    :) )

Obviously this demands some means to write anonymous functions like

def ( arg0, arg1 ):
    return arg0 - arg1

Semantically this should transform to

(:
    def newName( arg0, arg1 ):
        return arg0 - arg1
    ^newName :)

giving

cm = def ( arg0, arg1 ):
    return answer

Ok, I admit that this is difficult to integrate into
the existing syntax. Perhaps we cannot drop the braces
around such expressions.

Is this worth writing a PEP?

|
|>"No, we use Unit Testing in Zope".
I am still missing a simple testing framework. doctest
is a good idea, but conflicts with syntax highlighting in most
editors.

|
|
| That said, obvious Basic compromised by using things such as "Option
| Explicit", thereby allowing both dynamic and more static style
| variables. Yahoo groups moved from python to C due to dynamic typing.
This is not a problem. The key to speed is using extensions written
in C for performance. Normally you will find one that solves your problem.

|
| Non-compiled - obviously there are times when performance matters more
| than other things. Google I believe uses python to prototype (or used)
| and then turns to c++ for heavy lifting.
|
| What about immutable strings? I'm not sure I understand Guido's
| preference for them.
In fact the array module provides mutable strings.

|
| Anthony
| http://xminc.com/anthony

HTH,
Gerald

Jul 18 '05 #16

In article <ma**********************************@python.org >, Ian
Bicking <ia**@colorstudy.com> writes
And Java's JIT is based on (at least originally) work done on Self,
which had to do type inference. And actually in many circumstances Java
requires type inference, because you can substitute in an instance of a
subclass.

Anyway, JIT is all about runtime analysis -- if you could infer types
completely before running the program, you would just put in the
optimizations statically (i.e., compiling optimizations). JIT does
those optimizations at runtime by definition.

but Java does at least require specifying every type and that must at
least cut down on the amount of work required.
And Bicycle Repair Man is inspired by the Refactoring Browser, an IDE
tool based on another dynamic language (Smalltalk), not on a tool from a
static language (like Java).

Ian


I don't have any data here, but I believe Python is just a little too
weakly typed for compiling to float*float type assembler efficiently.
--
Robin Becker
Jul 18 '05 #17


"Robin Becker" <ro***@jessikat.fsnet.co.uk> wrote in message
news:23**************@jessikat.fsnet.co.uk...
In article <ma**********************************@python.org >, Ian
Bicking <ia**@colorstudy.com> writes
And Java's JIT is based on (at least originally) work done on Self,
which had to do type inference. And actually in many circumstances Java
requires type inference, because you can substitute in an instance of a
subclass.

Anyway, JIT is all about runtime analysis -- if you could infer types
completely before running the program, you would just put in the
optimizations statically (i.e., compiling optimizations). JIT does
those optimizations at runtime by definition.

but Java does at least require specifying every type and that must at
least cut down on the amount of work required.
And Bicycle Repair Man is inspired by the Refactoring Browser, an IDE
tool based on another dynamic language (Smalltalk), not on a tool from a
static language (like Java).

Ian


I don't have any data here, but I believe Python is just a little too
weakly typed for compiling to float*float type assembler efficiently.


The trick with JITs is that they don't depend on absolute type
consistency. They depend on the observation that 99.44% of your
code is type consistent, and that consistency will turn up at run time. So
the code they generate depends on that discovered consistency, and
checks in front of each section to discover if the types are what the
code expects.

If it is, they execute it; if it isn't, they abandon it and go back to
the interpreter to discover what happened.
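In Python terms, the guard-then-fall-back pattern looks roughly like this hand-written sketch (illustrative only, not what any real JIT emits):

```python
def generic_add(a, b):
    # the slow, fully dynamic path the interpreter always supports
    return a + b

def jitted_add(a, b):
    # guard: the specialized path assumes both operands are ints
    if type(a) is int and type(b) is int:
        return a + b            # "compiled" fast path
    return generic_add(a, b)    # guard failed: abandon and go generic

assert jitted_add(2, 3) == 5
assert jitted_add("a", "b") == "ab"   # falls back to the generic path
```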

John Roth

Jul 18 '05 #18

ha******@yahoo.com.au (Hannu Kankaanpää) writes:
Worst of both indeed. Maybe the decision to choose reference
counting was driven by speed considerations.
Ease of implementation, portability and playing nicely with C
extensions are more likely candidates, IMO.
That might've been reasonable back in early 90's, but GC techniques
have evolved from those days and so GC would be a superior technique
now.


<button nature="hot">
Reference counting *is* a form of garbage collection.
</button>

Saying "Ref. counting sucks, let's use GC instead" is a statement near
as dammit to meaningless.

Given the desires above, I really cannot think of a clearly better GC
strategy for Python than the one currently employed. AFAICS, the
current scheme's biggest drawback is its memory overhead, followed by
the cache-thrashing tendencies of decrefs.

What would you use instead?

Cheers,
mwh

--
After a heavy night I travelled on, my face toward home - the comma
being by no means guaranteed. -- paraphrased from cam.misc
Jul 18 '05 #19


"Hannu Kankaanpää" <ha******@yahoo.com.au> wrote in message
news:84**************************@posting.google.c om...
an************@hotmail.com (Anthony_Barker) wrote in message

news:<89*************************@posting.google.c om>...
What do you think Python's largest compromises are?


I think reference counting is. Added to this the fact that
garbage collection is also possible (as in Jython). So we get
the worst of both worlds.

1. Idioms like this won't work portably:

def foo():
    f = file('x')
    return f.read()

Even though f.__del__ closes the file, in garbage
collected environment the f object might not get deleted
in a good while, and would cause problems.

2. And then, we still need to use weakref to ensure
that our crossreferenced stuff works both with and without GC.

Worst of both indeed. Maybe the decision to choose reference
counting was driven by speed considerations. That might've
been reasonable back in early 90's, but GC techniques have
evolved from those days and so GC would be a superior
technique now.

Well, since no one else pointed this out yet, maybe there's
some flaw in my reasoning.


There's a flaw in your reasoning. The various techniques that
descend from mark and sweep (which is what you're
calling garbage collection) depend on being able to
identify all of the objects pointed to. For objects that are
owned by Python, that's a lengthy (that is, inefficient)
process, and it's not possible in general for objects that
are created by extensions.

Reference counting only depends on having the
object itself, and control of the creation and removal
of references. The latter is a frequent source of bugs
and memory leaks in extensions.

It's easy to say that various languages would be improved
by adding "real" garbage collection, but those techniques
impose significant design constraints on the implementation
model.

John Roth
Jul 18 '05 #20

Michael Hudson wrote:
<button nature="hot">
Reference counting *is* a form of garbage collection.
</button>

Saying "Ref. counting sucks, let's use GC instead" is a statement near
as dammit to meaningless.
This statement is not meaningless because most programmers will correctly
identify GC in this context as something like mark-and-sweep, generation
scavenging etc.

see also: DOS sucks, let's use an operating system instead.
current scheme's biggest drawback is its memory overhead, followed by
the cache-thrashing tendencies of decrefs.


plus it doesn't support independent threads as all reference counting would
have to be protected, leading to poor performance.

But a lot of Python code depends on reference counting or more exactly it
depends on the timely call of the destructor. So even if a much better GC is
added to Python, reference counting would perhaps be kept for backwards
compatibility (see Python's biggest compromises)
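The timing guarantee such code relies on is easy to observe under CPython's reference counting (a tracing collector would make no such promise about when the finalizer runs):

```python
log = []

class Resource:
    def __del__(self):
        log.append("released")

r = Resource()
del r                        # refcount drops to zero here...
assert log == ["released"]   # ...and the finalizer ran immediately
```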

Daniel

Jul 18 '05 #21

John Roth wrote:
There's a flaw in your reasoning. The various techniques that
descend from mark and sweep (which is what you're
calling garbage collection) depend on being able to
identify all of the objects pointed to. For objects that are
owned by Python, that's a lengthy (that is, inefficient)
That's what generation scavenging was developed for. One shouldn't argue by
tradition alone, but the fact that the major implementations of dynamic
languages like LISP and Smalltalk don't use reference counting should carry
some weight.
process, and it's not possible in general for objects that
are created by extensions.
This is generally handled by registering and unregistering objects in the
extension code. Error prone as well, but probably less so than reference
counting.
It's easy to say that various languages would be improved
by adding "real" garbage collection, but those techniques
impose significant design constraints on the implementation
model.


True. But one could review these constraints from time to time.

Daniel

Jul 18 '05 #22

Michael Hudson <mw*@python.net> writes:
One shouldn't argue by tradition alone, but the fact that the major
implementations of dynamic languages like LISP and Smalltalk don't
use reference counting should carry some weight.


True. But the major implementations of these languages are also
usually less portable, and something more of a fiddle to write C
extensions for (at least, for the implementations I know about, which
are mostly CL impls).


I'd say the opposite, the Lisp implementations I've worked on are
considerably easier to write C extensions for, partly BECAUSE you
don't have to worry about constantly tweaking ref counts. In GNU
Emacs Lisp, for example, if you cons a new heap object and put it in a
C variable and (iirc) then call eval, you have to call a macro that
tells the GC not to sweep the object. But many C functions don't make
new objects, and most don't call eval, and you don't have to remember
what objects you've called the macro for. There's another macro that
you call before your function returns, and that cleans up all the GC
records in your stack frame made by any invocations of the first macro.
Jul 18 '05 #23

P: n/a
In article <vi************@news.supernews.com>, John Roth
<ne********@jhrothjr.com> writes

I don't have any data here, but I believe Python is just a little too
weakly typed for compiling to float*float type assembler efficiently.


The trick with JITs is that they don't depend on absolute type
consistency. They depend on the observation that 99.44% of your
code is type consistent, and that consistency will turn up at run time. So
the code they generate depends on that discovered consistency, and
checks in front of each section to discover if the types are what the
code expects.

If it is, they execute it; if it isn't, they abandon it and go back to
the interpreter to discover what happened.
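The guard-then-specialize idea can be shown as a toy in plain Python (this
is just the control flow, not how any real JIT is implemented):

```python
def generic_add(a, b):
    # Stands in for the slow, general-purpose interpreter path.
    return a + b

def make_specialized(fallback):
    # "Compiled" fast path, valid only while both operands are floats.
    def specialized(a, b):
        if type(a) is float and type(b) is float:   # the guard
            return a + b                # type-specific fast code
        return fallback(a, b)           # guard failed: back to the interpreter
    return specialized

add = make_specialized(generic_add)
print(add(1.5, 2.5))   # takes the fast path
print(add("a", "b"))   # guard fails, falls back to the generic path
```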

John Roth

Yes I suspected they have to do that, but that implies that a discovered
'float' object must carry along a whole lot of baggage (I guess I mean
be a more generic object) to allow for the testing. Loops without method
or function calls would be good candidates for JIT as methods and
functions could alter attribute types.

Is the JIT object literally just a union of

type,values

or would it be an actual Python object? For example would an
innerproduct be over a pair of lists or would the magic convert these
into actual double arrays.
--
Robin Becker
Jul 18 '05 #24

P: n/a
> > What to you think python largest compromises are?

The three that come to my mind are significant whitespace, dynamic
typing, and that it is interpreted - not compiled. These three put
python under fire and cause some large projects to move off python or
relegate it to prototyping.


I don't view any of these as "compromises". That word suggests that
something was conceded, or that an intermediate position between two
extremes was chosen to appease. I don't think that either sense really
applies to these features.

The three items that you listed are merely design choices. While arguments
over them are continuous, two of the design choices (interpreter, dynamic
typing) are consistent with Python's intended use as a language which
excels at rapid prototyping. The third (white space) is merely a stylistic
choice which is designed to encourage readable programs.

"Compromises" in language design occur usually when a committee tries to
standardize a language, and each has differing views about how the language
should be used. While this occurs somewhat in Python, other languages
have suffered more mightily from this particular disorder.

Mark


Excellent points - you are correct the ones I listed are design
choices.

Some people could interpret them as design "compromises": the
kind of people who would like to use the same tool for all problems.
Jul 18 '05 #25

P: n/a
On Fri, 2003-08-01 at 02:04, Hannu Kankaanpää wrote:
Worst of both indeed. Maybe the decision to choose reference
counting was driven by speed considerations.


Reference counting spreads the speed hit over the entire program, while
other techniques tend to hit performance hard every so often. But
altogether I think reference counting is usually slower than a good GC
algorithm, and incremental garbage collection algorithms can avoid
stalling. And I'm sure that the current state -- reference counting
plus another kind of garbage collection for circular references -- must
be worse than either alone. The advantage is predictable collection
(unless you are using Jython), without memory leaks (due to circular
references).
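The two mechanisms can be watched side by side with the gc module (CPython
behaviour):

```python
import gc

class Node:
    pass

gc.collect()            # start from a clean slate

# Acyclic garbage: reference counting reclaims it on the del,
# no collector pass needed.
a = Node()
del a

# A reference cycle: the refcounts never reach zero, so only the
# separate cycle collector can reclaim it.
b = Node()
b.self_ref = b
del b
print(gc.collect())     # non-zero: the cycle collector had work to do
```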

Oh well...

Ian

Jul 18 '05 #26

P: n/a
Gerald Klix wrote:
What Python realy needs here is some means to make an expression
from a list of statements (called suite in the syntax defintion).


I believe this is called a "function".

--
--OKB (not okblacke)
"Do not follow where the path may lead. Go, instead, where there is
no path, and leave a trail."
--author unknown
Jul 18 '05 #27

P: n/a

"Robin Becker" <ro***@jessikat.fsnet.co.uk> wrote in message
news:MM**************@jessikat.fsnet.co.uk...
In article <vi************@news.supernews.com>, John Roth
<ne********@jhrothjr.com> writes

I don't have any data here, but I believe Python is just a little too
weakly typed for compiling to float*float type assembler efficiently.
The trick with JITs is that they don't depend on absolute type
consistency. They depend on the observation that 99.44% of your
code is type consistent, and that consistency will turn up at run time. So
the code they generate depends on that discovered consistency, and
checks in front of each section to discover if the types are what the
code expects.

If it is, they execute it, if it isn't, they abandon it and go back to
the interpreter to discover what happened.

John Roth

Yes I suspected they have to do that, but that implies that a discovered
'float' object must carry along a whole lot of baggage (I guess I mean
be a more generic object) to allow for the testing. Loops without method
or function calls would be good candidates for JIT as methods and
functions could alter attribute types.

Is the JIT object literally just a union of

type,values

or would it be an actual Python object? For example would an
innerproduct be over a pair of lists or would the magic convert these
into actual double arrays.


As far as I'm aware, the JIT code doesn't fiddle with the data;
it just does the equivalent of assert tests at the beginning of the
blocks to verify that it's got the type of object it expects.

In other words, it does a very large amount of run-time type
checking. This only pays off if it can save even more expense
by compiling the code.

Now, this is going to be difficult for short segments of code,
but it can be quite a time saver if the JIT generated code can
make intermediate objects vanish so they don't have to be
created just to be discarded a short time later.

John Roth
--
Robin Becker

Jul 18 '05 #28

P: n/a

"Anthony_Barker" <an************@hotmail.com> wrote in message
news:89*************************@posting.google.com...
I have been reading a book about the evolution of the Basic
programming language. The author states that Basic - particularly
Microsoft's version is full of compromises which crept in along the
language's 30+ year evolution.

What to you think python largest compromises are?


I'm not sure if we've beaten this one to death or not, but a real
example of a compromise just floated through my head.

Consider <list>.sort() and <list>.reverse(). These two otherwise
admirable methods don't return the object, so they can't be chained.
Why not? Because they update the list in place, and Guido decided
that not returning the object was a cheap way to make it clear that
they were doing something unusual.

Now, *that's* a compromise. The worst of both worlds.
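For example:

```python
words = ["pear", "apple", "mango"]

# sort() mutates the list in place and returns None ...
result = words.sort()
print(result)   # None
print(words)    # ['apple', 'mango', 'pear']

# ... so the methods cannot be chained:
try:
    ["b", "a"].sort().reverse()
except AttributeError as exc:
    print(exc)  # None has no reverse() method
```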

John Roth
Jul 18 '05 #29

P: n/a
Michael Hudson <mw*@python.net> wrote in message news:<7h*************@pc150.maths.bris.ac.uk>...
> <button nature="hot">
> Reference counting *is* a form of garbage collection.
> </button>

You apparently have such a loose definition of garbage
collection that even C programs have "a form of garbage
collection" on modern OSes: all garbage is reclaimed by
the OS when the program exits. It's just a very lazy collector.

I don't consider something a garbage collector unless it
collects all garbage (ref. counting doesn't) and is a bit more
agile than the one provided by the OS.

> Saying "Ref. counting sucks, let's use GC instead" is a statement near
> as dammit to meaningless.

You, I and everyone else know what I was talking about, so it could
hardly be regarded as "meaningless".

> Given the desires above, I really cannot think of a clearly better GC
> strategy for Python than the one currently employed. AFAICS, the
> current scheme's biggest drawback is its memory overhead, followed by
> the cache-trashing tendencies of decrefs.

It's not "the one currently employed". It's the *two* currently
employed, and that causes grief as I described in my previous post.
And AFAIK, Ruby does use GC (mark-and-sweep, if you wish) and
seems to be working. However, this is rather iffy knowledge. I'm
actually longing for real GC because I've seen it work well in
Java and C#, and I know that it's being used successfully in many
other languages.

> What would you use instead?

A trick question?
Jul 18 '05 #30

P: n/a
In general, I suspect BASIC is more defined by compromises than
Python.

To me there is a compromise in Python's dependence on C. It seems that
at some point I will hit a performance or feature issue that will
require me to write a C extension. It seems to me VB6 has a similarly
awkward relationship with C++. Clearly the creators of Python were
expert C programmers; that should not be a requirement to become an
expert Python programmer.

- Marc
Jul 18 '05 #31

P: n/a

"Marc" <ma********@yahoo.com> wrote in message
news:ac**************************@posting.google.com...
In general, I suspect BASIC is more defined by compromises than
Python.

To me there is a compromise in Python's dependence on C. It seems that
at some point I will hit a performance or feature issue that will
require me to write a C extension. It seems to me VB6 has a similarly
awkward relationship with C++. Clearly the creators of Python were
expert C programmers; that should not be a requirement to become an
expert Python programmer.
As a number of people have said: if PyPy ever gets working...

John Roth
- Marc

Jul 18 '05 #32

P: n/a
Paul Rubin:
I'd say the opposite, the Lisp implementations I've worked on are
considerably easier to write C extensions for, partly BECAUSE you
don't have to worry about constantly tweaking ref counts.


I've mentioned in c.l.py before a library I used which can be called
from both C and FORTRAN. The latter doesn't support pointers,
so instead the library has a global instance table, indexed by integers.
The C/FORTRAN code just passes integers around.

In addition, there are dependencies between the objects, which
means that user code object deallocation can only occur in a
certain order.

With CPython it's possible to put a high-level OO interface to
that library, and provide hooks for the ref-counted gc to call
the proper deallocators in the correct order. This is done by
telling the finalizer how to do it and paying careful attention to
order.

The library also has Java bindings. As far as I can tell, it's
impossible to hook into Java's automatic gc. A C-level
gc like Boehm can't ever tell that data is no longer needed,
because the global table keeps a reference to every created
object, and Java's native gc doesn't make the proper
guarantees on finalization order.

Andrew
da***@dalkescientific.com
Jul 18 '05 #33

P: n/a
Gerald Klix:
IHMO it is the lambda expression.
These "functions" are not real functions. You can not use
statements in them.


True. That's why they are "lambda expressions" and not
"lambda functions"

(Okay, that was a cheap shot ... but still true ;)
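The distinction in a snippet:

```python
# A lambda is a single expression -- fine for things like this:
square = lambda x: x * x
print(square(4))   # 16

# But statements are not allowed inside it; e.g. this is a SyntaxError:
#     lambda xs: total = 0
# Anything that needs assignments or loops must be a def-function:
def total(xs):
    result = 0          # assignment: a statement, impossible in a lambda
    for x in xs:        # so is a for loop
        result += x
    return result

print(total([1, 2, 3]))  # 6
```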

Andrew
da***@dalkescientific.com
Jul 18 '05 #34

P: n/a
Marc:
To me there is a compromise in Python's dependence on C.
Then explain Jython, which is an implementation of Python-the-language
on top of Java.
It seems that
at some point I will hit a performance or feature issue that will
require me to write a C extension.
change "will" to "may"

And if you write in C, at some point you may hit a performance
or feature issue that will require you to write assembly code.
Clearly the creators of Python were
expert C programmers; that should not be a requirement to become an
expert Python programmer.


Lack of knowledge of C does not strongly preclude becoming an
expert Python programmer.

To be an expert Python programmer, you should know how other
programming languages work too, but that can be done by
learning other languages: Eiffel, Java, APL, Haskell, Caml, Ada,
Prolog, ...

Andrew
da***@dalkescientific.com
Jul 18 '05 #35

P: n/a
The three that come to my mind are significant whitespace, ...


"Significant whitespace" isn't a "compromise," it's a design choice.
The Python interpreter actually inserts explicit scope tokens into
the symbol stream at the lexer; the parser deals with the symbols as
does any parser. It's really not all that hard, actually. One just
has to understand the bit that making the *parser* deal with the white
space is not the right thing.
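The standard library's tokenize module makes those scope tokens visible
(shown here with modern Python):

```python
import io
import tokenize

source = "if x:\n    y = 1\nz = 2\n"
# Tokenizing doesn't execute anything, so undefined names are fine.
names = [tokenize.tok_name[tok.type]
         for tok in tokenize.generate_tokens(io.StringIO(source).readline)]
print(names)   # includes 'INDENT' after the if-header and 'DEDENT' before z
```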

C//

Jul 18 '05 #36

P: n/a
This is not a function. This is just the same as blocks in Algol 68,
because both do not defer execution until the function is called.

Of course, you can define the semantics of such a block in terms
of a function definition and an immediate call to that function.
This is done in the Scheme language's let syntax.

HTH,
Gerald

OKB (not okblacke) wrote:
| Gerald Klix wrote:
|
|
|>What Python realy needs here is some means to make an expression
|>from a list of statements (called suite in the syntax defintion).
|
|
| I believe this is called a "function".
|


Jul 18 '05 #37

P: n/a
> > it isn't to a compiler). OTOH, *Python's* scheme is inferior to a
pure-space-character indentation scheme because of the tab-vs.-space
issue :-(


Well, that's been recognized as a problem for a long time. I believe
that the intention is to mandate spaces only in Python 3.0. On the
other hand, I'm not holding my breath waiting for Guido to decide


What's bad about tabs? I'm new to Python. Tabs are good because then you
can view the source however you want. I can write in 4-space tabs and
someone else can change it to 2 if they prefer. But I can see the problem
with continuation lines. Also, it must be ALL tabs or ALL spaces. This
shouldn't be too hard -- most editors have the "tabify" option.

Andy
Jul 18 '05 #38

P: n/a
> Many tools don't allow you to configure tabs, and of those that do,
each uses its own incompatible syntax and has its own quirks. In other
words, tabs may seem like a good thing if you use just one or two tools
that do what you want, but as soon as your program moves out into
the wild world, things change.


What are these quirks? By far the most common I've seen is mixing tabs and
spaces, but this should be relatively easily solved by requiring one or the
other (minus continuation lines, which are still a problem). Using spaces
has some disadvantages too, since not everyone will use the same number of
spaces, and editors don't behave as nicely. I like when you hit the arrow
key at a tab, and it jumps the full tab, rather than having to press an
arrow key 4 times.
Jul 18 '05 #39

P: n/a

"Andy C" <ay********@cornell.edu> wrote in message
news:y3********************@newssvr21.news.prodigy .com...
What's bad about tabs?


Because there is no 'official' meaning, despite claims to the
contrary; some software, like Outlook Express, ignores them on receipt.

<flameshield on> tjr

Jul 18 '05 #40

P: n/a

"Andy C" <ay********@cornell.edu> wrote in message
news:mE********************@newssvr21.news.prodigy .com...
Many tools don't allow you to configure tabs, and of those that do,
each uses its own incompatible syntax and has its own quirks. In other
words, tabs may seem like a good thing if you use just one or two tools
that do what you want, but as soon as your program moves out into
the wild world, things change.
What are these quirks? By far the most common I've seen is mixing tabs
and spaces, but this should be relatively easily solved by requiring one
or the other (minus continuation lines, which are still a problem). Using
spaces has some disadvantages too, since not everyone will use the same
number of spaces, and editors don't behave as nicely. I like when you hit
the arrow key at a tab, and it jumps the full tab, rather than having to
press an arrow key 4 times.


I'm used to a certain minimum standard from my editors, and smart
tabs are one of those things that's part of the price of admission these
days, just like syntax highlighting. A programming editor's job is to help
where it's useful, and get out of the way when it's not. When you want
to indent, you hit the tab button. It's the editor's job to know I want,
for example, four spaces, and deliver them. In Python, it's the editor's
job to know that when I hit return at the end of a line, there are only one
or two legitimate places to put the cursor on the next line, and to put
it in the most likely of them.

As to the different number of spaces between developers, that's
another thing I'd expect from my editors. It's easy enough in Python
to figure out what's an indent and infer the number of spaces. I'd
expect a decent editor to be able to load a program and tell me
what the indentation policy was! I'd also expect to be able to tell
it to change it, and have it automatically reindent the program for
me.

John Roth
Jul 18 '05 #41

P: n/a

"Terry Reedy" <tj*****@udel.edu> wrote in message
news:8Z********************@comcast.com...

"Andy C" <ay********@cornell.edu> wrote in message
news:y3********************@newssvr21.news.prodigy .com...
What's bad about tabs?
Because there is no 'official' meanings, despite claims to the
contrary, some software like Outlook Express ignores them on receipt.

<flameshield on> tjr


You can always blame that on Micro$haft. However, it does
make e-mailing a program indented with tabs something of an
adventure.

John Roth

Jul 18 '05 #42

P: n/a
On Sat, 02 Aug 2003 23:31:44 GMT, Andy C wrote:
OK, then unless I'm missing something, tabs vs. spaces shouldn't
matter for you. The editor should be able to handle tabs in a
satisfactory manner as well.


Not if spaces *and* tabs are used for indentation within the one file.
Without editor-specific hacks (the "vim:" comment line, the Emacs mode
line, etc.) there's no way to know what size the previous person's
editor had tab stops set to. The standard size of a tab stop is 8, but
many programmers change this in their editors, resulting in indentation
that gets completely messed up when the tabs are shown at 8 columns
against other lines in the same file that use spaces for indentation.

--
\ "The best is the enemy of the good." -- Voltaire |
`\ |
_o__) |
Ben Finney <http://bignose.squidly.org/>
Jul 18 '05 #43

P: n/a
Andy C:
OK, then unless I'm missing something, tabs vs. spaces shouldn't matter
for you. The editor should be able to handle tabs in a satisfactory
manner as well.


python-mode for Emacs is an example of how to support different
styles of tab/space uses.

But code goes through other tools. Suppose you only use tabs
and you have your editor set for 2 character tabs. Now print
your code. It'll look different because your printer is likely set
for 8 character tabs.

Some people like writing code like this

if ((this_is_a_value > that_long_variable) or
    (this_is_a_value == something_else)):
    ....

that is, align terms of an expression vertically to emphasize
similarities between the lines. Were the second line done in
tabs then differing tab styles would change the alignment,
which ruins the intent. And different tools do have different
tab styles.

Just Use Spaces.

Andrew
da***@dalkescientific.com
Jul 18 '05 #44

P: n/a
Terry Reedy:
Because there is no 'official' meaning, despite claims to the
contrary; some software, like Outlook Express, ignores them on receipt.


I sent email to a friend of mine. It contained a copy&paste from
the interactive prompt. Something like
>>> class Spam:
...     pass


Something on his side (either Exchange or OE) dropped
the "... pass" part of what I sent.

Andrew
da***@dalkescientific.com
Jul 18 '05 #45

P: n/a
Andrew Dalke fed this fish to the penguins on Friday 01 August 2003
01:22 am:

or a more detailed description at http://www.kbasic.org/1/history.php3
which says the first interpreted BASIC was the Micro-Soft one for
the Altair.
I'd swear that the BASIC I learned on -- via a timeshare
Honeywell-Bull using a dial-up from an ASR-33 -- back in 1972 was
interpreted... At the least, we had no separate compile/link/run
phase... We'd invoke BASIC, enter the source, and type RUN.

My college mainframe (Xerox Sigma 6) also had an interpreted BASIC,
and I can't believe that was created for the machine way down in the
late 70s -- when the hardware dates to the late 60s...

I'll concede that those may have been "compile on <ret>", wherein each
statement was compiled, but statement labels (and branching thereby)
was interpreted (look up in a table to get a pointer to the actual
code).
One thing to consider is - what's the *compromise* in the different
versions of BASIC? That is, how was the support for backwards-
compatible operations in BASIC (which you list as a compromise)
any different than Python's backwards compatibility support?
Python's compatibility doesn't /feel/ like a different language;
through all the changes in Python, a Python 1.3 program /looks/ like a
Python 2.2+ program.

But between early 70s BASIC and what passes for BASIC today looks like
totally different languages. Not even FORTRAN underwent that great a
/visual/ change between (say) FORTRAN IV (aka FORTRAN 66) and FORTRAN
77 (essentially adding block IF constructs) and then Fortran 90 -- yes,
the packaging of modules, dynamic allocation, and new line continuation
do make it a different language... but it /still/ LOOKS like FORTRAN on
quick glance... Let a non-programmer look at source files from K&K
type BASIC, Visual BASIC, F-IV, F77, and F90... and he likely will be
able to identify the three Fortrans as being related -- but would not
consider K&K to be a relation of VB.
Then again, I consider Java (and now .NET) to be nothing more than a
reinvention of the UCSD P-Code Pascal system...
Andrew
da***@dalkescientific.com


--
============================================================ <
wl*****@ix.netcom.com | Wulfraed Dennis Lee Bieber KD6MOG <
wu******@dm.net | Bestiaria Support Staff <
============================================================ <
Bestiaria Home Page: http://www.beastie.dm.net/ <
Home Page: http://www.dm.net/~wulfraed/ <


Jul 18 '05 #46

P: n/a
Daniel Dittmar fed this fish to the penguins on Friday 01 August 2003
06:09 am:


see also: DOS sucks, let's use an operating system instead.
WHICH "DOS"... I have source code listings for something called
"K2FDOS"... Along with the source code volumes for LS-DOS. <G>

Now, if you mean MS-DOS... YES. LS-DOS (later licensed as TRSDOS 6)
had features that barely made it into MS-DOS v2. I distinctly recall
laughing in surprised shock when Dr. Dobbs boasted about MS-DOS having
"user installable device drivers" (and what later came to be called
TSRs); LS-DOS 5! had those features at least a year earlier, on an
8-bit OS. It also had a "compiled" JCL, ability to not only redirect
I/O but link I/O -- letting one use a serial port as in/out while still
using the keyboard/display for the /same/ channels, ability for a Job
Log -- tracking commands and errors messages, and a 7-level password
protection scheme (originally every file had two passwords, user and
owner, and the user password was linked to a privilege level --even
knowing the user password might only give you execute-only access to a
file. When updated to handle dates outside 1980-1987, the user password
was dropped -- effectively given all files a "blank" user password. The
owner password was needed to assign a user privilege level to the file)
[no fear of the secretary deleting the word processor executable if it
were set to execute-only].



Jul 18 '05 #47

P: n/a
Dennis Lee Bieber:
I'd swear that the BASIC I learned on -- via a timeshare
Honeywell-Bull using a dial-up from an ASR-33 -- back in 1972 was
interpreted...
Well, my background is all microcomputer BASICs in the 80s, including
some (very little) HP BASIC for lab equipment, so I'm not a good
judge. Were I to implement a compiled BASIC I would have done
the "compile on <ret>" you suggested. One of the links described the
Dartmouth BASIC commands and they would be easy to implement
that way.
Python's compatibility doesn't /feel/ like a different language;
through all the changes in Python, a Python 1.3 program /looks/ like a
Python 2.2+ program.
In playing around with some of the new Python 2.3 features, I noticed
myself using a more functional style, like

d = {"A": 1, "B": 9", ...} # all uppercase keys
d.update(dict([(k.lower(), v) for k, v in d.items()]))

instead of the more readable and faster

for k, v in d.items():
d[k.lower] = v

It's easier in Python 2.2+ to make Python code which doesn't
look like Python 1.3. Which I think is the version I started with. ;)

When I got QuickBasic in 1988 or so, it would run just about
all of my old GW-BASIC programs, which was distributed some
5 years previous. But it's true, the GW-BASIC program wouldn't
look like a QB program.

Thinking some more ... the 1.3 code would have string
exceptions, so that's one age indicator. It would use the
"while 1 / readline / if not line / break" idiom instead of the
more modern 'for line in file'. It didn't have module support,
IIRC, and only the regex module for regular expressions.
Ahh, and no string methods. That's very noticeable when I
look at pre-2.0 code.
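The two idioms side by side, sketched against a file-like object:

```python
from io import StringIO

def count_lines_old(f):
    # The pre-2.2 idiom described above: while 1 / readline / break.
    n = 0
    while 1:
        line = f.readline()
        if not line:
            break
        n += 1
    return n

def count_lines_new(f):
    # The modern idiom: file (and file-like) objects are iterable.
    n = 0
    for line in f:
        n += 1
    return n

text = "a\nb\nc\n"
print(count_lines_old(StringIO(text)), count_lines_new(StringIO(text)))  # 3 3
```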

Okay, so the two main things that would make a 1.3-style
code stick out is using string.* and raising string exceptions.
In other words, I mostly agree with you, but couldn't let
you get a complete bye :)
But between early 70s BASIC and what passes for BASIC today looks like
totally different languages.
I stopped about 10 years ago. I do recall reading an article by
a VB book author saying VB_(N+1) was too different from VB(N)
and the differences were driven by marketing and not for good
programming reasons.

There's also the TrueBASIC folks. I believe they started in the
mid-80s and argue their BASIC is essentially the same.
quick glance... Let a non-programmer look at source files from K&K
type BASIC, Visual BASIC, F-IV, F77, and F90... and he likely will be
able to identify the three Fortrans as being related -- but would not
consider K&K to be a relation of VB.


Interesting test. Nasty idea: get that same person to judge if
Lisp and Scheme are closely related then post the results on c.l.lisp.

Andrew
da***@dalkescientific.com
Jul 18 '05 #48

P: n/a
"Daniel Dittmar" <da************@sap.com> wrote previously:
|But a lot of Python code depends on reference counting or more exactly it
|depends on the timely call of the destructor. So even if a much better GC is
|added to Python, reference counting would perhaps be kept for backwards
|compatibility (see Python's biggest compromises)

Did this thread get caught in a time warp, and did posts from two years
ago get posted again? Exactly this all happened years ago.

--
mertz@ | The specter of free information is haunting the `Net! All the
gnosis | powers of IP- and crypto-tyranny have entered into an unholy
..cx | alliance...ideas have nothing to lose but their chains. Unite
| against "intellectual property" and anti-privacy regimes!
-------------------------------------------------------------------------
Jul 18 '05 #49

