The Industry choice

From a technical point of view, I could not understand the reasoning
behind using Java in major companies. Sure, Python is used in some,
but Java is still considered a sure-job language.

After being a Python programmer for a long time, I consider it painful to
learn/use Java now (well, like many, I will be forced to do that in my
job).

What makes such companies choose Java over dynamic, productive
languages like Python? Are there any viable technical reasons for
that?

Jul 18 '05
cl****@lairds.us (Cameron Laird) writes:
That is, while I have a LOT of respect for Paul's programming
and judgment, and question myself when I'm on the side opposite
him, I ultimately value type declarations in languages such as
Java as more cost than benefit.


I don't find static type declarations to have much cost. It's just a
few more keystrokes. I'm open to persuasion about whether they have
benefit.

I do believe that it's a horrible deficiency in Python that it has no
declarations at all, even optional ones, like "perl -w" or "use
strict". Python's scoping hacks that result from the lack of
declarations just seem to me like pure insanity.

I was pretty skeptical of Java's checked exceptions when I first used
them but have been coming around about them. There's just been too
many times when I wrote something in Python that crashed because some
lower-level function raised an exception that the upper level hadn't
been expecting, after the program had been in use for a while. I'd
sure rather find out about that at compile time.
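A contrived sketch of what I mean (names made up): the caller only guards
against the exception it expected, and the surprise KeyError from the
helper doesn't surface until the bad input shows up at runtime.

def lookup_price(catalog, sku):
    # Lower-level helper: raises KeyError for an unknown SKU.
    return catalog[sku]

def report(catalog, sku):
    # Caller only anticipated bad numeric data, not a missing key, so
    # nothing warns us about the KeyError until it happens in production.
    try:
        return "price: %.2f" % lookup_price(catalog, sku)
    except ValueError:
        return "bad price data"

print report({"A1": 9.99}, "B2")   # dies with an uncaught KeyError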
Jul 18 '05 #51
Paul Rubin wrote:
I don't find static type declarations to have much cost. It's just a
few more keystrokes. I'm open to persuasion about whether they have
benefit.


Overall I agree with you and would like to have OPTIONAL static type
declarations in Python, as has often been discussed. But without
facilities for generic programming, such as templates in C++, static
type declarations can force one to duplicate a LOT of code, with one
sorting routine each for integer, floats, strings, etc. Some algorithms
are type invariant, and Python is a concise language for expressing
those algorithms.
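For instance (a toy sketch), one routine covers what would otherwise need a
separate overload per element type:

def sort_copy(items):
    # One definition serves ints, floats, strings -- anything comparable.
    result = list(items)
    result.sort()
    return result

print sort_copy([3, 1, 2])           # [1, 2, 3]
print sort_copy([2.5, 0.1])          # [0.1, 2.5]
print sort_copy(["pear", "apple"])   # ['apple', 'pear']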

Jul 18 '05 #52
On Sat, 01 Jan 2005 16:08:07 GMT, cl****@lairds.us (Cameron
Laird) wrote:
I argue that it's a false opposition to categorize projects in
terms of use of single languages. Many projects are MUCH better
off with a mix


In practice I have *never* worked on an industrial scale project
that only used one language. The nearest I came was a small
protocol convertor that only used C, SQL and some shell and
awk - but that's still 4 languages! And the whole project was
only 40,000 lines of code in about 20 files.

And most projects use many more, I'd guess around 5-8 on an
"average project" of around 300-500kloc. The biggest project I
worked on had about 3.5Mloc and used:

Assembler (680x0 and Sparc),
C
C++
Lisp(Flavors)
awk
Bourne shell
C shell - this was a mistake discovered too late to "fix"
PL/SQL
???? - A UI description language for a tool called TeleUse...
Pascal - No, I don't know why...
ASN.1 - with a commercial compiler

We also had some IDL but since it was tool generated I'll ignore
it...

We also had an experimental version running on a NeXt box so it
used Objective C for the UI instead of ???? and C++...

A total of 13 languages... with 5 geographically dispersed teams
comprising a total of 200 developers (plus about 40 testers).
Interesting times...in the Chinese sense!

Alan G
Author of the Learn to Program website
http://www.freenetpages.co.uk/hp/alan.gauld
Jul 18 '05 #53
In article <7x************@ruckus.brouhaha.com>,
Paul Rubin <http://ph****@NOSPAM.invalid> wrote:

I was pretty skeptical of Java's checked exceptions when I first used
them but have been coming around about them. There's just been too
many times when I wrote something in Python that crashed because some
lower-level function raised an exception that the upper level hadn't
been expecting, after the program had been in use for a while. I'd
sure rather find out about that at compile time.


That's funny -- Bruce Eckel talks about how he used to love checked
exceptions but has come to regard them as the horror that they are.
I've learned to just write "throws Exception" at the declaration of
every method.
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

"19. A language that doesn't affect the way you think about programming,
is not worth knowing." --Alan Perlis
Jul 18 '05 #54
be*******@aol.com writes:
Overall I agree with you and would like to have OPTIONAL static type
declarations in Python, as has often been discussed. But without
facilities for generic programming, such as templates in C++, static
type declarations can force one to duplicate a LOT of code, with one
sorting routine each for integer, floats, strings, etc.


I don't see that big a problem. The current Python sorting routine
operates on instances of class "object" and calls the __cmp__ method
to do comparisons. Every class of sortable objects either defines a
__cmp__ method or inherits one from some superclass, and sort calls
those methods. Static type declarations would not require writing any
additional sorting routines.
Jul 18 '05 #55
Paul Rubin wrote:
I don't see that big a problem. The current Python sorting routine
operates on instances of class "object" and calls the __cmp__ method
to do comparisons. Every class of sortable objects either defines a
__cmp__ method or inherits one from some superclass, and sort calls
those methods. Static type declarations would not require writing any
additional sorting routines.


Python's list.sort doesn't check the *type* of the arguments at all. It only
looks for the relevant comparison methods (__cmp__ or __lt__, as I recall).

Sure, classes written in *Python* will ultimately inherit from either object or
types.ClassType, but extension classes need not do any such thing.

Yet list.sort works with them all, anyway.
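A rough sketch (made-up class) of the duck typing involved -- all list.sort
asks of an element is that comparison works, here via __lt__:

class Version:
    # Deliberately not derived from object; list.sort doesn't care.
    def __init__(self, major, minor):
        self.major, self.minor = major, minor
    def __lt__(self, other):
        return (self.major, self.minor) < (other.major, other.minor)
    def __repr__(self):
        return "Version(%d, %d)" % (self.major, self.minor)

versions = [Version(2, 4), Version(1, 5), Version(2, 0)]
versions.sort()
print versions   # [Version(1, 5), Version(2, 0), Version(2, 4)]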

Cheers,
Nick.

--
Nick Coghlan | nc******@email.com | Brisbane, Australia
---------------------------------------------------------------
http://boredomandlaziness.skystorm.net
Jul 18 '05 #56
Quoth Paul Rubin <http://ph****@NOSPAM.invalid>:
| be*******@aol.com writes:
|> Overall I agree with you and would like to have OPTIONAL static type
|> declarations in Python, as has often been discussed. But without
|> facilities for generic programming, such as templates in C++, static
|> type declarations can force one to duplicate a LOT of code, with one
|> sorting routine each for integer, floats, strings, etc.
|
| I don't see that big a problem. The current Python sorting routine
| operates on instances of class "object" and calls the __cmp__ method
| to do comparisons. Every class of sortable objects either defines a
| __cmp__ method or inherits one from some superclass, and sort calls
| those methods. Static type declarations would not require writing any
| additional sorting routines.

Yes, it would be really weird if Python went that way, and the
sort of idle speculations we were reading recently from Guido
sure sounded like he knows better. But it's not like there aren't
some interesting issues farther on downstream there, in the compare
function. cmp(), and str() and so forth, play a really big role in
Python's dynamically typed polymorphism. It seems to me they are
kind of at odds with static type analysis, especially if you want
type inference -- kind of a type laundering system, where you can't
tell what was supposed to be there by looking at the code. Some
alternatives would be needed, I suppose.

Donn Cave, do**@drizzle.com
Jul 18 '05 #57
"Donn Cave" <do**@drizzle.com> writes:
Yes, it would be really weird if Python went that way, and the
sort of idle speculations we were reading recently from Guido
sure sounded like he knows better. But it's not like there aren't
some interesting issues farther on downstream there, in the compare
function. cmp(), and str() and so forth, play a really big role in
Python's dynamically typed polymorphism. It seems to me they are
kind of at odds with static type analysis


I don't understand that. If I see "str x = str(3)", then I know that
x is a string.
Jul 18 '05 #58
Cameron Laird wrote:

Let me add a cautionary note, though: Big Companies,
including Oracle, Software AG, IBM, Cisco, and so on, have
adopted Tcl over and over. All of them still rely on Tcl
for crucial products. All of them also have employees who
sincerely wonder, "Tcl? Isn't that dead?"

I offer this as a counter-example to the belief that adoption
by a heavyweight necessarily results in widespread acceptance.
--

I think the adoption of computer languages is quite complex, but one useful metaphorical model may be gravity, e.g. the clumpy universe of stars, with gravity working on different scales to shape the overall distribution of matter. Adoption by a heavyweight may have some effect if that force is allowed to operate on other bodies, but the overall distribution of "mass" is complex.

In the practice of business, companies generally find a need to consciously limit methodological diversity as they grow in size. Control is usually made more centralized, but becomes more distant from the atom (programmer writing code) as the firm grows large, and entropy becomes the enemy, lower level entropy a source of uncertainty and risk. If so, there is some legitimate reason for trying to "standardize" on tools (i.e. programming languages).

Less sophisticated business minds may latch onto the notion of gains from economies of scale, which is usually an easy sell (and a good route for a fast career rise) but an overly simple optimization.

Not to say that such restrictive mindsets and policies are inevitable, but they seem the prevailing wind.

Preserving intellectual diversity and innovation may be critical to a big company in the long run, and allowing the use of (for example) Python would seem very in tune with those goals.

It might be nice if it was widely understood (in IT) that Python was a language any competent programmer could pick up in an afternoon, such that Java, C, and Perl shops would not be concerned about the need for their staff to learn a new language.

Eric Pederson
"Linear? What is linear?"


Digital M4 (music) http://www.songzilla.blogspot.com
:::::::::::::::::::::::::::::::::::
domainNot="@something.com"
domainIs=domainNot.replace("s","z")
ePrefix="".join([chr(ord(x)+1) for x in "do"])
mailMeAt=ePrefix+domainIs
:::::::::::::::::::::::::::::::::::

Jul 18 '05 #59
Paul Rubin <http://ph****@NOSPAM.invalid> writes:

[...]
I don't understand that. If I see "str x = str(3)", then I know
that x is a string.


def foo(x):
    return str(x)

str = foo(x)

And now, let's say that foo()'s definition is in another module.
It is hard for a programmer to quickly determine the type of str;
that's the problem with programming in languages that don't have
type declarations.
Jul 18 '05 #60
Paul Rubin wrote:
I do believe that it's a horrible deficiency in Python that it has no
declarations at all, even optional ones, like "perl -w" or "use
strict". Python's scoping hacks that result from the lack of
declarations just seem to me like pure insanity.


Yes, ignoring most of the debate about static vs. dynamic typing, I've
also longed for 'use strict'. Sure Python isn't as bad as (say) Awk in
this respect; you have to at least assign a variable to make it spring
into existence, but I've been bitten by typos there as well. Same when
it comes to object methods (I often can't remember my method names).

Pychecker helps to some extent, but I wouldn't mind a compiler that only
accepted identifiers that had been declared. I don't think that anyone
could argue that typing 'use apa' before the first actual use (or words
to that effect) would 'slow them down', or be very onerous.

Stefan,
--
Stefan Axelsson (email at http://www.cs.chalmers.se/~sax)
Jul 18 '05 #61
"Donn Cave" <do**@drizzle.com> writes:

[...]
For me, the effect is striking. I pound out a little program,
couple hundred lines maybe, and think "hm, guess that's it" and save
it to disk. Run the compiler, it says "no, that's not it - look
at line 49, where this expression has type string but context
requires list string." OK, fix that, iterate.


I believe program-specific unit tests are more effective than compiler
typechecking :)
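For example (a made-up sketch), a one-case test flags the same kind of
mistake a type checker would, plus plenty it wouldn't:

import unittest

def tokenize(line):
    # Bug: returns the whole string instead of a list of words.
    return line                     # should be line.split()

class TokenizeTest(unittest.TestCase):
    def test_returns_list_of_words(self):
        self.assertEqual(tokenize("a b"), ["a", "b"])

if __name__ == '__main__':
    unittest.main()                 # the test fails and points at tokenize()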
Jul 18 '05 #62
> It might be nice if it was widely understood (in IT) that Python was a
> language any competent programmer could pick up in an afternoon


I am a programmer who works for a firm of engineers, where they program
in VBA, badly. I've often mentioned Python, whereupon I'm usually
dismissed as a crank. One of them expressed concern that if they used
Python and I left, then nobody would understand what to do. I could have
countered that Python is actually quite an easy language to pick up, but
what's the point.

We might be doing a project which involves web-type stuff. I pointed out
that if they did, they wouldn't be able to use VB/VBA, and may need to
use something like Python. I didn't get a reaction from that at the
time, but no doubt they'll be telling me that I'll have to make Excel
work through the internet, or something.
Jul 18 '05 #63
Paul Rubin <http://ph****@NOSPAM.invalid> writes:
Peter Dembinski <pd***@illx.org> writes:
If it has to be both reliable and secure, I suggest you use a more
redundant language such as Ada 95.


That's something to think about and it's come up in discussions,
but probably complicates stuff since it's not currently available
on the target platform. Also, the people on the project have
significant Java and Python experience but haven't used Ada.
Do you think it has real advantages over Java?


As I wrote before, it is a more redundant language[1], plus (AFAIR)
it has the strongest type checking of all the programming languages
I know about.

Plus, most Ada compilers (such as GNAT) generate machine/operating-
system-specific code[2], not bytecode, which could be an advantage
if performance is one of the priorities.

I may have a slightly skewed viewpoint because Ada 95 is the language
I recently studied in my RTS labs :>
[1] for example, one has to define interface and implementation parts
of each module in separate files

[2] AFAIR gnat generates C code, which is then compiled with gcc.
Jul 18 '05 #64
Mark Carter <mc*********@yahoo.co.uk> writes:
We might be doing a project which involves web-type stuff. I pointed
out that if they did, they wouldn't be able to use VB/VBA, and may
need to use something like Python.


They'll probably use vb.net.
Jul 18 '05 #65
Stefan Axelsson <cr******@hotmail.com> wrote:
Yes, ignoring most of the debate about static vs. dynamic typing, I've
also longed for 'use strict'.
You can use __slots__ to get the effect you're after. Well, sort of; it
only works for instance variables, not locals. And the gurus will argue
that __slots__ wasn't intended for that, so you shouldn't do it.
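Something like this (with the caveat above that __slots__ wasn't meant for it):

class Point(object):
    __slots__ = ('x', 'y')          # only these attribute names are allowed
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
p.x = 10    # fine
p.z = 3     # AttributeError: 'Point' object has no attribute 'z'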
Sure Python isn't as bad as (say) Awk in this respect; you have to at
least assign a variable to make it spring into existence
I think you've hit the nail on the head. In awk (and perl, and most
shells, and IIRC, FORTRAN), using an undefined variable silently gets
you a default value (empty string or zero). This tends to propagate
errors and make them very difficult to track down.

In Python, you raise NameError or AttributeError, so you find out about
your mistake quickly, and you know exactly where it is. The only time
you can really go wrong is when you've got multiple assignment
statements with the same lhs and you make a typo in one of them. And
even then, as you say, things like Pychecker will probably catch the
mistake.

In perl, I always use "use strict", but in Python, I just don't feel the
need. Between the exception mechanism and unit tests, the odds of a
typo going unnoticed for very long are pretty slim. I'll admit I don't
use Pychecker, but if I was doing production code, I would probably use
it as part of my QA process.
I don't think that anyone could argue that typing 'use apa' before
the first actual use (or words to that effect) would 'slow them
down', or be very onerous.


Well, I'll have to (respectfully) disagree with you on that. It's not
that there's no value in explicit declarations, it's just that (IMHO)
the cost exceeds the value, given the other tools we have in Python to
catch the mistake.
Jul 18 '05 #66
Paul Rubin wrote:
Steve Holden <st***@holdenweb.com> writes:
It seems to me
that IDLE and a lot of the rest of Python are examples of someone
having a cool idea and writing a demo, then releasing it with a lot of
missing components and rough edges, without realizing that it can't
reasonably be called complete without a lot more work.


^Python^open source^

I wouldn't say so. I'd say the Linux kernel, GCC, Emacs, Apache,
Mozilla, etc. are all developed with a much more serious attitude than
Python is. Of course there are lots of other FOSS programs that
someone wrote for their own use and released, that are less polished
than Python, but that are also the subject of less advocacy than Python.


Well clearly there's a spectrum. However, I have previously written that
the number of open source projects that appear to get stuck somewhere
between release 0.1 and release 0.9 is amazingly large, and does imply
some dissipation of effort.

Given that there's no overall coordination this is of course inevitable,
but some open source projects are doomed from the start to be incomplete
because the original authors have never been involved in producing
software with a reasonably large user base, and so their production
goals and quite often their original specifications (where there are
any) are unrealistic.

These projects meander towards a half-assed initial implementation and
then become moribund.

This is not to tar respectable projects like Linux, many (but not all)
of the Gnu projects, and Python with that same brush, and personally I
think the Python *core* is pretty solid and quite well-documented, but I
don;t regard IDLE as part of the core myself. Since I'm not an active
developer, this may not be in line with python-dev's opinions on the matter.

regards
Steve
--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119
Jul 18 '05 #67
Aahz wrote:
In article <7x************@ruckus.brouhaha.com>,
Paul Rubin <http://ph****@NOSPAM.invalid> wrote:
I was pretty skeptical of Java's checked exceptions when I first used
them but have been coming around about them. There's just been too
many times when I wrote something in Python that crashed because some
lower-level function raised an exception that the upper level hadn't
been expecting, after the program had been in use for a while. I'd
sure rather find out about that at compile time.

That's funny -- Bruce Eckel talks about how he used to love checked
exceptions but has come to regard them as the horror that they are.
I've learned to just write "throws Exception" at the declaration of
every method.


Pretty sloppy, though, no? And surely the important thing is to have a
broad handler, not a broad specification of raisable exceptions?

regards
Steve
--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119
Jul 18 '05 #68
Mark Carter wrote:
> It might be nice if it was widely understood (in IT) that Python was a
> language any competent programmer could pick up in an afternoon


I am a programmer who works for a firm of engineers, where they program
in VBA, badly. I've often mentioned Python, whereupon I'm usually
dismissed as a crank. One of them expressed concern that if they used
Python and I left, then nobody would understand what to do. I could have
countered that Python is actually quite an easy language to pick up, but
what's the point.

We might be doing a project which involves web-type stuff. I pointed out
that if they did, they wouldn't be able to use VB/VBA, and may need to
use something like Python. I didn't get a reaction from that at the
time, but no doubt they'll be telling me that I'll have to make Excel
work through the internet, or something.


They'll probably just move to .NET, which allows them to write .aspx
pages using VB.

regards
Steve
--
Steve Holden http://www.holdenweb.com/
Python Web Programming http://pydish.holdenweb.com/
Holden Web LLC +1 703 861 4237 +1 800 494 3119
Jul 18 '05 #69
Roy Smith wrote:
In perl, I always use "use strict", but in Python, I just don't feel the
need. Between the exception mechanism and unit tests, the odds of a
typo going unnoticed for very long are pretty slim. I'll admit I don't
use Pychecker, but if I was doing production code, I would probably use
it as part of my QA process.


Well, I don't have any experience with Python in the industrial setting
(all my Python has been solo so far). I do have quite a bit of
experience with Erlang (http://www.erlang.org) though, and while I agree
that it's not quite as bad in practice as the most vocal static typing
people would have it, it's not all roses either. The problem with unit
tests is that they can be skipped (and frequently are) and you also have
to be certain you exercise all code paths, even to detect a simple typo.

It's not that these survive for 'very long' or (God forbid) to the final
product, but many of them survive for long enough that they cost more
than they should/would have. So *if* (substantial 'if' I realise that)
my Erlang experience generalises to this case, I'd say the benefits
would outweigh the cost.

Then again I'm seriously considering going back to Haskell, so I guess
I'm at least a little biased. :-) :-)

Stefan,
--
Stefan Axelsson (email at http://www.cs.chalmers.se/~sax)
Jul 18 '05 #70
In article <41**********@127.0.0.1>, Donn Cave <do**@drizzle.com> wrote:

I can only believe that if you think the benefit of static typing is
psychological, either something is very different between the way you
and I write programs, or you're not doing it right.

For me, the effect is striking. I pound out a little program, couple
hundred lines maybe, and think "hm, guess that's it" and save it to
disk. Run the compiler, it says "no, that's not it - look at line 49,
where this expression has type string but context requires list string."
OK, fix that, iterate. Most of this goes about as fast as I can edit,
sometimes longer, but it's always about structural flaws in my program,
that got there usually because I changed my mind about something in
midstream, or maybe I just mistyped something or forgot what I was doing.
Then, when the compiler is happy -- the program works. Not always, but
so much more often than when I write them in Python.


That's just not true for me. Take my recent Java experience (please!).
I spent much effort trying to resolve stupid type dependencies that made
no sense. Python's duck-typing just works -- if it looks like you should
be able to use an object for a particular operation, you probably can.
Python programs that I write mostly just work; instead of pounding out
two hundred lines of code straight, I keep adding stubs and filling them
in, testing operation as I go. This isn't even unit-testing -- I haven't
drunk that Kool-Aid yet.

This is easy because running a Python program is faster than invoking the
Java compiler -- and you still haven't tested the actual operation of
your Java program.
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

"19. A language that doesn't affect the way you think about programming,
is not worth knowing." --Alan Perlis
Jul 18 '05 #71
In article <xuTBd.66280$Jk5.42292@lakeread01>,
Steve Holden <st***@holdenweb.com> wrote:
Aahz wrote:
In article <7x************@ruckus.brouhaha.com>,
Paul Rubin <http://ph****@NOSPAM.invalid> wrote:

I was pretty skeptical of Java's checked exceptions when I first used
them but have been coming around about them. There's just been too
many times when I wrote something in Python that crashed because some
lower-level function raised an exception that the upper level hadn't
been expecting, after the program had been in use for a while. I'd
sure rather find out about that at compile time.


That's funny -- Bruce Eckel talks about how he used to love checked
exceptions but has come to regard them as the horror that they are.
I've learned to just write "throws Exception" at the declaration of
every method.


Pretty sloppy, though, no? And surely the important thing is to have a
broad handler, not a broad specification of raisable exceptions?


Yes, it's sloppy, but I Don't Care. I'm trying to write usable code
while learning a damnably under-documented Java library -- and I'm *not*
a Java programmer in the first place, so I'm also fighting with the Java
environment. Eventually I'll add in some better code.
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

"19. A language that doesn't affect the way you think about programming,
is not worth knowing." --Alan Perlis
Jul 18 '05 #72
On Sat, 01 Jan 2005 13:28:16 -0600, "Rob Emmons"
<rm******@member.fsf.org> wrote:
For managers of companies it's worse: the company makes
VERY substantial investments into any technology it "marries",
and that means big losses if it goes. Long-term stability
of this technology in terms of "we're not going to be left out
in cold alone with this technology to feed it" means a lot
to them. Even a poor technology with external backing
of big, stable vendor is better than the excellent technology
without ......
There is the stability issue you mention... but also probably the fear
issue. If you choose a solution from a major company -- then it fails for
some reason or they drop the product -- it's their fault -- you've got an
automatic fall guy.


True. I have a bit of interest in economics, so I've seen e.g.
this example - why is it that foreign branches of companies
tend to cluster themselves in one city or country (e.g.
China right now)? According to standard economics it should
not happen - what's the point of getting into this overpriced
city if elsewhere in this country you can find just as good
conditions for business.

The point is obviously "cover your ass" attitude of managers:
if this investment fails, this manager can defend himself
"but everybody invested in that particular place, too, so
you see, at the time it was not a bad decision, we could
not predict... yadda yadda".


--
It's a man's life in a Python Programming Association.
Jul 18 '05 #73
On Sat, 01 Jan 2005 15:08:01 -0500, Steve Holden <st***@holdenweb.com>
wrote:
There is the stability issue you mention... but also probably the fear
issue. If you choose a solution from a major company -- then it fails for
some reason or they drop the product -- it's their fault -- you've got an
automatic fall guy. On the other hand, an open source solution or
otherwise less accepted solution ... it will probably be consider
your fault by the organization. It's a rational decision to avoid
personal risk when you don't get much reward for choosing something
different.
You are ignoring the fact that with the open source solution you do at
least have the option of hiring bright programmers to support the
framework which has now become moribund,
Theoretically. Because even though the source code is available
and free (like in beer as well as in speech) the work of
programmers isn't cheap.

This "free software" (not so much OSS) notion "but you can
hire programmers to fix it" doesn't really happen in practice,
at least not frequently: because this company/guy remains
ALONE with this technology, the costs are unacceptable.

It's a case of the "marginal cost" (the cost of making yet another copy)
becoming equal to the cost of the whole project: that is extraordinarily
expensive software. If this software gets sold or copied by
the millions, the marginal cost goes down to zero, as is
the case with Linux.

Imagine NOT being a technology company (say, Sun or
IBM or Borland) and trying to hire programmers to fix
the kernel of this operating system for you.
whereas when a company goes
bust there's no guarantee the software IP will ever be extricated from
the resulting mess.
There is a good _chance_ here: money. Somebody has poured a lot
of money into this thing. It's not going to get dropped bc of that.
So I'm not sure I'd agree with "rational" there, though "comprehensible"
might be harder to argue with.
It depends on the definition of "rational", on the definition of your
or your company's goals, and on the definitions of the situations that
form the context.

Avoidance of blame is way too large a motivator in large organizations,
and it leads to many forms of sub-optimal decision making.


This might be of interest to some people:

http://www.pkarchive.org/new/DefectiveInvestors.html

--
It's a man's life in a Python Programming Association.
Jul 18 '05 #74
On Fri, 31 Dec 2004 21:08:02 GMT, cl****@lairds.us (Cameron Laird)
wrote:
Let me add a cautionary note, though: Big Companies,
including Oracle, Software AG, IBM, Cisco, and so on, have
adopted Tcl over and over. All of them still rely on Tcl
for crucial products. All of them also have employees who
sincerely wonder, "Tcl? Isn't that dead?"

I offer this as a counter-example to the belief that adoption
by a heavyweight necessarily results in widespread acceptance.


It's a quiet adoption. It's not a firework show a la Java
combined with blowing all those truckloads of money on
it (I've read this good comment in one of the IT periodicals
that in future Java will be cited as an example of work of
genius - not a genius of computing, though, but of marketing).

There is a rational element in this craziness: people
around watch and see that $$$ has been sunk in
this, so they know that whoever sank all those $$$
has a very, very strong motivation not to let this
thing wither away. No guarantee, of course, but a
much better chance they just won't drop it.

The problem with Python is not that it's not used, but
that no company like IBM has decided to blow
$1 bln on it or something. Using it alone, and even announcing
it, is not enough. "Put your money where your mouth
is", or so the thinking goes.


--
It's a man's life in a Python Programming Association.
Jul 18 '05 #75
cl****@lairds.us (Cameron Laird) wrote:
Let me add a cautionary note, though: Big Companies,
including Oracle, Software AG, IBM, Cisco, and so on, have
adopted Tcl over and over. All of them still rely on Tcl
for crucial products. All of them also have employees who
sincerely wonder, "Tcl? Isn't that dead?"


A lot of people laugh at Tcl, but it's really a very useful tool. In my
last job, we did a major component of our product (SNMP-based network
management package) in Tcl, probably on the order of 10 kloc. It has
its limitations, but it's very easy to learn, very easy to embed and
extend, and reasonably fast. It certainly blows away shell scripting.

Around here, AOL/Moviephone has been trolling for years for Tcl people;
I guess that counts as a big company.
Jul 18 '05 #77
Roy Smith wrote:
I think you've hit the nail on the head. In awk (and perl, and most
shells, and IIRC, FORTRAN), using an undefined variable silently gets
you a default value (empty string or zero). This tends to propagate
errors and make them very difficult to track down.


You may recall correctly, but Fortran compilers have improved. The
following Fortran 90 program

integer, parameter :: n = 1
real :: x,y=2.0,z(n)
print*,"dog"
print*,x
z(n+1) = 1.0
print*,z
end

has 3 errors, all detected at compile time by the Lahey/Fujitsu Fortran
95 compiler, with the proper options:

2004-I: "xundef.f", line 2: 'y' is set but never used.
2005-W: "xundef.f", line 4: 'x' is used but never set.
2153-W: "xundef.f", line 5, column 1: Subscript out of range.

At run time, the output is

dog
The variable (x) has an undefined value.
Error occurs at or near line 4 of _MAIN__

Running Python 2.4 on the Python analog,

n = 1
y = 2.0
z = range(n)
print "dog"
print x
z[n] = 1.0
print z

one error is caught:

dog
Traceback (most recent call last):
File "xundef.py", line 5, in ?
print x
NameError: name 'x' is not defined

You will see the out-of-bounds error for z only after fixing the
undefined-x error. No warning is ever given about y, which is set but
never used. In practice, 'print "dog"' could be some operation taking
hours. Can PyChecker find all the problems in a single run, without
executing 'print "dog"'? If so, it would be great if it were integrated
with the CPython interpreter.

One reason interpreted languages like Python are recommended to
beginners is to avoid the edit/compile/debug cycle. But I think it is
faster and less frustrating to have many errors caught in one shot.

Jul 18 '05 #78
In article <cr**********@panix3.panix.com>, aa**@pythoncraft.com (Aahz)
wrote:
In article <xuTBd.66280$Jk5.42292@lakeread01>,
Steve Holden <st***@holdenweb.com> wrote:
Aahz wrote:
In article <7x************@ruckus.brouhaha.com>,
Paul Rubin <http://ph****@NOSPAM.invalid> wrote:

I was pretty skeptical of Java's checked exceptions when I first used
them but have been coming around about them. There's just been too
many times when I wrote something in Python that crashed because some
lower-level function raised an exception that the upper level hadn't
been expecting, after the program had been in use for a while. I'd
sure rather find out about that at compile time.

That's funny -- Bruce Eckel talks about how he used to love checked
exceptions but has come to regard them as the horror that they are.
I've learned to just write "throws Exception" at the declaration of
every method.


Pretty sloppy, though, no? And surely the important thing is to have a
broad handler, not a broad specification of raisable exceptions?


Yes, it's sloppy, but I Don't Care. I'm trying to write usable code
while learning a damnably under-documented Java library -- and I'm *not*
a Java programmer in the first place, so I'm also fighting with the Java
environment. Eventually I'll add in some better code.


The whole point of exceptions is that they get propagated automatically.
If I'm not going to catch it, why do I have to even know it exists? I
don't consider "throws Exception" to be sloppy, I consider it to be
programmers voting with their feet.
Jul 18 '05 #79
Cameron Laird wrote:
In article <41***********************@ptn-nntp-reader03.plus.net>,
Mark Carter <mc*********@yahoo.co.uk> wrote:
[tale of *very* typical experience with non-software engineers]

Don't start me! Dammit, too late ...

I've noticed that they have an overwhelming obsession with GUIs, too.
They design wizards for everything. Damn pretty they are, too. Albeit a
bit flakey. They seem to conflate pretty interfaces with good interfaces
and good software.

I used to joke that since our software wasn't particularly magical, it
didn't need wizards. But I think I just ended up sounding bitter.

We once had a bit of software that we thought we'd like to turn into a
generic application. The focus on improvements was, predictably enough,
that we should design a GUI that could do anything a client would be
likely to want to do. It was my opinion, though, having seen the very
"special-cases" nature required in the original software, that it was
almost impossible to predict exactly how a customer might want the
product tailored. I suggested that what they really needed was a library
(Python would have been good for this, Lisp might have been even better)
that could be extended as required. GUIs second, functionality first.
But hey, what would I know. Fortunately, the whole thing's been put on
the back burner.

And trying to get through to them why source control makes sense, that
when more than one person works on a project, some form of coordination
is required, that copying and pasting code is evil, and that Excel
probably isn't the hammer for every nail.

Honestly, I thought (real) engineers were supposed to be clever.
Jul 18 '05 #80
Quoth Paul Rubin <http://ph****@NOSPAM.invalid>:
| "Donn Cave" <do**@drizzle.com> writes:
|> Yes, it would be really weird if Python went that way, and the
|> sort of idle speculations we were reading recently from Guido
|> sure sounded like he knows better. But it's not like there aren't
|> some interesting issues farther on downstream there, in the compare
|> function. cmp(), and str() and so forth, play a really big role in
|> Python's dynamically typed polymorphism. It seems to me they are
|> kind of at odds with static type analysis
|
| I don't understand that. If I see "str x = str(3)", then I know that
| x is a string.

Sure, but the dynamically typed polymorphism in that function is
about its parameters, not its result. If you see str(x), you can't
infer the type of x. Of course you don't need to, in Python style
programming this is the whole point, and even in say Haskell there
will be a similar effect where most everything derives the Show
typeclass. But this kind of polymorphism is pervasive enough in
Python's primitive functions that it's an issue for static type
analysis, it seems to me, especially of the type inference kind.
cmp() is more of a real issue than str(), outside of the type
inference question. Is (None < 0) a valid expression, for example?
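(It is, in the CPython 2.x of today, and the cross-type ordering it falls
back on is fairly arbitrary -- a quick interactive check:)

print None < 0            # True  -- None sorts before everything else
print cmp("3", 3)         # 1     -- mixed types ordered by type, not value
print str(3), str(None)   # 3 None -- str() accepts anything at all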

Donn Cave, do**@drizzle.com
Jul 18 '05 #81
Peter Dembinski <pd***@illx.org> writes:

[...]
str = foo(x)


(ick!) it should be:

bar = foo(x)
Jul 18 '05 #82
Peter Dembinski <pd***@illx.org> writes:
Peter Dembinski <pd***@illx.org> writes:

[...]
str = foo(x)


(ick!) it should be:

bar = foo(x)


Besides, shouldn't str be a reserved word or something?
Jul 18 '05 #83
Bulba! <bu***@bulba.com> writes:

[...]
The point is obviously "cover your ass" attitude of managers:


Managers get paid for taking risk :)
Jul 18 '05 #84

"Peter Dembinski" <pd***@illx.org> wrote in message
news:87************@hector.domek...
Besides, shouldn't str be a reserved word or something?


It is a name in the builtins module which is automatically searched after
globals. Many experienced Pythoneers strongly advise against rebinding
builtin names *unless* one is intentionally wrapping or overriding the
builtin object. The latter are sometimes valid expert uses of masking
builtins. Newbies are regularly warned on this list against making a habit
of casual use of list, dict, int, str, etc.

None has been reserved because there is no known good use for overriding
it. True and False will be reserved someday. There have been proposals to
turn on reserved status for all builtins on a per-module basis.
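A quick sketch of the lookup order involved (Python 2.x spelling, where the
module is __builtin__):

import __builtin__

str = "oops"                 # module global now shadows the builtin
# str(3) here would fail: TypeError, 'str' object is not callable
print __builtin__.str(3)     # the builtin itself is still reachable: 3
del str                      # drop the shadow; lookup falls back to builtins
print str(3)                 # works again: 3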

Terry J. Reedy

Jul 18 '05 #85
Roy Smith <ro*@panix.com> wrote:
Stefan Axelsson <cr******@hotmail.com> wrote:
Yes, ignoring most of the debate about static vs. dynamic typing, I've
also longed for 'use strict'.


You can use __slots__ to get the effect you're after. Well, sort of; it
only works for instance variables, not locals. And the gurus will argue
that __slots__ wasn't intended for that, so you shouldn't do it.


There's a simple, excellent recipe by Michele Simionato, on both the
online and forthcoming 2nd edition printed Cookbook, showing how to do
that the right way -- with __setattr__ -- rather than with __slots__ .
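The rough idea (my sketch here, not Michele's actual recipe): freeze the set
of attribute names after __init__ by overriding __setattr__.

class Strict(object):
    def __init__(self, x, y):
        self.x = x
        self.y = y
        self.__dict__['_frozen'] = True   # bypass __setattr__ for the flag

    def __setattr__(self, name, value):
        # After construction, only already-existing attribute names may be set.
        if getattr(self, '_frozen', False) and not hasattr(self, name):
            raise AttributeError("no such attribute: %r" % name)
        object.__setattr__(self, name, value)

s = Strict(1, 2)
s.x = 10    # fine: existing attribute
s.z = 3     # AttributeError: no such attribute: 'z'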
Alex
Jul 18 '05 #86

<be*******@aol.com> wrote in message
news:11**********************@c13g2000cwb.googlegroups.com...
2004-I: "xundef.f", line 2: 'y' is set but never used.
2005-W: "xundef.f", line 4: 'x' is used but never set.
2153-W: "xundef.f", line 5, column 1: Subscript out of range.


None of these are syntax errors. The first two of these would be caught by
lint or pychecker (I am presuming).
One reason interpreted languages like Python are recommended to
beginners is to avoid the edit/compile/debug cycle. But I think it is
faster and less frustrating to have many errors caught in one shot.


True syntax errors often result in such a cascade of bogus errors that it
may often be best to fix the first reported error and then recompile. Of
course, compilers vary in their recovery efforts.

Terry J. Reedy

Jul 18 '05 #87

"Bulba!" <bu***@bulba.com> wrote in message
news:qc********************************@4ax.com...
On Sat, 01 Jan 2005 15:08:01 -0500, Steve Holden wrote:
whereas when a company goes
bust there's no guarantee the software IP will ever be extricated from
the resulting mess.
There is a good _chance_ here: money. Somebody has poured a lot
of money into this thing. It's not going to get dropped bc of that.

From what I have read, the amount of proprietary code which *did* get
effectively shredded after the dot-com bust is enough to make one cry.
There were a few companies that would buy code at bankruptcy sales for
maybe 1% of its development cost, but even then, with the original
programmers long gone, it could be hard to make anything from it.

Terry J. Reedy

Jul 18 '05 #88
"Terry Reedy" <tj*****@udel.edu> wrote:
None has been reserved because there is no known good use for overriding
it.
Should I infer from the above that there's a known bad use?
True and False will be reserved someday.


I remember a lisp I used many years ago. If you tried to rebind nil,
you got an error message -- in latin!
Jul 18 '05 #89

"Steve Holden" <st***@holdenweb.com> wrote in message
news:_rTBd.66275$Jk5.46@lakeread01...
Well clearly there's a spectrum. However, I have previously written that
the number of open source projects that appear to get stuck somewhere
between release 0.1 and release 0.9 is amazingly large, and does imply
some dissipation of effort.


And how do the failure and effort dissipation rates of open source code
compare to those of closed source code? Of course, we have only anecdotal
evidence that the latter is also 'amazingly large'. And, to be fair, the
latter should include the one-programmer proprietary projects that
correspond to the one-programmer open projects.

Also, what is 'amazing' to one depends on one's expectations ;-). It is
known, for instance, that some large fraction of visible retail businesses
fail within a year. And that natural selection is based on the fact that
failure is normal.

Terry J. Reedy

Jul 18 '05 #90
Bulba! <bu***@bulba.com> wrote:
True. I have a bit of interest in economics, so I've seen e.g.
this example - why is it that foreign branches of companies
tend to cluster themselves in one city or country (e.g.
It's not just _foreign_ companies -- regional clustering of all kinds of
business activities is a much more widespread phenomenon. Although I'm
not sure he was the first to research the subject, Tjalling Koopmans, as
part of his lifework on normative economics for which he won the Nobel
Prize 30 years ago, published a crucial essay on the subject about 50
years ago (sorry, can't recall the exact date!) focusing on
_indivisibilities_, leading for example to transportation costs, and to
increasing returns with increasing scale. Today, Paul Krugman is
probably the best-known name in this specific field (he's also a
well-known popularizer and polemist, but his specifically-scientific
work in economics has mostly remained in this field).
China right now)? According to standard economics it should
not happen - what's the point of getting into this overpriced
city if elsewhere in this country you can find just as good
conditions for business.


Because you can't. "Standard" economics, in the sense of what you might
have studied in college 25 years ago if that was your major, is quite
able to account for that if you treat spatial variables as exogenous to
the model; Krugman's breakthroughs (and most following work, from what I
can tell -- but economics is just a hobby for me, so I hardly have time
to keep up with the literature, sigh!) have to do with making them
endogenous.

Exogenous is fine if you're looking at the decision a single firm, the
N+1-th to set up shop in (say) a city, faces, given decisions already
taken by other N firms in the same sector.

The firm's production processes have inputs and outputs, coming from
other firms and (generally, with the exception of the last "layer" of
retailers etc) going to other firms. Say that the main potential buyers
for your firm's products are firms X, Y and Z, whose locations all
"happen to be" (that's the "exogenous" part) in the Q quarter of town.
So, all your competitors have their locations in or near Q, too. Where
are you going to set up your location? Rents are higher in Q than
somewhere out in the boondocks -- but being in Q has obvious advantages:
your salespeople will be very well-placed to shuttle between X, Y, Z and
your offices, often with your designers along so they can impress the
buyers or get their specs for competitive bidding, etc, etc. At some
points, the competition for rents in quarter Q will start driving some
experimenters elsewhere, but they may not necessarily thrive in those
other locations. If, whatever industry you're in, you can strongly
benefit from working closely with customers, then quarter Q will be
where many firms making the same products end up (supply-side
clustering).

Now consider a new company W set up to compete with X, Y and Z. Where
will THEY set up shop? Quarter Q has the strong advantage of offering
many experienced suppliers nearby -- and in many industries there are
benefits in working closely with suppliers, too (even just to easily
have them compete hard for your business...). So, there are easily
appreciated exogenous models to explain demand-side clustering, too.

That's how you end up with a Hollywood, a Silicon Valley, a Milan (for
high-quality fashion and industrial design), even, say, on a lesser
scale, a Valenza Po or an Arezzo for jewelry. Ancient European cities
offer a zillion examples, with streets and quarters named after the
trades or professions that were most clustered there -- of course, there
are many other auxiliary factors related to the fact that people often
_like_ to associate with others of the same trade (according to Adam
Smith, generally to plot some damage to the general public;-), but
supply-side and demand-side, at least for a simpler exogenous model, are
plenty.

Say that it's the 18th century (after the corporations' power to stop
"foreign" competition from nearby towns had basically waned), you're a
hat-maker from Firenze, and for whatever reason you need to move
yourself and your business to Bologna. If all the best hat-makers'
workshops and shops are clustered around Piazza dell'Orologio, where are
YOU going to set up shop? Rents in that piazza are high, BUT - that's
where people who want to buy new hats will come strolling to look at the
displays, compare prices, and generally shop. That's close to where
felt-makers are, since they sell to other hat-makers. Should your
business soon flourish, so you'll need to hire a worker, that's where
you can soon meet all the local workers, relaxing with a glass of wine
at the local osteria after work, and start getting acquainted with
everybody, etc, etc...

Risk avoidance is quite a secondary issue here (except if you introduce
in your model an aspect of imperfect-information, in which case,
following on the decisions made by locals who may be presumed to have
better information than you is an excellent strategy). Nor is there any
"agency problem" (managers acting for their interests and against the
interest of owners), not a _hint_ of it, in fact -- the hatmaker acting
on his own behalf is perfectly rational and obviously has no agency
problem!).

So, I believe that introducing agency problems to explain clustering is
quite redundant and distracting from what is an interesting sub-field of
(quite-standard, by now) economics.

There are quite a few other sub-fields of economics where agency
problems, and specifically the ones connected with risk avoidance, have
far stronger explicatory power. So, I disagree with your choice of
example.
Alex
Jul 18 '05 #91
Bulba! <bu***@bulba.com> writes:
This "free software" (not so much OSS) notion "but you can
hire programmers to fix it" doesn't really happen in practice,
at least not frequently: because this company/guy remains
ALONE with this technology, the costs are unacceptable.


Yes, but fixing python software - even sloppily written python
software - is pretty easy. I regularly see contracts to add a feature
to or fix a bug in some bit of OSS Python.

<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
Jul 18 '05 #92
Roy Smith <ro*@panix.com> writes:
Around here, AOL/Moviephone has been trolling for years for Tcl people;
I guess that counts as a big company.


The AOL web server also uses tcl as a built-in dynamic content
generation language (i.e. sort of like mod_python), or at least it
used to.
Jul 18 '05 #93
Roy Smith wrote:
"Terry Reedy" <tj*****@udel.edu> wrote:
None has been reserved because there is no known good use for overriding
it.


Should I infer from the above that there's a known bad use?


Yes: making None equal to the integer 3. That's one of
six known bad uses.... it's possible there are more. ;-)

-Peter
Jul 18 '05 #94
Paul Rubin schreef:
The AOL web server also uses tcl as a built-in dynamic content
generation language (i.e. sort of like mod_python), or at least it
used to.


It still does:
"""
AOLserver is America Online's Open-Source web server. AOLserver is the
backbone of the largest and busiest production environments in the world.
AOLserver is a multithreaded, Tcl-enabled web server used for large scale,
dynamic web sites.
"""

<http://www.aolserver.com/>

--
JanC

"Be strict when sending and tolerant when receiving."
RFC 1958 - Architectural Principles of the Internet - section 3.9
Jul 18 '05 #95
Peter Hansen <pe***@engcorp.com> writes:
Roy Smith wrote:
"Terry Reedy" <tj*****@udel.edu> wrote:
None has been reserved because there is no known good use for
overriding it.

Should I infer from the above that there's a known bad use?


Yes: making None equal to the integer 3. That's one of
six known bad uses.... it's possible there are more. ;-)


Binding user variables to these names should raise exception
(eg. AreYouInsaneException or WhatAreYouDoingException) :>
Jul 18 '05 #96
