Bytes IT Community

Python Productivity Gain?

In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?

From my own personal experience: I have been programming with Python for
about 6 months (but I have been programming in other languages for over 10
years), and I have noticed that the more I have gotten used to programming
in Python, the more my programming speed has increased. But ... this is true
of any language you program in, as long as you are learning the
methodologies and concepts of the programming language. Your thoughts?

Kevin
Jul 18 '05 #1
38 Replies


kbass <kb***@midsouth.rr.com> wrote:
In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?

From my own personal experience: I have been programming with Python for
about 6 months (but I have been programming in other languages for over 10
years), and I have noticed that the more I have gotten used to programming
in Python, the more my programming speed has increased. But ... this is true
of any language you program in, as long as you are learning the
methodologies and concepts of the programming language. Your thoughts?


It used to be that Python programs were shorter, faster, more readable,
more writable, and simply better. But this was during the days when most
programmers had a Unix background. Nowadays, most programmers are
coming from a Windows background, and Python programs have become as
verbose and unreadable as Visual Basic or Perl.

Ruby has not been corrupted as such. It makes complicated things less
complicated. But it still makes simple things not as simple as Python.

--
William Park, Open Geometry Consulting, <op**********@yahoo.ca>
Linux solution for data management and processing.
Jul 18 '05 #2


"William Park" <op**********@yahoo.ca> wrote in message
news:c0*************@ID-99293.news.uni-berlin.de...
kbass <kb***@midsouth.rr.com> wrote:
In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?

From my own personal experience: I have been programming with Python for
about 6 months (but I have been programming in other languages for over 10
years), and I have noticed that the more I have gotten used to programming
in Python, the more my programming speed has increased. But ... this is true
of any language you program in, as long as you are learning the
methodologies and concepts of the programming language. Your thoughts?


It used to be that Python programs were shorter, faster, more readable,
more writable, and simply better. But this was during the days when most
programmers had a Unix background. Nowadays, most programmers are
coming from a Windows background, and Python programs have become as
verbose and unreadable as Visual Basic or Perl.

Ruby has not been corrupted as such. It makes complicated things less
complicated. But it still makes simple things not as simple as Python.

--
William Park, Open Geometry Consulting, <op**********@yahoo.ca>
Linux solution for data management and processing.


IMHO, the overall productivity gain of any programming language comes from
the programmers who are programming in the language. Productivity gain
comes from individuals and is not language-based.

I thought that Python didn't allow for sloppy programming techniques, due to
its lines of code being indented throughout programs. Have Windows
programmers found a way past this? Are you saying that some programmers with
a Windows programming background have poor programming practices when
compared to their Unix-background counterparts?

Kevin
Jul 18 '05 #3

> In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases.


I have no data on this and can't say with respect to Java. I can give
my personal feeling, and this is more in comparison to C.

I'd look at productivity in a few ways:

* How many lines of code does it take? More lines of code really do
mean longer to program and more chance of errors.
* How hard is it to write code? To be specific, Java and C have all
these brackets which have to be tracked down... this does take time...
though perhaps a good IDE handles that for you.
* Complexity of the errors possible. For example, in C (but maybe not so
much in Java), uninitialized variables and memory leaks, etc., are big
problems, often very hard to find. You often have to use complex
debugging tools to test this sort of code and look for memory corruption.

My experience when writing an application is that all of these things
are favorable for Python. You generate code fast, the syntax is such
that you don't have to track down missing components like brackets, and
when you write the code there are usually very few runtime errors --
except for logic errors, which are usually obvious.
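To illustrate that last point, here is a small sketch of my own (the function and values are invented for illustration, not taken from the poster's code): where C might silently read garbage from an uninitialized variable, Python stops at the first use with an exception naming the variable.

```python
# Hypothetical example: an accumulator that is never initialized.
def total_price(prices):
    for price in prices:
        subtotal = subtotal + price  # bug: subtotal was never bound
    return subtotal

# Instead of corrupting memory, Python raises immediately, with a
# traceback pointing at the faulty line.
try:
    total_price([1.50, 2.25])
except UnboundLocalError as exc:
    print("caught:", exc)
```

The traceback pinpoints the error without any memory debugger, which is exactly the error-complexity difference described above.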

In terms of free/open environments, Python is also one of the best.

I do agree with some of the other posters -- actual productivity will
depend on skill and on the power of the tools you use to build the app. C
and Java may have better commercial development tools, and programmers may
be better at using them.

All in all, though, I'm a Python guy for many reasons.

Rob

P.S. Sorry if you got more than one post--I'm having trouble with my
news client.
Jul 18 '05 #4

I think the main gain is from the lack of the compilation/linking
process.

I have C++/Java programming friends (I've dabbled too), and the biggest
hindrance to getting an app up and running is the "change code ->
compile+link (read the newspaper) -> test, doesn't work, change code
-> compile+link (make some coffee) -> test" cycle.

Being able to save+run is great with scripting languages. This is why
a lot of C++/Qt programmers I know are switching to Python+PyQt or
Qt+QSA: in these 3+ GHz days, we don't really need the extra speed
from compilation, but saving a few hours of an expensive programmer's
time is useful.

Other niceties are on-the-fly debugging/errors - Python and Perl have
this, PHP is useless at debugging, and C/C++ just core dumps....

A prime example was the other day at work. I wrote a console utility,
then wrapped a Tkinter GUI around it, which took about 10 minutes to
do. I then decided to add a feature which needed a widget that
Tkinter doesn't have, so I converted it to PyQt. I then found that a
colleague wanted a copy for Windows, but we didn't have a commercial
license, so I converted it to wxPython. All of this took under an hour,
and I didn't even use Qt Designer/wxDesigner for my forms.

My C++ programming friend said that would have taken him most of a
morning to do, and then he'd have to build a separate copy for each of
Solaris, Windows, and Linux.
Jul 18 '05 #5

> In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?

From my own personal experience: I have been programming with Python for
about 6 months (but I have been programming in other languages for over 10
years), and I have noticed that the more I have gotten used to programming
in Python, the more my programming speed has increased. But ... this is true
of any language you program in, as long as you are learning the
methodologies and concepts of the programming language. Your thoughts?


Before I give my opinions, I should probably give some background. The
first semester of my freshman year of college, I (and many others) were
taught Scheme. After 'mastering' Scheme, we moved on to C and C++ the
following spring. Within weeks of learning C (the semester started
~February 1; the competition was March 19), I competed in a local
programming competition to decide who was going to participate in the
following fall's (1999) regional ACM programming competition. My
partner and I placed second in the local competition, ahead of various
sophomores and juniors, but didn't do quite so well in the regional ACM
competition.

A year later, in the early spring of 2000 (sophomore year), I found
Python. In a week, I rewrote every programming assignment I had during
the previous year and a half of undergraduate CS, from Scheme or C/C++
to Python.

Not only had I learned Python, and translated programs to Python, but I
was able to write new software that I hadn't even considered before.
Multithreading, sockets, interesting data structures, databases... the
world was my oyster. Two years later, in the spring of 2002, I was
finishing the 4th rewrite of an as-yet-unreleased (the quality was
shit; I'm still looking for time to redo it) parallel processing
library, similar to MPI. The 4th rewrite took a total of a week and a
half, for 4500 lines of Python. Functionally, every feature of MPI was
included, written in ~10 days. I had known Python for 2 years.

Fast forward to now: I've known Python for 4 years. The only project
I've written in C since is a password cracker (modified l0phtcrack)
that I had contemplated making into an independent project during
undergrad. Every other piece of code that I produce on a daily basis --
for teaching Introductory Algorithm Design and Analysis (this quarter is
my 5th as a TA for the course), for database classes, algorithmic
theory, etc., or any time I need some tool -- Python is what I build it with.

They say that when you have a hammer, everything looks like a nail.
When using Python, most everything /is/ a nail. Those things that are
not nails usually lie in the realm of different programming paradigms
(like logic programming with Prolog, etc.), or are very
processor-intensive and not suitable for an interpreted language. Psyco
works well as a first step to stave off "not fast enough", and I hear
that Pyrex is a great second step (I've had no need for Pyrex yet).

In any case, I don't believe that I would have been able to do or learn
nearly as much, had I not had Python.

- Josiah
Jul 18 '05 #6

"kbass" <kb***@midsouth.rr.com> wrote in message news:<La*****************@fe3.columbus.rr.com>...
In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?


There is a lot that can be said about why Python is more productive.
I'll make one point that has not yet been mentioned in this thread.

I like to say that Python has a very good impedance match with the
mind (mine at least :). Normally when you program, you:

1. Analyze the problem
2. Come up with a 'solution in mind'
3. Translate the 'solution in mind' into 'solution in code'

It's step 3 where Python shines. For a concrete example, consider looping
over a list of items. Apart from the looping variable and the list, you
need:

Java: An iterator
C: A counter
Perl: A lot of '$' signs :)
Python: Nothing. (for item in mylist:)

Notice how Python keeps it explicit yet minimal, and there's nothing
more in your mind than what is absolutely necessary ("fits your
brain"). As a real-life analogy, when I make a multi-egg omelette: for
each egg in bunch_of_eggs, I pan.put(egg.break()). If I had to think
of an iterator, or a counter, it would be lunchtime before I'd have
breakfast.

In other words, Python *is* how I think. This is what I mean by a good
impedance match.

Note that you can still use an iterator in Python (but only if and
when you need it).
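A small sketch of my own to make the comparison concrete (the list contents are invented):

```python
mylist = ["red", "green", "blue"]

# The minimal Python form: nothing but the loop variable and the list.
for item in mylist:
    print(item)

# The explicit iterator is still available when you actually need it,
# e.g. to consume the first element separately from the rest:
it = iter(mylist)
first = next(it)   # "red"
rest = list(it)    # ["green", "blue"]
```

The iterator object is there under the hood either way; Python just doesn't make you spell it out in the common case.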

For another example, consider nested lists in Perl and Python. In Perl
one struggles with the language, in Python one struggles with only the
problem.
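For instance (my own minimal sketch, not the poster's code), a nested list in Python is written and indexed with nothing but brackets:

```python
# A 2x3 nested list: no references, sigils, or dereferencing syntax.
matrix = [[1, 2, 3],
          [4, 5, 6]]

print(matrix[1][2])   # prints 6: row 1, column 2

# Inner lists are ordinary lists, so list methods just work:
matrix[0].append(99)
print(matrix[0])      # prints [1, 2, 3, 99]
```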

In other languages I've used (Java, C++, Perl), I always get
sidetracked into syntax issues ("oh I have to write it *this* way!"),
library issues ("oh I have to import this and that first!") and such,
all of which consume my brainpower leaving little for the problem at
hand. These little things add up and affect productivity tremendously.
Python has few of these and I've always found myself rushing to Python
as the first choice to implement any algorithm, solution, or idea that
I had. It is the shortest path to the program.

Some newcomers are so used to thinking in roundabout ways that they
write C or Java code in Python. I did too - using counters for loops,
creating way too many classes etc. But eventually everyone learns the
power of simplicity.

Naturally, reading a Python program and figuring out what it does is
fast too, greatly improving maintainability.

I'm writing an article on Python productivity, which I'll post here at
some point.

Cheers,
Shalabh
Jul 18 '05 #7

"kbass" <kb***@midsouth.rr.com> writes:
In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?


In every language community you will find people claiming that their
favorite language X is 5-10 times more productive than the alternatives.
Typically this is justified by personal anecdotes.

A scientific study in this regard is:

@misc{prechelt-empirical,
  author = "Lutz Prechelt",
  title  = "An empirical comparison of C, C++, Java, Perl, Python, Rexx,
            and Tcl for a search/string-processing program",
  url    = "citeseer.nj.nec.com/547865.html"
}

This study reports a programmer productivity gain of about 2 for all
the scripting languages over C, C++, and Java. It is a pity that more
attempts to gain /unbiased/ information on software productivity are
not undertaken.
Jul 18 '05 #8

On Sun, 15 Feb 2004 02:53:31 +0000, kbass wrote:
In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?


I think programming in Python is more productive since the
code is much easier to read. You can solve the same
problem with fewer characters in Perl, but after one year,
you will prefer the Python version.

http://pleac.sf.net has some examples in several languages.
thomas

Jul 18 '05 #9

William Park wrote:

It used to be that Python programs were shorter, faster, more readable,
more writable, and simply better. But this was during the days when most
programmers had a Unix background. Nowadays, most programmers are
coming from a Windows background, and Python programs have become as
verbose and unreadable as Visual Basic or Perl.

Ruby has not been corrupted as such. It makes complicated things less
complicated. But it still makes simple things not as simple as Python.


With all respect, William, this sounds to me like a big stinking pile
of rubbish. Do you have *any* factual basis for the claims you are
making?

In my opinion, the Unix background of a programmer would, if anything,
tend to increase the likelihood of the code being unreadable (Perl, after
all, comes from that neck of the woods), but I really think it much
more likely that the OS background has little to do with it. Python
itself is what is different, and what produces more readable code,
and the Python community encourages and directs that trend.

As for "most programmers are coming from Windows background" I suspect
that too is very debatable.

-Peter
Jul 18 '05 #10

Matthias wrote:

This study reports a programmer productivity gain of about 2 for all
the scripting languages over C, C++, and Java. It is a pity that more
attempts to gain /unbiased/ information on software productivity are
not undertaken.


I agree. Unfortunately, it appears the cost of doing that is just too
high. Several times I've wanted to do a study in my company to measure
the claimed productivity improvements of Python (and of other things), but
in the end, without funding, it's not likely to happen or to be credible.

-Peter
Jul 18 '05 #11

"kbass" <kb***@midsouth.rr.com> writes:
In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?

From my own personal experience: I have been programming with Python for
about 6 months (but I have been programming in other languages for over 10
years), and I have noticed that the more I have gotten used to programming
in Python, the more my programming speed has increased. But ... this is true
of any language you program in, as long as you are learning the
methodologies and concepts of the programming language. Your thoughts?

Kevin


My experience is with in-house code, so I can't show examples, but I
can give impressions. We had Java code which did XML, matrix
manipulation, and a Lisp s-expression reader/writer. We converted that
to Python. It took roughly 1/3 less code (measured with wc -l), the
code was easier to read (the consensus in code reviews), and it has proven
more maintainable (roughly 1/2 the flow time for similar enhancements).

The algorithms didn't change, so this wasn't just a matter of learning
the problem space better ("write one to throw away"). My impression
of the productivity improvements:

a) No compilation step. If you program in a "change one feature and
then run unit tests" style, the edit-run cycle is critical. Python was
several seconds (> 15, < 60) faster per cycle. We may have had an
exceptionally slow Java setup, but I've seen similar effects on other
platforms.

b) Clarity of thought. We could see the forest and not just the
trees. This led to refactoring, which simplified maintenance.

As for learning the language vs. knowing others: we've now had a dozen
comp-sci people learn Python. Each had considerable experience with
other languages. After a couple of weeks, each person says things
like "Wow, this is amazing." In code reviews they learn better
idioms, but the basic impression of productivity is there from about 2
weeks after they start.

My biggest problem is with people who are not willing to refactor to
clean up working code. Without that step, some of the potential
improvement from Python is lost. I've looked at their non-Python code
and see that this is apparently a personal programming trait. People who
write clean code in Python also do so in VB, Java, COBOL, etc. I
think Python makes refactoring easy enough that people who care about
clean code are very impressed with the language.

--
ha************@boeing.com
6-6M21 BCA CompArch Design Engineering
Phone: (425) 342-0007
Jul 18 '05 #12

Peter Hansen <pe***@engcorp.com> writes:
Matthias wrote:

This study reports a programmer productivity gain of about 2 for all
the scripting languages over C, C++, and Java. It is a pity that more
attempts to gain /unbiased/ information on software productivity are
not undertaken.


I agree. Unfortunately, it appears, the cost of doing that is just too
high. Several times I've wanted to do a study in my company to measure
the claimed productivity improvements of Python (and of other things) but
in the end without funding it's not likely to happen, or to be credible.


I can see that it's hard to get company funding for such research.
But then: software is a multi-billion-dollar business, and languages are a
really fundamental tool for it (if you get a 5% increase in
productivity by adding/removing a feature X from language Y, this can
be huge savings overall). Yet nobody seems to be bothered that the
evolution of computer languages goes like this: somebody has a cute idea,
builds a language around it, tries to hype it, maybe attracts
followers, maybe creates a market which then attracts more followers.
It's all trial and error, like medicine in the Middle Ages.

A more scientific approach would be: take a language X, build variants
X-with-OOP, X-with-static-typing, X-with-funny-syntax, and let
developers use them under controlled settings. Watch them. Generate
bug statistics. Look for differences. Try to explain them. This
would be hard work, difficult to do, and expensive. But I expect this
approach would find better [1] languages faster. The benefits might
be substantial.

Matthias

---
[1] At least better w.r.t. certain application domains and certain
types of developers.
Jul 18 '05 #13

Matthias <no@spam.pls> writes:

[snip]
I can see that it's hard to get company funding for such research.
But then: software is a multi-billion-dollar business, and languages are a
really fundamental tool for it (if you get a 5% increase in
productivity by adding/removing a feature X from language Y, this can
be huge savings overall). Yet nobody seems to be bothered that the
evolution of computer languages goes like this: somebody has a cute idea,
builds a language around it, tries to hype it, maybe attracts
followers, maybe creates a market which then attracts more followers.
It's all trial and error, like medicine in the Middle Ages.

A more scientific approach would be: take a language X, build variants
X-with-OOP, X-with-static-typing, X-with-funny-syntax, and let
developers use them under controlled settings. Watch them. Generate
bug statistics. Look for differences. Try to explain them. This
would be hard work, difficult to do, and expensive. But I expect this
approach would find better [1] languages faster. The benefits might
be substantial.

Matthias

---
[1] At least better w.r.t. certain application domains and certain
types of developers.


The scientific approach requires usefully discriminatory hypotheses.
Language design is so artful that arbitrarily building languages as
you describe does not meet that criterion.

Here is another way to look at it.

Normally a science passes through phases:

a) Natural History. Wander around, get the lay of the land,
collect specimens, and try to organize what you find into mnemonically
effective schemes.

b) Field Research. Pose a hypothesis, isolate a piece of the field as
best you can, and apply your experimental factors and controls.
Observe results and interpret with a large grain of salt.

c) Lab Research. Set up isolated environments with significant
attention to eliminating non-experimental reasons for variation. Pose
the hypotheses. Observe results, and interpret with the recognition that
a lab may be a poor model for reality.

You are asking that we jump to lab research when the field barely
sustains field research. Mostly we are still in natural history and
anecdotes.

Of course, even in the natural history phase, pioneers and advance
scouts are capable of detecting an easier pass through the mountains
of complexity. If 20 people from varied backgrounds, each of whom has
worked in several languages, tell me that Python is a really great
language, then I'll take that as a significant data point. Especially
if they are dumping their previously favorite languages (as varied as
COBOL, Perl, Java, C++, VB, Modula-3, Lisp, Prolog) to focus on
Python.
--
ha************@boeing.com
6-6M21 BCA CompArch Design Engineering
Phone: (425) 342-0007
Jul 18 '05 #14

Harry George wrote in a thought-provoking post:

Of course, even in the natural history phase, pioneers and advance
scouts are capable of detecting an easier pass through the mountains
of complexity. If 20 people from varied backgrounds, each of whom has
worked in several languages, tell me that Python is a really great
language, then I'll take that as a significant data point. Especially
if they are dumping their previously favorite languages (as varied as
COBOL, Perl, Java, C++, VB, Modula-3, Lisp, Prolog) to focus on
Python.


My background is (roughly in order) APL, FORTRAN, BASIC, Assembly, C,
university :-), Pascal, C++, Object Pascal, Java, LabVIEW, and Python
(with a dozen others I forget) and I'm telling you Python is a really
great language. I've also dumped my previously favourite languages
(to wit, BASIC, C, C++, Delphi, and Java) to focus on Python.

Now all you need are 19 others and we'll have a significant data point.
(Signifying what? That's what I want to know. ;-)

-Peter
Jul 18 '05 #15

Peter Hansen <pe***@engcorp.com> wrote in message news:<40***************@engcorp.com>...
As for "most programmers are coming from Windows background" I suspect
that too is very debatable.


Right. I don't know of any German university where programming (at least
in the computer science courses) is taught on Windows systems.
Jul 18 '05 #16

Peter Hansen <pe***@engcorp.com> wrote in message news:<40***************@engcorp.com>...
Matthias wrote:

This study reports a programmer productivity gain of about 2 for all
the scripting languages over C, C++, and Java. It is a pity that more
attempts to gain /unbiased/ information on software productivity are
not undertaken.


I agree. Unfortunately, it appears the cost of doing that is just too
high. Several times I've wanted to do a study in my company to measure
the claimed productivity improvements of Python (and of other things), but
in the end, without funding, it's not likely to happen or to be credible.


A good study of this would cost around 750,000 US$ and take at least
one year with 2 or 3 persons per team. It's not a problem for IBM or
Sun to pay for it.
But they live on selling consulting hours. Why should they be
interested in a study of how to increase programmer productivity?
Jul 18 '05 #17

"Peter Hansen" <pe***@engcorp.com> schreef in bericht
news:40***************@engcorp.com...
My background is (roughly in order) APL, FORTRAN, BASIC, Assembly, C,
university :-), Pascal, C++, Object Pascal, Java, LabVIEW, and Python
(with a dozen others I forget) and I'm telling you Python is a really
great language. I've also dumped my previously favourite languages
(to wit, BASIC, C, C++, Delphi, and Java) to focus on Python.

Now all you need are 19 others and we'll have a significant data point.
(Signifying what? That's what I want to know. ;-)

Fortran, Basic, Assembly (many times), Pascal, C, Objective-C, Object Pascal,
C++, Java, and Python.
Yes, Python beats the rest w.r.t. productivity for most of my applications :-)

Now do we need 18 more for a SIGNIFICANT data point?

regards Gerrit
--
www.extra.research.philips.com/natlab/sysarch/

Jul 18 '05 #18

Lothar Scholz wrote:

A good study of this would cost around 750,000 US$ and take at least
one year with 2 or 3 persons per team. It's not a problem for IBM or
Sun to pay for it.
But they live on selling consulting hours. Why should they be
interested in a study of how to increase programmer productivity?


They do much more than consult, as they sell/lease substantial amounts of
software and hardware, as well as working on many very large fixed-price
projects. For all of those, minimizing their own costs would be very
important to achieving a good margin and profitability.

Besides, they spend enormously more than US$750,000 each year on much
less important studies. Overall, they spend almost 7000 times that much
each year on R&D (according to their 2000 annual report), which means that
spending US$750,000 on a study like this would be very roughly as if you
or I were to spend $10.

For my own company, the study would be quite costly. For a much larger
company it's a drop in the bucket, and might well contribute to them
gaining a competitive advantage.

The likelihood is that they (IBM and others) already fund such studies
internally, but don't make the results public.

Possibly equally likely, however, is that any such studies are seriously
flawed and support the political position of whoever funds the project
or whoever leads it.

Which gets us right back to square one...

-Peter
Jul 18 '05 #19

kbass wrote:
In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?


The problem is always: how do you measure/judge this?

Are you going to get the SAME PROGRAMMERS to solve the same problem
twice? If so, the second language will have a big advantage. Are you
going to get different programmers? How do you know they have the same
skill?

Also: productivity for what? If your Java code is a little bit of glue
around some pre-existing EJBs, then it may have an advantage. If you are
using Python and Pyrex to wrap C code, then Python will certainly have
an advantage.

Most Python programmers are speaking about their personal productivity
gain, measured by "feel". It is possible to do a more formal study
(dozens of programmers given a variety of tasks), but it would be quite
expensive. What unbiased source is going to pay for it?

Paul Prescod

Jul 18 '05 #20

kbass wrote:
In different articles that I have read, people have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.).


If you're interested in funding a study, I'm sure you could get someone to do a
truer test. Short of that, the evidence is mostly anecdotal, because it's rare
"in the real world" to rewrite a non-trivial application in a different
language just for the heck of it. There are almost always changes in program
architecture or feature set, so it's tough to do an apples-to-apples
comparison.

That said,

http://tinyurl.com/39jsh

-Dave
Jul 18 '05 #21

William wrote:
kbass <kb***@midsouth.rr.com> wrote:
In different articles that I have read, persons have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?

From my own personal experience: I have been programming with Python for
about 6 months (but I have been programming in other languages for over 10
years) and I have noticed that the more I have gotten used to programming in
Python, the more my programming speed has increased. But ... this is true
of any language that you program in, as long as you are learning the
methodologies and concepts of the programming language. Your thoughts?
It used to be that Python programs were shorter, faster, readable,
writable, and simply better. But this was during the days when most
programmers had a Unix background. Nowadays, most programmers come
from a Windows background, and Python programs have become as
verbose and unreadable as Visual Basic or Perl.


I have a tough time taking this comment seriously - didja forget some smilies?
If not, you're basing this opinion on ______?

Is there any evidence that implies that for the same tasks the programs are now
longer, slower, less readable, less writeable, or worse? (or that any slip has
been caused by more Windows programmers?)
Ruby has not been corrupted as such. It makes complicated things less
complicated. But it still makes simple things not as simple as Python does.


???
Jul 18 '05 #22

Paul Prescod <pa**@prescod.net> writes:
Are you going to get the SAME PROGRAMMERS to solve the same problem
twice? If so, the second language will have a big advantage. Are you
going to get different programmers? How do you know they are the same
skill?


Maybe it doesn't matter. If you hire your programmers by running an
ad in the paper, and advertising "Python programmers wanted" gets you
better programmers than advertising "VB programmers wanted", maybe
that by itself is a good enough reason to do your project in Python,
irrespective of whether Python is objectively better than VB.
Jul 18 '05 #23

Peter Hansen <pe***@engcorp.com> wrote in message news:<40***************@engcorp.com>...
Harry George wrote in a thought-provoking post:

Of course, even in the natural history phase pioneers and advance
scouts are capable of detecting an easier pass through the mountains
of complexity. If 20 people from varied backgrounds, each of whom has
worked in several languages, tell me that Python is a really great
language, then I'll take that as a significant data point. Especially
if they are dumping their previously favorite languages (as varied as
COBOL, Perl, Java, C++, VB, Modula-3, Lisp, Prolog) to focus on
Python.


My background is (roughly in order) APL, FORTRAN, BASIC, Assembly, C,
university :-), Pascal, C++, Object Pascal, Java, LabVIEW, and Python
(with a dozen others I forget) and I'm telling you Python is a really
great language. I've also dumped my previously favourite languages
(to wit, BASIC, C, C++, Delphi, and Java) to focus on Python.

Now all you need are 19 others and we'll have a significant data point.
(Signifying what? That's what I want to know. ;-)

-Peter


Well here goes! I started in Prolog (coming from formal logic it was a
breeze), did Pascal, some Forth (not much), a little Smalltalk, C,
C++, Java, JavaScript, HyperTalk/SuperTalk (+ other Talks), VB, VBA,
AppleScript, then started doing Perl (for CGI), then read about doing this
stuff in Python instead! Ported my research work from Java to Python
and never looked back (but sure, you can always use another language :-)
Wow, never had so much fun since working with Cratfman on NeXT! Thanks
GvR!

Jean-Marc
ps That's 2, 18 to go!
Jul 18 '05 #24

"GerritM" <gm*****@worldonline.nl> wrote:
Fortran, Basic, Assembly many times, Pascal, C, Objective-C, Object Pascal,
C++, Java and Python.
Yes Python beats the rest wrt productivity for most of my applications :-)

Now do we need 18 more for a SIGNIFICANT data point?


Fortran, C, Lisp, Elan, Gfabasic, Pure C, Borland Pascal, Borland C,
Borland C++, Visual Basic, Delphi, Python

Python is the first language out of this list for me that I can forget
about while using it, so that I can concentrate on the algorithmic
aspects of the problem.

My next language will probably be one that suggests a better algorithm
after I type in some pseudo code. Maybe using some interface to
comp.lang.xxxxxx for the interpreter?

17 to go.

Anton
Jul 18 '05 #25

In article <30**************************@posting.google.com>,
<be*******@aol.com> wrote:
Jul 18 '05 #26

co************@attbi.com (Corey Coughlin) writes:
It is a difficult problem, but I don't think it's completely
insurmountable. Take some programmers right out of school, or just a
general population of people, give them training in language X for a
fixed period of time, set them up to perform some task, and see how
long it takes them. Sure, some of them will be better programmers
than others, but with a large enough sample population you should be
able to draw some conclusions on the average, if there is an effect to
be measured. And yes, the bigger the population, the better the
results, so it would be fairly expensive to conduct, but still, you
could draw conclusions. Getting funding would be tricky, though,
that's a given.


It is common for a ComSci prof or grad student to crank up such a
study, using undergrad and grad students as the subjects. These
subjects can generally be coerced to participate ("it is required for
the course"). For "novice programmer" research, high school students
are often used. These tend to be self-selected, and are not
representative of the general population.

So it is possible to set up such an experiment, and even to attend to
all the statistical niceties. The problem is that the experimental
model fails to match reality in other ways.

For example, real world teams have usually solved interpersonal
pecking orders and courting rituals before the coding starts. They
have domain knowledge beyond reading a (possibly fake) case study.
They have well-honed development environments, and may have existing
sets of unittests. Their requirements/directions are subject to major
changes in midstream.

These conditions are hard to duplicate in a short-term academic
setting. They cannot be solved by larger sample size. That's why I
suggest that "lab research" is not ready for prime time in this field.

Some researchers have gone out in the field to use working teams.
Others retrospectively examine past projects. These have the flavor
of "field research".



Paul Prescod <pa**@prescod.net> wrote in message news:<ma*************************************@python.org>...
kbass wrote:
In different articles that I have read, persons have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?


The problem is always: how do you measure/judge this?

Are you going to get the SAME PROGRAMMERS to solve the same problem
twice? If so, the second language will have a big advantage. Are you
going to get different programmers? How do you know they are the same skill?

Also: productivity for what? If your Java code is a little bit of glue
around some pre-existing EJBs then it may have an advantage. If you are
using Python and Pyrex to wrap C code, then Python will certainly have
an advantage.

Most Python programmers are speaking about their personal productivity
gain measured based on "feel". It is possible to do a more formal study
(dozens of programmers given a variety of tasks) but it would be quite
expensive. What unbiased source is going to pay for it?

Paul Prescod


--
ha************@boeing.com
6-6M21 BCA CompArch Design Engineering
Phone: (425) 342-0007
Jul 18 '05 #27

cl****@lairds.com (Cameron Laird) wrote in message news:<10************@corp.supernews.com>...
These are good points to raise.
Thanks for your informative reply.
Fortran's my first language. I have little opportunity nowadays to
exercise it, much as I'd like to do so. I'm certainly not as current
with it as you.
If you are willing to spend the time to learn it, a subset Fortran 95
language called F is free for Windows, Linux, and other platforms --
see http://www.fortran.com/F . A project to create a full Fortran 95
open-source compiler is well underway and may be completed this year.
When I read, "It's also clear from reading the declarations what the
function is returning ...", I take it that you have in mind such
distinctions as FLOAT vs. INT. Reasoning about types is a *frequent*
topic of discussion in comp.lang.python. I'll summarize my experience
this way: FLOAT vs. INT (and so on) takes little of my day-to-day
attention. I focus on unit tests and coding which is semantically
transparent in a more comprehensive way than just type-correctness.
Therefore, while I acknowledge the advantages you describe for Fortran,
I categorize them mostly as, "no big deal".


It's not just float vs. int. Below is a very simple illustration --
code to compute the standard deviation of a set of numbers, in Fortran
95 and Python. In the F95 code, it is clear that
(1) x(:) is a 1-d array of real's that will not be changed inside the
function (note the intent(in))
(2) the function returns a single value (F95 functions can return
arrays and structures, if they are declared as such).
(3) the function has no side-effects because it is declared PURE.

In the python code, all you know is that sd() takes one argument. It
could change that argument or some other global variable. It could
return a scalar that is real, integer, or something else. It could
return a list, a 1-D Numeric array, a 2-D Numeric array etc.

pure function sd(x) result(value)
   ! compute the sd of a vector
   real, intent(in) :: x(:)
   real    :: value
   integer :: n
   real    :: xmean
   n = size(x)
   value = 0.0
   if (n < 2) return
   xmean = sum(x)/n
   value = sqrt(sum((x-xmean)**2)/(n-1.0))
end function sd

from Numeric import size, sum, sqrt   # assumes the Numeric extension module

def sd(x):
    """ compute the sd of a vector """
    n = size(x)
    if n < 2:
        return -1
    xmean = sum(x)/n
    return sqrt(sum((x-xmean)**2)/(n-1.0))

In this case, and for other numerical work, I prefer Fortran 95 to
Python, even ignoring speed advantages and the advantage of an
executable over a script. F95 can look a lot like Python -- no curly
braces or semicolons, and a lack of C/C++ trickery in general.

Of course, Python and other scripting languages were not primarily
designed for numerical work. Numeric Python is powerful and elegant.

Another point. Python advocates often claim it is better than compiled
languages because the code is much shorter. They rebut worries about
the loss of safety by recommending unit testing. In a Fortran 95 or
C++ program, I don't think as many tests need to be written, because
the compiler catches many more things. If you include the amount of
testing code when counting the amount of code needed, I suspect
Python's brevity advantage will partly disappear. Also, I regard unit
tests to check what happens when a Python function is called with
invalid arguments (int instead of float, scalar instead of array) as
low-level, tedious work that I would rather delegate to a compiler.
Scripting advocates claim that their languages are higher level than
compiled languages, but in this case the reverse is true -- the
compiler does more work for you than the interpreter does.
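The kind of low-level argument-checking test meant here might look like the following (a sketch, using a plain-Python variant of the sd function above rather than the Numeric version):

```python
import unittest
from math import sqrt

def sd(x):
    """Sample standard deviation of a sequence (plain-Python variant of the above)."""
    n = len(x)
    if n < 2:
        return -1
    xmean = sum(x) / n
    return sqrt(sum((v - xmean) ** 2 for v in x) / (n - 1.0))

class TestSdArguments(unittest.TestCase):
    # The tedious checks a Fortran compiler gets for free from declarations:
    def test_scalar_argument_rejected(self):
        self.assertRaises(TypeError, sd, 1.0)   # a scalar has no len()

    def test_too_short_input(self):
        self.assertEqual(sd([5.0]), -1)         # n < 2 sentinel value

    def test_known_value(self):
        self.assertEqual(sd([1.0, 2.0, 3.0]), 1.0)

if __name__ == "__main__":
    unittest.main(argv=["sd_tests"], exit=False)
```

Whether writing such cases by hand is a reasonable price for Python's brevity is exactly the trade-off being debated here.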
Jul 18 '05 #28

beliavsky wrote:
Another point. Python advocates often claim it is better than compiled
languages because the code is much shorter. They rebut worries about
the loss of safety by recommending unit testing. In a Fortran 95 or
C++ program, I don't think as many tests need to be written, because
the compiler catches many more things. If you include the amount of
testing code when counting the amount of code needed, I suspect
Python's brevity advantage will partly disappear. Also, I regard unit
tests to check what happens when a Python function is called with
invalid arguments (int instead of float, scalar instead of array) as
low-level, tedious work that I would rather delegate to a compiler.
Scripting advocates claim that their languages are higher level than
compiled languages, but in this case the reverse is true -- the
compiler does more work for you than the interpreter does.


Interesting thread! One minor nit though: writing the type of low-level tests
you describe above is almost always a no-no for Python programs. Good tests
tend to be geared towards functionality, so that the number and type of tests
correlates more closely to the features and algorithmic complexity of the code
than it does to the language.

Those ultra tedious low-level tests are usually a waste of time because they
end up covering cases that either never happen in practice or cases that are
already covered by tests that cover real functionality. In fact, really the
only time where I've come across the need for those types of tests is in
_other_ languages (C++) because they are more common in languages that require
the programmer to manage more details, and because failing to test those
conditions results in a hard crash of the program (tests like "properly rejects
null pointers passed in" and "doesn't access beyond array bounds").

IOW, in the *worst* case you invest the same amount of time and effort testing
your Python application as you spend testing your C++ (or whatever)
application - I haven't come across ANY case in practice where you end up
investing more effort; in fact I'd say that in our projects we end up spending
considerably less effort because (1) we get to skip precisely the type of tests
you're talking about and (2) the setup/teardown/test code is much more concise
and reusable (which is why in the past we've used Python as the test language
for applications written in other languages).

In theory, yes, you could have a Python function that gets passed a scalar when
it was expecting an array, but in order for that to occur you almost always
have a gaping hole in your suite of higher-level _feature_ tests. With adequate
feature and system test coverage (which you'll need regardless of the
implementation language), the probability of encountering a scalar vs array
error is way less likely than e.g. a null pointer access bug.
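To make that concrete, here is a sketch (with a hypothetical average_order function, not from any real project): one ordinary feature test already exercises the argument types, so a separate scalar-vs-list test adds nothing:

```python
def average_order(orders):
    """Average price of a list of (item, price) orders (hypothetical example)."""
    total = sum(price for _, price in orders)
    return total / float(len(orders))

def test_average_of_known_orders():
    # A functionality test: it necessarily passes a real list of tuples,
    # so a scalar or otherwise malformed argument could never slip through
    # unnoticed anywhere this calling pattern is covered.
    orders = [("book", 10.0), ("pen", 2.0)]
    assert average_order(orders) == 6.0

test_average_of_known_orders()
```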

-Dave
Jul 18 '05 #29


"Harry George" <ha************@boeing.com> wrote in message
news:xq*************@cola2.ca.boeing.com...
It is common for a ComSci prof or grad student to crank up such a
study, using undergrad and grad students as the subjects. These
subjects can generally be coerced to participate ("it is required for
the course"). For "novice programmer" research, high school students
are often used. These tend to be self-selected, and are not
representative of the general population.


The key feature that makes a study/experiment statistically analyzable is
randomization of subjects to treatments. So, to compare two languages
(simplest case), you have everyone write programs in both languages, but
randomize the order (half one way, the other half the other way) or you
have one half do language A and the other half B, again randomizing the
assignment. Also, if there is any subjectivity in the evaluation of
results, then the judges should not know the language when judging the
output.

Do the studies you speak of meet these criteria?

Terry J. Reedy


Jul 18 '05 #30

Harry George <ha************@boeing.com> writes:
It is common for a ComSci prof or grad student to crank up such a
study, using undergrad and grad students as the subjects. These
subjects can generally be coerced to participate ("it is required for
the course"). For "novice programmer" research, high school students
are often used. These tend to be self-selected, and are not
representative of the general population.


Are these studies published? I see the limitations, but still would
be interested in the results. So far, I found only the papers of Lutz
Prechelt et al. on the web.

Matthias
Jul 18 '05 #31

Matthias <no@spam.pls> writes:
Harry George <ha************@boeing.com> writes:
It is common for a ComSci prof or grad student to crank up such a
study, using undergrad and grad students as the subjects. These
subjects can generally be coerced to participate ("it is required for
the course"). For "novice programmer" research, high school students
are often used. These tend to be self-selected, and are not
representative of the general population.


Are these studies published? I see the limitations, but still would
be interested in the results. So far, I found only the papers of Lutz
Prechelt et al. on the web.

Matthias


I was talking about "programmer productivity" studies in general, not
specifically Python. My point was that the experimental model is
convenient but not a very good representation of reality.

Google for "programmer productivity"

--
ha************@boeing.com
6-6M21 BCA CompArch Design Engineering
Phone: (425) 342-0007
Jul 18 '05 #32

"Terry Reedy" <tj*****@udel.edu> writes:
"Harry George" <ha************@boeing.com> wrote in message
news:xq*************@cola2.ca.boeing.com...
It is common for a ComSci prof or grad student to crank up such a
study, using undergrad and grad students as the subjects. These
subjects can generally be coerced to participate ("it is required for
the course"). For "novice programmer" research, high school students
are often used. These tend to be self-selected, and are not
representative of the general population.
The key feature that makes a study/experiment statistically analyzable is
randomization of subjects to treatments. So, to compare two languages
(simplest case), you have everyone write programs in both languages, but
randomize the order (half one way, the other half the other way) or you
have one half do language A and the other half B, again randomizing the
assigment. Also, if there is any subjectivity in the evaluation of
results, then the judges should not know the language when judging the
output.

Do the studies you speak of meet these criteria?

Terry J. Reedy


The key is to use a valid experimental model in the first place. All
the randomizing, DOE, double blind, ANOVA, discriminant analysis, factor
analysis, etc in the world doesn't help if the model is a poor
rendition of the intended target.

I'm not in this field myself, and only know of it from scanning the
literature on language selection some time ago. My impression was
that the academic papers using students as subjects did an honest job
of designing the experiment and probably got honest results for their
experimental population. And they usually discussed their concerns
that the model was not very representative of industry practice.



--
ha************@boeing.com
6-6M21 BCA CompArch Design Engineering
Phone: (425) 342-0007
Jul 18 '05 #33

Harry George <ha************@boeing.com> writes:
I was talking about "programmer productivity" studies in general, not
specifically Python. My point was that the experimental model is
convenient but not a very good representation of reality.
When people started to approach medicine scientifically I'm sure they
also were overwhelmed by the complexities of their endeavor at first.

I don't question that reality must be strongly simplified in order to
do experiments. But you always have to simplify in order to
comprehend, explain, communicate. The question is whether
simplifications are made explicitly and consciously or not.
Google for "programmer productivity"


I checked the first 30 out of the 16,000 pages found. It is about
what I expected: Mostly advertisement for products and "solutions".
Maybe one of the pages might qualify as an empirical study.

This industry is on a random walk... ;-))
Jul 18 '05 #34

"kbass" <kb***@midsouth.rr.com> wrote in message news:<La*****************@fe3.columbus.rr.com>...
In different articles that I have read, persons have constantly alluded to
the productivity gains of Python. One person stated that Python's
productivity gain was 5 to 10 times over Java in some cases. The
strange thing that I have noticed is that there were no examples of this
productivity gain (i.e., projects, programs, etc.). Can someone give me
some real-life examples of productivity gains using Python as opposed to
other programming languages?


I don't think a tenfold programmer productivity increase over Java
is typical, but there are certainly examples of significant
productivity gains in converting to Python from some other language.

See here for instance:
http://www.thinkware.se/cgi-bin/thinki.cgi/PythonQuotes

E.g.

"I was amazed by the amount [of] flexibility and self-awareness that
Python had. When a 20,000 line project went to approximately 3,000
lines overnight, and came out being more flexible and robust once it
had been completed, I realized I was on to something really good."

--Glyph Lefkowitz (Developer of the Twisted network server
framework)

This 6-7-fold improvement was in going from C++ I think.

"...However, it did provide a hard measurement on the benefits of
using Python instead of C++: the lines of Python code was 10% of the
equivalent C++ code. ... From a software engineering standpoint, this
was a tremendous success. Bug counts are always proportional to the
number of lines of code, meaning that the Python version should have
10% of the bugs of the C++ version. Further, the fewer lines of code
meant that it would have a smaller and more understandable "footprint"
in the developers' minds. The Python code was arguably more
maintainable due to its improved readability and rapid edit-test cycle
(no compile and link step). Lastly, the server could also be shown to
be more robust - being entirely in Python, it was not subject to
memory-related coding errors such as null pointers, buffer
misallocation and overruns, or unfreed or doubly-freed memory..."

--Greg Stein, eShop (which was later sold to Microsoft)

Here you have a 10-fold gain, going from C++.

Another aspect of any X-fold programmer productivity improvement is
that there is a lot more than just programming going on in a project.
If requirements capture, analysis, design, testing, documentation,
planning, etc. take the same amount of time, you will still not be
able to influence the total project cost much.

On the other hand, using a tool like Python doesn't just influence
the programmers, but the whole project! If prototyping and coding
in general becomes significantly faster, the trade off for how much
analysis and design you should do will change. Why spend weeks at
a conference table arguing about different design alternatives if
the programmers can supply several different implementations within
a day?

The sooner a prototype can be put in the hands of the end users, the
faster mistakes in the requirements gathering will be sorted out,
and new needs will be discovered and can be weighed in before it's
too late.

The ability to play interactively with Python objects and to develop
really rapidly means that the roundtrip from end user request, to
a new prototype for her to try out, can be reduced from hours to
minutes or from days to hours. There is no reason to even leave the
end users computer to add and demonstrate a new or changed feature.
All we need is there...

Testing, deployment, data conversion etc are also parts of software
development projects that can gain a lot from having a tool like
Python available.

Other examples of productivity boosts with python can be found here:
http://pythonology.org/success

Few mention numbers though. I guess that there are some organisations
that use the Capability Maturity Model for Software who would be able
to find useful metrics if they used Python for a project similar to
one where they had previously used Java, but a) few use CMM with such
rigor, and b) if they did, they might not want to tell! Let the
competition continue to waste their time coding Java. :) Finally, c)
organisations where CMM is popular are probably organisations where
static typing, waterfall development style and other rigid and archaic
ideas are more popular than agile methods and tools.

Still benchmarks always have a limited value.

I've looked a bit at benchmarks, such as The Great Computer Language
Shootout, and looking at lines of code there gives much smaller
differences than five to ten times. I compared C++ and Python; C++
varied from 25% fewer to 500% more lines of code, and on average C++
programs were around 80% longer. Less than a twofold gain, it seems...

But when we study the material in more detail, we see some relevant
things:

The more "realistic" the benchmarks are, the bigger the difference:
For plain algorithm tests and things like "nested loops", "call a
method", "instantiate an object", etc., there is almost no difference.
For things like "echo client/server", "spell check", file handling
etc, the difference is between 2.4 and 5.1 times.

I don't know how strong the Java standard library is, but several of
the benchmarks are about reimplementing builtin things in Python, such
as sorting and random number generation. Completely meaningless! A
real-life implementation would be much shorter, since most of the
needed code is already in a standard library module!
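For instance (a sketch, not one of the actual shootout entries): the sorting and random-number pieces of such a benchmark reduce to standard-library calls:

```python
import random

# Sorting and random-number generation ship with the standard library,
# so a realistic Python entry would not reimplement them by hand.
data = [random.random() for _ in range(1000)]  # 1000 uniform floats in [0, 1)
ordered = sorted(data)                         # built-in sort, no hand-rolled code

print(len(ordered), ordered[0] <= ordered[-1])  # prints "1000 True"
```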

For fun, I've made Python programs that achieved the same end result
without trying to use the same (meaningless) methods as the other
programs in the benchmark, and they are often 10 times shorter.
Sometimes they are also much faster and scalable.

There are three big reasons that Python programs are typically short
and easy to read.
* The Python syntax and data types are at a higher level of
abstraction, and don't have a lot of noise. It's also designed
with the objective of making it easy to do the right thing, rather
than making it difficult to do the wrong thing.
* The dynamic nature of Python makes it easy to write very flexible
code, and avoid a lot of code redundancy and twisting that is
common as you have to fight against the limitations of more static
and low level languages.
* Python's standard library is rich and reasonably easy to use.
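As a small illustration of the first two points (a sketch, not taken from the thread): a word-frequency counter needs no declarations or memory management, and reads close to the problem statement:

```python
# High-level built-in types (str, list, dict) plus dynamic typing keep
# the code short; the same dict code would work for any hashable keys.
text = "the quick brown fox jumps over the lazy dog the fox"
counts = {}
for word in text.split():
    counts[word] = counts.get(word, 0) + 1

print(counts["the"], counts["fox"])  # prints "3 2"
```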

In addition to that, the absence of compile and link steps in Python
makes development much faster than languages like C++. I've worked
in large C++ projects where building major applications could take
an hour due to the extensive dependencies. In Python this is a non-
issue. I don't know enough about Java development to compare with
that.

Since Python programs are shorter and easier to understand, and
faster to write than programs written in most other languages, it's
usually viable to change or rewrite code which is slow or otherwise
non-optimal, while it's common in projects using other languages
that code which is known to be bad is kept because it's too costly
to fix it.

Anyway, if you try Python in some project, I guess you will form
an educated opinion. I'm sure there is no one objective truth about
this. Different people have different preferences, and different
languages have different sweet spots. If you are writing drivers for
hardware, or encryption code, Python is probably not your main language,
but it might still be very useful for various tools and one-shot
hacks that you do while you develop the "real" software in some
other language.

Some people think Ruby is "purer" and "prettier" than Python.
Personally, I find the Perl-like features, such as the various #@$%
sigils and the mixing of regular expressions into the syntax of the
language, rather awkward.
Jul 18 '05 #35

Matthias wrote:
...

I checked the first 30 out of the 16,000 pages found. It is about
what I expected: Mostly advertisement for products and "solutions".
Maybe one of the pages might qualify as an empirical study.

This industry is on a random walk... ;-))


Despite the smiley I think it is worth putting this in perspective. Take
a random language implementation from 1975 and try implementing
something in it. I think that you will find strong anecdotal evidence
that Things Are Getting Better. It is a frustratingly slow and
inefficient process, but one that nevertheless seems to move in the
right direction.

Paul Prescod

Jul 18 '05 #36

Peter Hansen <pe***@engcorp.com> wrote:
Harry George wrote in a thought-provoking post:

Of course, even in the natural history phase pioneers and advance
scouts are capable of detecting an easier pass through the mountains
of complexity. If 20 people from varied backgrounds, each of whom has
worked in several languages, tell me that Python is a really great
language, then I'll take that as a significant data point. Especially
if they are dumping their previously favorite languages (as varied as
COBOL, Perl, Java, C++, VB, Modula-3, Lisp, Prolog) to focus on
Python.


My background is (roughly in order) APL, FORTRAN, BASIC, Assembly, C,
university :-), Pascal, C++, Object Pascal, Java, LabVIEW, and Python
(with a dozen others I forget) and I'm telling you Python is a really
great language. I've also dumped my previously favourite languages
(to wit, BASIC, C, C++, Delphi, and Java) to focus on Python.

Now all you need are 19 others and we'll have a significant data point.
(Signifying what? That's what I want to know. ;-)


I would say "Signifying nothing", but that would mean that all this is
"a tale told by an idiot, full of sound and fury," if you believe the
Bard. I don't feel furious, and I don't think I'm an idiot, so never
mind. (Although if you have other opinions on the latter subject, feel
free *not* to let me know.) <very big grin>

Anyway -- I started with BASIC at age six. Learned Pascal and C on my
own. Also learned C++, but never really grokked object-oriented
programming until much later in my programmer's development. Next came
assembler, which on register-starved Intel hardware was way more
complicated than it really needed to be. In college I was exposed to
Java and Lisp, but never did much with them beyond that one class,
because I had discovered Perl! Here was a language with arrays that
automatically re-sized themselves to fit the amount of data you put in
them -- bliss! And built-in hash table data types were pretty cool too;
they made several different algorithms a lot easier to code up. I also
discovered PHP, which I also thought was very cool.

Then I got out of college and started using these languages every day,
for a living. I still liked Perl, but it was beginning to get a bit hard
for me to read my own code six months later. Not that I was writing
"write-only" code, but the multiplicity of $ and @ symbols mixed in with
my code was actually distracting me from the code's meaning. I found I
was having to devote part of my brainpower to parsing the syntax, and
that was slowing me down. And then a co-worker introduced me to Python.
I was weirded out by the "indentation thing" at first, but quickly
learned to like not having to look for braces. And Python just "felt"
clean. I can't explain it very well, but Python's syntax just never got
in the way of my reading, which left me free to concentrate all my
attention on what the code was actually doing. I dumped my previously
favorite language, Perl, in favor of Python and haven't looked back
since.

--
Robin Munn
rm***@pobox.com
Jul 18 '05 #37

P: n/a
Robin Munn <rm***@pobox.com> writes:
Peter Hansen <pe***@engcorp.com> wrote:
Now all you need are 19 others and we'll have a significant data point.
(Signifying what? That's what I want to know. ;-)


I would say "Signifying nothing", but that would mean that all this is
"a tale told by an idiot, full of sound and fury," if you believe the
Bard. I don't feel furious, and I don't think I'm an idiot, so never
mind. (Although if you have other opinions on the latter subject, feel
free *not* to let me know.) <very big grin>

Anyway -- I started with BASIC at age six. Learned Pascal and C on my
own. Also learned C++, but never really grokked object-oriented
[...]


Just in case you guys are interested: A similar thing (people telling
personal stories of how they got to language X after having suffered
under languages A, B, and Y) has been going on in the Lisp community for
a while. The project page is http://alu.cliki.net/RtL%20Highlight%20Film

I admit that even if it's not science it's an interesting read.
Jul 18 '05 #38

P: n/a
Paul Prescod <pa**@prescod.net> writes:
Matthias wrote:
This industry is on a random walk... ;-))


Despite the smiley I think it is worth putting this in
perspective. Take a random language implementation from 1975 and try
implementing something in it. I think that you will find strong
anecdotal evidence that Things Are Getting Better. It is a
frustratingly slow and inefficient process, but one that nevertheless
seems to move in the right direction.


In 1975 I would have chosen Scheme to implement something. Had people
in 1975 added indentation-based syntax and some useful-and-documented
libs to it... ;-)

But I agree that I wouldn't want to trade my hardware/software
environment for one from 25 years ago. Or 3, for that matter.
Jul 18 '05 #39
