
What is different with Python ?

I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain 3)
when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels).
Yet for the first time I get (most) of my questions answered by a language I
did not know 1 year ago.

As I do try to understand concepts when I'm able to, I wish to try and find
out why Python seems different.

Having followed this newsgroup for some time, I now have the gut feeling
(see 3)) other people have that feeling too.
Quid ?

Regards,

Philippe




Jul 19 '05 #1
Philippe C. Martin wrote:
I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain 3)
when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels) .
Yet for the first time I get (most) of my questions answered by a language I
did not know 1 year ago.

As I do try to understand concepts when I'm able to, I wish to try and find
out why Python seems different.


Unfortunately, you didn't give many examples of what you did for the
last 18 years (except that that also included RT kernels).

So let me guess two aspects:

1. In these 18 years, you got acquainted with a variety of concepts
in various languages. When dealing with Python, you could easily
correlate Python concepts with the ones you are familiar
with. This is one of Python's strengths: it tries not to be
surprising, but builds on what most people consider standard.
Try "import this" some time; you may be experiencing the Zen:

Readability counts.
...
Special cases aren't special enough to break the rules.
Although practicality beats purity.
...
In the face of ambiguity, refuse the temptation to guess.
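
For example, at an interactive prompt (output abbreviated here; the full
banner runs to about twenty lines):

>>> import this
The Zen of Python, by Tim Peters

Beautiful is better than ugly.
Explicit is better than implicit.
...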

2. You may not have dealt with a weakly-typed language before. If
that is the case, your feeling of "something being different"
most likely comes from that difference.

Regards,
Martin
Jul 19 '05 #2
Philippe C. Martin wrote:
I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain 3)
when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels) .

Yet for the first time I get (most) of my questions answered by a language I
did not know 1 year ago.
I cannot understand this sentence. What questions? Which language?

Do you mean that, currently, when you need to solve a problem, you
usually use Python even though you are relatively new to it? And that
before learning Python you usually used a variety of languages, none
dominating the others?
As I do try to understand concepts when I'm able to, I wish to try and find
out why Python seems different.


Python is my language of choice because it doesn't get in the way. I
don't have to contort my problem into strict class hierarchies or
recursive functions. I don't have to construct the whole system to test
just a part of it. The interactive prompt has become vital to my
workflow. By and large, I just Get It Done.

The "one and preferably only one obvious way to do it" principle and
Python's emphasis on readability means that I gain knowledge and
capability as I write code. When I need to do a similar task six months
later, I don't have to spend an inordinate amount of time figuring out
what the hell I was thinking back then. In the same vein, I can also
read and learn from others' code much more than I could from, say, Perl.

--
Robert Kern
rk***@ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
-- Richard Harter

Jul 19 '05 #3
Martin v. Löwis wrote:
Philippe C. Martin wrote:
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels) .


2. You may not have dealt with a weakly-typed language before. If
that is the case, your feeling of "something being different"
most likely comes from that difference.


If he's done realtime kernels, he's most definitely worked with a weakly
typed language before (assembly most likely), but I think you meant to
say (or should have said) "dynamically typed".

-Peter
Jul 19 '05 #4

From my point of view, to be honest: nothing,
except mixing a well-spiced soup of what
was available within other programming
languages.

I think that what currently makes a real
difference is not the language as such,
but the people using it, posting here and
writing new modules for it.

I can imagine that, with Python becoming more
popular and less supported by the
core development team (which follows a
special kind of programming philosophy
called the "Pythonic way" of approaching
things), this could change, so I can only hope
that this won't happen.

Don't ask _me_ what the "Pythonic way" is -
I think you have to feel it yourself in order to
understand it (I haven't yet seen any definition
of it different from what is already known
from other programming languages, but maybe
someone can provide one?).

By the way: I see a contradiction between
- 1) I do not consider my intelligence/education above average
and
- 5) I have developed for many years (>18) in many different environments, languages, and O/S's (including realtime kernels).
because 5) is the story many of the programmers choosing Python
as a tool or programming language of their choice went through,
and because of the fact that you are here asking that question.

Claudio

"Philippe C. Martin" <ph******@philippecmartin.com> schrieb im Newsbeitrag
news:bX*****************@newssvr30.news.prodigy.com...
I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain 3) when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels) .
Yet for the first time I get (most) of my questions answered by a language I did not know 1 year ago.

As I do try to understand concepts when I'm able to, I wish to try and find out why Python seems different.

Having followed this newsgroup for some time, I now have the gut feeling
(see 3)) other people have that feeling too.
Quid ?

Regards,

Philippe



Jul 19 '05 #5
Thanks,
I have gotten many answers already, some not posted.

1) Typing is not the issue - even with RT-Kernels, people use C++
2) Yes I find dynamic binding very nice
3) "... you didn't give many examples of what you did for the
last 18 years (except that that also included RT kernels). ...." assembly
(lots), basic, cobol, lisp, JAVA, c, c++, perl, Tcl, Java, JavaCard .....

I know the "interactive" aspect helps also, the runtime error/exception
checking, the many libraries/tools, the responsiveness of the people on
this newsgroup, the "introspectiveness" of the system, the cross-platform
support, the way it "pushes" people to code in a clean way, the GUI
support, the stability, the extensibility (in and out) .... I'm sure you'll
agree none of that can explain why, after 1 week of playing with it, I was more
productive in Python than in C/C++, just as I know my product (I will not
describe it here as I am not marketing) would not exist today were it not
for Python.
4) Yes I agree a mix ("... well spiced soup ...") seems to be the answer but
my brain somehow wants to formalize it.

Regards,

Philippe

Philippe C. Martin wrote:
I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain
3) when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels) .
Yet for the first time I get (most) of my questions answered by a language
I did not know 1 year ago.

As I do try to understand concepts when I'm able to, I wish to try and
find out why Python seems different.

Having followed this newsgroup for some time, I now have the gut feeling
(see 3)) other people have that feeling too.
Quid ?

Regards,

Philippe


Jul 19 '05 #6
> 4) Yes I agree a mix ("... well spiced soup ...") seems to be the answer but
> my brain somehow wants to formalize it.
Here is one further suggestion trying to point out that
it probably can't generally be formalized, because
the experience one develops after going through
the story of "assembly, basic, cobol, lisp,
JAVA, c, c++, perl, Tcl, Java, JavaCard" has,
in my opinion, a vital impact on the shortcuts one uses
and the way of doing things. I mean that the concept
of Python arose from such experience, so anyone
who went through all this will get the core ideas
implemented in Python without any effort, because
they were already there as a kind of meta-language
used in thinking, unconsciously looking for the
chance of being expressed in formalized form
as a new programming language.
To support my thesis I can mention here that,
from my experience, Python seems not to be
the language of choice for the very beginners,
who prefer other approaches which are
mostly variants of Basic.

Claudio

"Philippe C. Martin" <ph******@philippecmartin.com> schrieb im Newsbeitrag
news:Gh****************@newssvr11.news.prodigy.com...
Thanks,
I have gotten many answers already, some not posted.

1) Typing is not the issue - even with RT-Kernels, people use C++
2) Yes I find dynamic binding very nice
3) "... you didn't give many examples of what you did for the
last 18 years (except that that also included RT kernels). ...." assembly
(losts) , basic, cobol, lisp, JAVA, c, c++, perl, Tcl, Java, JavaCard ......
I know the "interactive" aspect helps also, the runtime error/exception
checking, the many libraries/tools, the responsiveness of the people on
this newsgroup, the "introspectiveness" of the system, the cross-platform
it deals with, the way it "pushes" people to code in a clean way, the GUI
support, the stability, the extensibility (in and out) .... I'm sure you'll agree none of that can explain why after 1 week of playing with, I was more productive in Python than C/C++ just as I know my product (I will not
describe it here as I am not marketing) would not exist today were it not
for Python.
4) Yes I agree a mix ("... well spiced soup ...") seems to be the answer but my brain somehow wants to formalize it.

Regards,

Philippe

Philippe C. Martin wrote:
I apologize in advance for launching this post, but I might get enlightenment somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain
3) when it comes true.
- 5) I have developed for many years (>18) in many different environments, languages, and O/S's (including realtime kernels) .
Yet for the first time I get (most) of my questions answered by a language I did not know 1 year ago.

As I do try to understand concepts when I'm able to, I wish to try and
find out why Python seems different.

Having followed this newsgroup for some time, I now have the gut feeling
(see 3)) other people have that feeling too.
Quid ?

Regards,

Philippe



Jul 19 '05 #7
I agree with '...choice for the very beginners ...': a hundred years ago I was a
Pascal TA, and although I like the language, I find/found people struggled
as much with the language as with the algorithm they were supposed to
implement.

"...mostly variants of Basic..." What I truly liked going from Basic (which
has greatly evolved) to Pascal was the fact that I saw a definite risk in not
having to declare variables, or rather I understood the lack of danger in
doing so: the one (so I thought) glitch with Python that almost made me
stop playing with it was that very fact. Yet I agree a complete beginner would
find the simpler approach the most meaningful: "why should I write int i = 1 since I
know 1 is an int?" Since the "dangers" of old basic are gone from Python
(you can't do i = y if y has never been initialized), I must agree with that
too. I'm actually pushing the few CS professors I know to use Python for CS
101. Yet many issues that a future software engineer should know are
mostly hidden by Python (e.g. memory management), and that could be
detrimental.

Regards,

Philippe


Claudio Grondi wrote:
4) Yes I agree a mix ("... well spiced soup ...")
seems to be the answer but
my brain somehow wants to formalize it.


Here one further suggestion trying to point out, that
it probably can't generally be formalized, because
the experience one developes after going through
the story of "assembly, basic, cobol, lisp,
JAVA, c, c++, perl, Tcl, Java, JavaCard" has
in my opinion a vital impact on shortcuts one uses
and the way of doing things. I mean, that the concept
of Python has raised from such experience, so anyone
who went through all this, will get the core ideas
implemented in Python without any effort, because
they were already there as a kind of meta-language
used in thinking, unconsciously looking for the
chance of beeing expressed in formalized form
as a new programming language.
To support my thesis I can mention here, that
from my experience, Python seems not to be
the language of choice for the very beginners,
who prefere another approaches which are
mostly variants of Basic.

Claudio

"Philippe C. Martin" <ph******@philippecmartin.com> schrieb im Newsbeitrag
news:Gh****************@newssvr11.news.prodigy.com ...
Thanks ,
I have gotten many answers already, some not posted.

1) Typing is not the issue - even with RT-Kernels, people use C++
2) Yes I find dynamic binding very nice
3) "... you didn't give many examples of what you did for the
last 18 years (except that that also included RT kernels). ...." assembly
(losts) , basic, cobol, lisp, JAVA, c, c++, perl, Tcl, Java, JavaCard

.....

I know the "interactive" aspect helps also, the runtime error/exception
checking, the many libraries/tools, the responsiveness of the people on
this newsgroup, the "introspectiveness" of the system, the cross-platform
it deals with, the way it "pushes" people to code in a clean way, the GUI
support, the stability, the extensibility (in and out) .... I'm sure

you'll
agree none of that can explain why after 1 week of playing with, I was

more
productive in Python than C/C++ just as I know my product (I will not
describe it here as I am not marketing) would not exist today were it not
for Python.
4) Yes I agree a mix ("... well spiced soup ...") seems to be the answer

but
my brain somehow wants to formalize it.

Regards,

Philippe

Philippe C. Martin wrote:
> I apologize in advance for launching this post but I might get enlightenment
> somehow (PS: I am _very_ agnostic ;-).
>
> - 1) I do not consider my intelligence/education above average
> - 2) I am very pragmatic
> - 3) I usually move forward when I get the gut feeling I am correct
> - 4) Most likely because of 1), I usually do not manage to fully
> explain 3) when it comes true.
> - 5) I have developed for many years (>18) in many different environments,
> languages, and O/S's (including realtime kernels).
>
>
> Yet for the first time I get (most) of my questions answered by a language
> I did not know 1 year ago.
>
> As I do try to understand concepts when I'm able to, I wish to try and
> find out why Python seems different.
>
> Having followed this newsgroup for some time, I now have the gut
> feeling (see 3)) other people have that feeling too.
>
>
> Quid ?
>
> Regards,
>
> Philippe


Jul 19 '05 #8
On Sat, 11 Jun 2005, Philippe C. Martin wrote:
Yet for the first time I get (most) of my questions answered by a
language I did not know 1 year ago.


Amazing, isn't it? Rest assured that you're not alone in feeling this way.
I don't know quite why, but python is just makes writing programs
immensely easier than any other language i've ever used; i think it's the
very minimal amount of boilerplate it requires, the clean and powerful set
of builtin types and functions and, for me, the higher-order functions. I
can do in a few lines of python what would have taken me pages and pages
of java.

tom

PS: http://jove.prohosting.com/~zahlman/cpp.html

--
Jim-Jammity Jesus Krispy Kreme Christ on a twat-rocket!
Jul 19 '05 #9
> PS: http://jove.prohosting.com/~zahlman/cpp.html

So you're saying they only use perl in Taiwan ;-)

Tom Anderson wrote:
On Sat, 11 Jun 2005, Philippe C. Martin wrote:
Yet for the first time I get (most) of my questions answered by a
language I did not know 1 year ago.


Amazing, isn't it? Rest assured that you're not alone in feeling this way.
I don't know quite why, but python just makes writing programs
immensely easier than any other language i've ever used; i think it's the
very minimal amount of boilerplate it requires, the clean and powerful set
of builtin types and functions and, for me, the higher-order functions. I
can do in a few lines of python what would have taken me pages and pages
of java.

tom

PS: http://jove.prohosting.com/~zahlman/cpp.html


Jul 19 '05 #10
Philippe C. Martin wrote:
too. I'm actually pushing the few CS professors I know to use Python for CS
101. Yet, many issues that a future software engineer should know are
mostly hidden by Python (ex: memory management) and that could be
detrimental.


I think new CS students have more than enough to learn with their
*first* language without having to discover the trials and tribulations
of memory management (or those other things that Python hides so well).

Simple concepts like variables, control structures, input and output are
more than enough to start with. In fact, I suspect any course that
attempts to teach with a language that requires things like manual
memory management will be failing to provide an effective grounding in
computer science because of all the noise. Seeing the forest for the
trees and all that...

-Peter
Jul 19 '05 #11
"Philippe C. Martin" <ph******@philippecmartin.com> wrote:
Yet, many issues that a future software engineer should know are
mostly hidden by Python (ex: memory management) and that could be
detrimental.


I know I'm going out on a limb by asking this, but why do you think future
software engineers should know about memory management?

I used to worry about register allocation. Today, I don't even know how
many registers any machine I work on has. I used to worry about word size,
and byte order. I used to worry about whether stacks grew up or down and
addressing modes and floating point formats. Sure, somebody's got to worry
about those things, but most people who write software can be blissfully
ignorant (or, at best, dimly aware) of these issues because somebody else
(compiler writer, hardware designer, operating system writer, etc) has
already done the worrying.

There used to be a time when you had to worry about how many tracks to
allocate when you created a disk file. When's the last time you worried
about that?
Jul 19 '05 #12
Roy Smith wrote:
"Philippe C. Martin" <ph******@philippecmartin.com> wrote:
Yet, many issues that a future software engineer should know are
mostly hidden by Python (ex: memory management) and that could be
detrimental.

I know I'm going out on a limb by asking this, but why do you think future
software engineers should know about memory management?


Perhaps we have a terminology problem here i.e. different meanings of
"software engineer". Philippe started talking about "CS" courses,
whereas you may be referring to people who have done an "IT" course or
achieved a certification in the use of app development tool X.

I used to worry about register allocation. Today, I don't even know how
many registers any machine I work on has. I used to worry about word size,
and byte order. I used to worry about whether stacks grew up or down and
addressing modes and floating point formats. Sure, somebody's got to worry
about those things, but most people who write software can be blissfully
ignorant (or, at best, dimly aware) of these issues because somebody else
(compiler writer, hardware designer, operating system writer, etc) has
already done the worrying.
You would hope they'd done more than worry about it. However sometimes
one's fondest hopes are dashed. You must have noticed the anguish in the
timbot's posts that mention Windows 95 memory management.

There used to be a time when you had to worry about how many tracks to
allocate when you created a disk file. When's the last time you worried
about that?


Seeing you asked: early 1970s, on an IBM 1800. But much more recently it
certainly helped if one were slightly more than dimly aware of the
difference between a FAT filesystem and an NTFS filesystem :-)

Cheers,
John
Jul 19 '05 #13
John Machin <sj******@lexicon.net> writes:
Roy Smith wrote:
"Philippe C. Martin" <ph******@philippecmartin.com> wrote:
Yet, many issues that a future software engineer should know are
mostly hidden by Python (ex: memory management) and that could be
detrimental.

I know I'm going out on a limb by asking this, but why do you think
future software engineers should know about memory management?

Perhaps we have a terminology problem here i.e. different meanings of
"software engineer". Philippe started talking about "CS" courses,
whereas you may be referring to people who have done an "IT" course or
achieved a certification in the use of app development tool X.


While I agree with John - software engineers should know something
about memory management - I sort of agree with Roy as well, in that,
like Peter, I think memory management is something that doesn't need
to be taught immediately. A modern programming environment should take
care of the details, but a software engineer will be cognizant of the
details, and know enough to know when they have to worry about it and
when they can safely ignore it.
I used to worry about register allocation. Today, I don't even know
how many registers any machine I work on has. I used to worry about
word size, and byte order. I used to worry about whether stacks
grew up or down and addressing modes and floating point formats.
Sure, somebody's got to worry about those things, but most people
who write software can be blissfully ignorant (or, at best, dimly
aware) of these issues because somebody else (compiler writer,
hardware designer, operating system writer, etc) has already done
the worrying.

You would hope they'd done more than worry about it. However sometimes
one's fondest hopes are dashed. You must have noticed the anguish in
the timbot's posts that mention Windows 95 memory management.


I think most of those things are indeed things that your average
software engineer can ignore 90+% of the time. What makes someone a
software engineer is that they know about those details, and know how
they will affect the code they are writing - and hence when they have
to worry about those details.

Oddly enough, I find similar comments apply to a lot of the data
structures I learned in school. I recently applied for a job that had
a series of essay questions in the application. They had a series of
problems with requests for solutions, and my immediate reaction to
each was to reach for off-the-shelf software to solve the
problem. While they wanted - and I provided - a discussion of data
structures and big-O running time for various operations, all the
things they wanted to do were essentially solved problems, and there
was debugged and tuned code available to deal with things - and it's
much faster to not write software if you can to solve the problem.

For instance, one problem was "You have two files that have lists of 1
billion names in them. Print out a list of the names that only occur
in one of the files."

That's a one-line shell script: "comm -3 <(sort file_one) <(sort file_two)"

I gave them that answer. I also gave them a pseudo-code solution, but
frankly, in real life, I'd install the shell script and get on with
things. If I were hiring someone, I'd hire the person who gave me the
shell script. Rather than spending hours/days debugging a program to
solve the problem, I get a solution in minutes. If it runs into
problems, *then* it's time to start hand coding the solution.
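
For what it's worth, a hand-coded Python version is also short. A rough
sketch (the file names are made up, and it assumes the names fit in memory
as sets, which a billion of them may well not):

names_one = set(line.strip() for line in open("file_one"))
names_two = set(line.strip() for line in open("file_two"))

# Symmetric difference: names that appear in exactly one of the two files.
for name in sorted(names_one ^ names_two):
    print(name)
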
There used to be a time when you had to worry about how many tracks
to allocate when you created a disk file. When's the last time you
worried about that?

Seeing you asked: early 1970s, on an IBM 1800. But much more recently
it certainly helped if one were slightly more than dimly aware of the
difference between a FAT filesystem and an NTFS filesystem :-)


For me it was the late 1970s, on an IBM 3081. But I was worried about
disk sector sizes well into the 1990s. Since then I've worked on
systems that didn't have a file system as such; it had a database of
databases, and you queried the former to find the latter.

<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
Jul 19 '05 #14
I guess because I have mostly worked with embedded systems and because,
although I have always tried to put abstraction layers between my
applications and the hardware, some constraints still remain at the
application level (memory, determinism, re-entrance, ...). You will notice
that 99% of the embedded systems with realtime constraints use assembly,
C/C++, or Ada.

I agree with you and Peter, though, that these issues need not be treated in a
first course. Yet, depending on the ultimate goal (John spoke of IT
versus CS), some of the newly trained folks should know about them. We could
not enjoy Python if no one were here to implement its VM; I have not looked
at the code, but I gather it is fairly complex and does require an amount
of "low level" skills.

Regards,

Philippe

Roy Smith wrote:
"Philippe C. Martin" <ph******@philippecmartin.com> wrote:
Yet, many issues that a future software engineer should know are
mostly hidden by Python (ex: memory management) and that could be
detrimental.


I know I'm going out on a limb by asking this, but why do you think future
software engineers should know about memory management?

I used to worry about register allocation. Today, I don't even know how
many registers any machine I work on has. I used to worry about word
size,
and byte order. I used to worry about whether stacks grew up or down and
addressing modes and floating point formats. Sure, somebody's got to
worry about those things, but most people who write software can be
blissfully ignorant (or, at best, dimly aware) of these issues because
somebody else (compiler writer, hardware designer, operating system
writer, etc) has already done the worrying.

There used to be a time when you had to worry about how many tracks to
allocate when you created a disk file. When's the last time you worried
about that?


Jul 19 '05 #15
John Machin <sj******@lexicon.net> wrote:
I know I'm going out on a limb by asking this, but why do you think future
software engineers should know about memory management?


Perhaps we have a terminology problem here i.e. different meanings of
"software engineer". Philippe started talking about "CS" courses,
whereas you may be referring to people who have done an "IT" course or
achieved a certification in the use of app development tool X.


No, you've missed the point entirely.

No, the problem is that I'm out on the limb, and you're still comfortably
standing on the ground leaning up against the trunk. Climb up and come out
on the limb with me. Now, stop hugging the trunk and take a few steps out
here with me. Don't worry about how it's swaying, and whatever you do,
don't look down.

The point I was trying to make was that as computer science progresses,
stuff that was really important to know a lot about becomes more and more
taken for granted. This is how we make progress.

I used to worry about memory busses at the millivolt and microsecond level.
I knew about termination impedances and wired-OR logic, and power budgets
and all that good stuff. Today all I know about memory is you go to
www.crucial.com, type in your Visa card number, and the nice UPS guy shows
up with some SIMMs in a few days.

I expect that's all most current CS students know as well. Is that bad?
Is their education somehow lacking because they don't understand why
"memory bus" and "transmission line" belong in the same sentence? Not at
all. All that's happened is that very important stuff has become so
standardized that they don't have to worry about it any more and can spend
their time and energy thinking about other problems that need to be solved
today.

There are lots of really important, hard, theoretical problems that today's
CS majors need to be solving. User interfaces for the most part still
suck. Self-configuring and self-healing high speed networks on a global
scale. AI hasn't really progressed in 30 years. Computer vision and
speech. Robotics. Cryptography and security. And what about flying cars?

Just like you can't even begin to think about building today's GUI-driven
desktop applications if you're still worrying about individual logic gates,
you can't begin to think about solving some of these really hard problems
(and others we haven't even imagined) if you're still worrying about memory
buffer reference counting and garbage collection. Yesterday's research
projects are today's utilities and tomorrow's historical footnotes.
Jul 19 '05 #16
On Sun, 12 Jun 2005, Mike Meyer wrote:
For instance, one problem was "You have two files that have lists of 1
billion names in them. Print out a list of the names that only occur
in one of the files."

That's a one-line shell script: "comm -3 <(sort file_one) <(sort file_two)"


Incidentally, how long does sorting two billion lines of text take?

The complementary question, of course, is "how long does it take to come
up with an algorithm for solving this problem that doesn't involve sorting
the files?"!

the best thing i can come up with off the top of my head is making a pass
over one file to build a Bloom filter [1] describing its contents, then
going over the second file, checking if each name is in the filter, and if
it is, putting it in a hashtable, then making a second pass over the first
file, checking if each name is in the hashtable. this would work without
the filter, but would require storing a billion names in the hashtable;
the idea is that using the filter allows you to cut this down to a
tractable level. that said, i'm not sure if it would work in practice - if
you have a billion names, even if you have a filter a gigabyte in size,
you still have a 2% false positive rate [2], which is 20 million names.
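
a toy version of that filter in python might look like the sketch below; the
md5-based hashing, the parameter values and the names are only illustrative,
not a tuned implementation:

import hashlib

class BloomFilter(object):
    def __init__(self, num_bits, num_hashes):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)   # all bits start cleared

    def _positions(self, item):
        # derive k bit positions from salted digests of the item
        for salt in range(self.num_hashes):
            digest = hashlib.md5(("%d:%s" % (salt, item)).encode("utf-8")).hexdigest()
            yield int(digest, 16) % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        # may report false positives, never false negatives
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# first pass: describe file_one's contents (a gigabyte-sized filter, as above)
names_in_one = BloomFilter(num_bits=8 * 10 ** 9, num_hashes=7)
for line in open("file_one"):
    names_in_one.add(line.strip())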

tom

[1] http://en.wikipedia.org/wiki/Bloom_filter
[2] http://www.cc.gatech.edu/fac/Pete.Ma...alculator.html

--
Think logical, act incremental
Jul 19 '05 #17
On Sun, 12 Jun 2005 08:11:47 -0400, Roy Smith wrote:
The point I was trying to make was that as computer science progresses,
stuff that was really important to know a lot about becomes more and more
taken for granted. This is how we make progress.

I used to worry about memory busses at the millivolt and microsecond level.
I knew about termination impedances and wired-OR logic, and power budgets
and all that good stuff. Today all I know about memory is you go to
www.crucial.com, type in your Visa card number, and the nice UPS guy shows
up with some SIMMs in a few days.
Yes. But (to a first approximation) memory either works or it doesn't. And
we never need to worry about it scaling, because you don't get to assemble
your own SIMMs -- you buy them pre-made. Software is nothing like that.

[snip] Just like you can't even begin to think about building today's
GUI-driven desktop applications if you're still worrying about
individual logic gates, you can't begin to think about solving some of
these really hard problems (and others we haven't even imagined) if
you're still worrying about memory buffer reference counting and garbage
collection. Yesterday's research projects are today's utilities and
tomorrow's historical footnotes.


Nice in theory, but frequently breaks down in practice. Let's take a nice,
real, Python example:

I write a text-handling application in Python. I've taken your advice,
and don't worry about messy details of the language implementation,
concentrating instead on the application logic. Consequently, I've used the
idiom:

new_text = ""
for word in text:
    new_text = new_text + process(word)

I test it against text containing a few thousand words, and performance is
good. Then my customers use my application in the real world, on texts
of a few hundred million words, and performance slows to a painful
crawl.

Python does a good job of insulating the developer from the implementation
details, but even in Python those details can sometimes turn around and
bite you on the behind. And your users will discover those bum-biting
situations long before your testing will.

Joel of "Joel On Software" discusses this issue here:

http://www.joelonsoftware.com/articl...000000319.html

Of course, we should not prematurely optimise. But we should also be aware
of the characteristics of the libraries we call, so we can choose the
right library.

Fortunately, a high-level language like Python makes it comparatively easy
to refactor a bunch of slow string concatenations into the list-append
plus string-join idiom.
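
For concreteness, that refactoring looks something like this (process() and
text are stand-ins, as in the hypothetical snippet above):

def process(word):
    return word.upper()    # stand-in for whatever per-word work is done

text = ["a", "few", "thousand", "words"]

# Quadratic: every concatenation copies the whole string built so far.
new_text = ""
for word in text:
    new_text = new_text + process(word)

# Linear: collect the pieces in a list and join once at the end.
pieces = []
for word in text:
    pieces.append(process(word))
new_text = "".join(pieces)
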
--
Steven.

Jul 19 '05 #18
Taking stuff for granted is unrelated to progress.

I agree that the "trade" of software engineering evolves and that, thanks to
hardware advances, we _usually_ can now "object orient" our software, add
billions of abstraction layers, and consume memory without a second
thought. But the trade evolves in the sense that "sub"-trades are created: one
person becomes a database expert while another will write HTML all of his/her
life (I personally find that sad). I'm being redundant here: the reason we
can use Python and take many issues for granted is that some very
skilled people handle the issues we find cumbersome.

Roy Smith wrote:
The point I was trying to make was that as computer science progresses,
stuff that was really important to know a lot about becomes more and more
taken for granted. This is how we make progress.


Jul 19 '05 #19
Steven D'Aprano <st***@REMOVETHIScyber.com.au> writes:

[snap]
new_text = ""
for word in text:
    new_text = new_text + process(word)


new_text = "".join(map(process, text))

(I couldn't resist)
Jul 19 '05 #20
On Sat, 11 Jun 2005 21:52:57 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
I think new CS students have more than enough to learn with their
*first* language without having to discover the trials and tribulations
of memory management (or those other things that Python hides so well).


I'm not sure that postponing learning what memory
is, what a pointer is and other "bare metal"
problems is a good idea. Those concepts are not
"more complex" at all, they're just more *concrete*
than the abstract concept of "variable".
The human mind works best moving from the concrete to
the abstract: we first learn counting, and only
later we learn rings (or even set theory).
Unless you think a programmer may live happily
without understanding concrete issues, IMO
the best is to learn concrete facts first, and
only later abstractions.
I think that for a programmer skipping the
understanding of the implementation is just
impossible: if you don't understand how a
computer works you're going to write pretty
silly programs. Note that I'm not saying that
one should understand every possible implementation
down to the bit (that's of course nonsense), but
there should be no room for "magic" in a computer
for a professional programmer.

Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.

Andrea
Jul 19 '05 #21
Andrea Griffini <ag****@tin.it> wrote:
I think that for a programmer skipping the
understanding of the implementation is just
impossible: if you don't understand how a
computer works you're going to write pretty
silly programs. Note that I'm not saying that
one should understand every possible implementation
down to the bit (that's of course nonsense), but
there should be no room for "magic" in a computer
for a professional programmer.
How far down do you have to go? What makes bytes of memory, data busses,
and CPUs the right level of abstraction?

Why shouldn't first-year CS students study "how a computer works" at the
level of individual logic gates? After all, if you don't know how gates
work, things like address bus decoders, ALUs, register files, and the like
are all just magic (which you claim there is no room for).

Digging down a little deeper, a NAND gate is magic if you don't know how a
transistor works or can't do basic circuit analysis. And transistors are
magic until you dig down to the truly magical stuff that's going on with
charge carriers and electric fields inside a semiconductor junction.
That's about where my brain starts to hurt, but it's also where the quantum
mechanics are just getting warmed up.
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.


At some point, you need to draw a line in the sand (so to speak) and say,
"I understand everything down to *here* and can do cool stuff with that
knowledge. Below that, I'm willing to take on faith". I suspect you would
agree that's true, even if we don't agree just where the line should be
drawn. You seem to feel that the level of abstraction exposed by a
language like C is the right level. I'm not convinced you need to go that
far down. I'm certainly not convinced you need to start there.
Jul 19 '05 #22
Andrea Griffini <ag****@tin.it> writes:
On Sat, 11 Jun 2005 21:52:57 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.


So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.

Admittedly, at some level the details simply stop mattering. But where
that level is depends on what level you're working on. Writing Python,
I really don't need to understand the behavior of hardware
gates. Writing horizontal microcode, I'm totally f*cked if I don't
understand the behavior of hardware gates.

In short, you're going to start in the middle. You can avoid looking
down if you avoid certain classes of problems - but not everyone will
be able to do that. Since you can only protect some of the students
from this extra confusion, is it really justified to confuse them all
by introducing what are really extraneous details early on?

You've stated your opinion. Personally, I agree with Abelson, Sussman
and Sussman, whose text "The Structure and Interpretation of Computer
Programs" was the standard text at one of the premiere engineering
schools in the world, and is widely regarded as a classic in the
field: they decided to start with the abstract, and deal with concrete
issues - like assignment(!) later.

<mike

*) "My favorite programming language is solder." - Bob Pease

--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
Jul 19 '05 #23
Mike Meyer wrote:
Andrea Griffini <ag****@tin.it> writes:
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.


So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)?


No, Andrea means you need to learn physics, starting perhaps with basic
quantum mechanics and perhaps with some chemistry thrown in (since you
can't really understand semiconductors without understanding how they're
built, right?). Oh, and manufacturing. And a fundamental understanding
of scanning electron microscopes (for inspection) would be helpful as
well. I think probably a Ph.D. level training in mathematics might be a
good start also, since after all this is the foundation of much of
computing. A while later comes the electronics, and then memory management.

Things like while loops and if statements, and *how to actually write a
program* are, of course, only the eventual outcome of all that good
grounding in "the basics" that you need first.

<big wink>

-Peter
Jul 19 '05 #24
Andrea Griffini wrote:
On Sat, 11 Jun 2005 21:52:57 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
I think new CS students have more than enough to learn with their
*first* language without having to discover the trials and tribulations
of memory management (or those other things that Python hides so well).


I'm not sure that postponing learning what memory
is, what a pointer is and others "bare metal"
problems is a good idea. ...
I think that for a programmer skipping the
understanding of the implementation is just
impossible: if you don't understand how a
computer works you're going to write pretty
silly programs.


I'm curious how you learned to program. What path worked for you, and
do you think it was a wrong approach, or the right one?

In my case, I started with BASIC. Good old BASIC, with no memory
management to worry about, no pointers, no "concrete" details, just FOR
loops and variables and lots of PRINT statements.

A while (some months) later I stumbled across some assembly language and
-- typing it into the computer like a monkey, with no idea what I was
dealing with -- began learning about some of the more concrete aspects
of computers.

This worked very well in my case, and I strongly doubt I would have
stayed interested in an approach that started with talk of memory
addressing, bits and bytes, registers and opcodes and such.

I won't say that I'm certain about any of this, but I have a very strong
suspicion that the *best* first step in learning programming is a
program very much like the following, which I'm pretty sure was mine:

10 FOR A=1 TO 10: PRINT"Peter is great!": END

And no, I don't recall enough BASIC syntax to be sure that's even
correct, but I'm sure you get my point. In one line I learned
(implicitly at first) about variables, control structures and iteration,
output, and probably a few other things.

More importantly by far, *I made the computer do something*. This
should be everyone's first step in a programming course, and it doesn't
take the slightest understanding of what you call "concrete" things...
(though I'd call these things very concrete, and memory management
"esoteric" or something).

If I had been stuck in a course that made me learn about memory
management before I could write a program, I'm pretty sure I'd be doing
something fascinating like selling jeans in a Levis store...

-Peter
Jul 19 '05 #25
"Mike Meyer" wrote:
Andrea Griffini <ag****@tin.it> writes:
On Sat, 11 Jun 2005 21:52:57 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.


So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.


This may sound like a rhetorical question, but in fact as an Informatics
undergrad I had to take courses in electronics, logic design, signals
and systems and other obscure courses as far as CS is concerned
(http://www2.di.uoa.gr/en/lessons.php). Although these are certainly
useful if one is interested in hardware, architecture, realtime and
embedded systems, etc., I hardly find them relevant (or even more,
necessary) for most CS/IT careers. Separation of concerns works pretty
well for most practical purposes.

George

Jul 19 '05 #26
On Sun, 12 Jun 2005 20:22:28 -0400, Roy Smith <ro*@panix.com> wrote:
How far down do you have to go? What makes bytes of memory, data busses,
and CPUs the right level of abstraction?
They're things that can IMO be genuinely accepted
as "obvious". Even "counting" is not the lowest
level in mathematics... there is the direction of
the philosophy of mathematics. From "counting" you can go
"up" in the construction direction (rationals,
reals, functions, continuity and the whole
analysis area), building on the counting concept,
or you can go "down", asking yourself what it
really means to count, what you mean
by a "proof", what a "set" really is.
However "counting" is naturally considered
obvious for our minds and you can build your
whole life on it without the need to look at lower
levels and without getting bitten too badly for
that simplification.

Also, lower than memory and the data bus there is
of course more stuff (in our universe it looks
like there is *always* more stuff no matter
where you look :-) ), but I would say it's
more about electronics than computer science.
Why shouldn't first-year CS students study "how a computer works" at the
level of individual logic gates? After all, if you don't know how gates
work, things like address bus decoders, ALUs, register files, and the like
are all just magic (which you claim there is no room for).


It's magic if I'm curious but you can't answer
my questions. It's magic if I have to memorize
because I'm not *allowed* to understand.
It's not magic if I can (and naturally do) just
ignore it because I can accept it. It's not
magic if I don't have questions because it's
"obvious" enough for me.
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.


At some point, you need to draw a line in the sand (so to speak) and say,
"I understand everything down to *here* and can do cool stuff with that
knowledge. Below that, I'm willing to take on faith". I suspect you would
agree that's true, even if we don't agree just where the line should be
drawn. You seem to feel that the level of abstraction exposed by a
language like C is the right level. I'm not convinced you need to go that
far down. I'm certainly not convinced you need to start there.


I think that if you don't understand memory,
addresses, allocation and deallocation, or
(roughly) how a hard disk works and what
the difference between hard disks and RAM is, then
you're going to be a horrible programmer.

There's no way you will remember which operations are O(n),
which O(1) and which O(log(n)) among containers
unless you roughly understand how they work.
If those are magic formulas you'll just forget
them and you'll end up writing code that is
thousands of times slower than necessary.
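
To make that concrete in Python terms (the sizes and the number of runs
below are arbitrary): membership testing is O(n) on a list but roughly O(1)
on a set, and the difference is easy to measure:

import timeit

setup = "items = list(range(100000)); as_list = items; as_set = set(items)"
# scans the list element by element: O(n) per test
print(timeit.timeit("99999 in as_list", setup=setup, number=1000))
# hashes the value and looks it up directly: O(1) per test on average
print(timeit.timeit("99999 in as_set", setup=setup, number=1000))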

If you don't understand *why* "C" needs malloc
then you'll forget about allocating objects.

Andrea
Jul 19 '05 #27
On Sun, 12 Jun 2005 19:53:29 -0500, Mike Meyer <mw*@mired.org> wrote:
Andrea Griffini <ag****@tin.it> writes:
On Sat, 11 Jun 2005 21:52:57 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.
So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.


Not really. Long ago I drew a line that starts at
software. I think you can be a reasonable programmer
even without the knowledge of how to design hardware.
I do not think you can be a reasonable programmer if
you have never seen assembler.
Admittedly, at some level the details simply stop mattering. But where
that level is depends on what level you're working on. Writing Python,
I really don't need to understand the behavior of hardware
gates. Writing horizontal microcode, I'm totally f*cked if I don't
understand the behavior of hardware gates.
But you had better understand how, more or less, your
computer or language works, otherwise your code will
needlessly be a thousand times slower and will require
a thousand times more memory than is necessary.
Look at a recent thread where someone was asking why
python was so slow (and the code contained stuff
like "if x in range(low, high):" in an inner loop
that was itself pointless).
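
The cheap fix in that kind of code is a pair of comparisons instead of
building and scanning a list on every pass; a small self-contained
illustration (the names and values are made up):

x, low, high = 5, 0, 10

# In Python 2, range() builds a fresh list here and "in" scans it
# linearly, every time this test runs:
if x in range(low, high):
    print("in range")

# The same test as two comparisons: constant time, nothing allocated.
if low <= x < high:
    print("in range")
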
In short, you're going to start in the middle.
I've got "bad" news for you. You're always in the
middle :-D. Apparently this is a
constant in our universe. Even counting (i.e.
1, 2, 3, ...) is not the "start" of math (you
can go to "lower" levels).
Actually I think this is a "nice" property of our
universe, but discussing it would take the
discussion a bit OT.
Is it really justified to confuse them all
by introducing what are really extraneous details early on?
I simply say that you will not be able to avoid
introducing them. If they're going to write software,
those are not "details" that you'll be able to hide
behind a nice and perfect virtual world (this is much
less true about bus cycles... at least for many
programmers).

But if you need to introduce them, then IMO it is
way better to do it *first*, because that is the
way that our brain works.

You cannot build on loosely placed bricks.
You've stated your opinion. Personally, I agree with Abelson, Sussman
and Sussman, whose text "The Structure and Interpretation of Computer
Programs" was the standard text at one of the premiere engineering
schools in the world, and is widely regarded as a classic in the
field: they decided to start with the abstract, and deal with concrete
issues - like assignment(!) later.


Sure. I know that many think that starting from
higher levels is better. However, no explanation is
given of *why* this should work better, and I
haven't even seen objective studies of how this
approach pays off. This is of course not a field
that I've investigated a lot.

What I know is that every single competent programmer
I know (not many... just *EVERY SINGLE ONE*) started
by firmly placing concrete concepts first, and then
moved on to higher abstractions (for example
structured programming, OOP, functional languages ...).

Andrea
Jul 19 '05 #28
> They're things that can IMO be genuinely accepted
> as "obvious". Even "counting" is not the lowest
> level in mathematics... there is the direction of
> the philosophy of mathematics.

I am personally highly interested in becoming
aware of the very bottom, the fundaments
all our knowledge is built on.
Trying to answer questions like:
What are the most basic ideas all the others
are derived from in mathematics and
programming?
keeps me busy for hours, days, years ...

Any insights you can share with
me (and/or this group)?

Claudio

"Andrea Griffini" <ag****@tin.it> schrieb im Newsbeitrag
news:5q********************************@4ax.com...
On Sun, 12 Jun 2005 20:22:28 -0400, Roy Smith <ro*@panix.com> wrote:
How far down do you have to go? What makes bytes of memory, data busses,
and CPUs the right level of abstraction?


They're things that can be IMO genuinely accept
as "obvious". Even "counting" is not the lowest
level in mathematic... there is the mathematic
philosohy direction. From "counting" you can go
"up" in the construction direction (rationals,
reals, functions, continuity and the whole
analysis area) building on the counting concept
or you can go "down" asking yourself what it
does really mean counting, what do you mean
with a "proof", what really is a "set".
However the "counting" is naturally considered
obvious for our minds and you can build the
whole life without the need to look at lower
levels and without getting bitten too badly for
that simplification.

Also lower than memory and data bus there is
of course more stuff (in our universe looks
like there is *always* more stuff no mattere
where you look :-) ), but I would say it's
more about electronic than computer science.
Why shouldn't first-year CS students study "how a computer works" at the
level of individual logic gates? After all, if you don't know how gates
work, things like address bus decoders, ALUs, register files, and the like
are all just magic (which you claim there is no room for).


It's magic if I'm curious but you can't answer
my questions. It's magic if I've to memorize
because I'm not *allowed* to understand.
It's not magic if I can (and naturally do) just
ignore it because I can accept it. It's not
magic if I don't have questions because it's
for me "obvious" enough.
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.


At some point, you need to draw a line in the sand (so to speak) and say,
"I understand everything down to *here* and can do cool stuff with that
knowledge. Below that, I'm willing to take on faith". I suspect you would
agree that's true, even if we don't agree just where the line should be
drawn. You seem to feel that the level of abstraction exposed by a
language like C is the right level. I'm not convinced you need to go that
far down. I'm certainly not convinced you need to start there.


I think that if you don't understand memory,
addresses and allocation and deallocation, or
(roughly) how an hard disk works and what's
the difference between hard disks and RAM then
you're going to be a horrible programmer.

There's no way you will remember what is O(n),
what O(1) and what is O(log(n)) among containers
unless you roughly understand how it works.
If those are magic formulas you'll just forget
them and you'll end up writing code that is
thousands times slower than necessary.

If you don't understand *why* "C" needs malloc
then you'll forget about allocating objects.

Andrea

Jul 19 '05 #29
On Sun, 12 Jun 2005 21:52:12 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
I'm curious how you learned to program.
An HP RPN calculator, later a TI-57. Later an Apple ][.
With the Apple ][, after about one afternoon spent typing
in a basic program from a magazine, I gave up on
basic and started with 6502 assembler ("call -151"
was always how I started my computer sessions).
What path worked for you, and do you think it was
a wrong approach, or the right one?
I was fourteen, with no instructor, when home
computers in my city could be counted on the fingers
of one hand. Having an instructor, I suppose, would
have made me go incredibly faster. Knowing the
English language better at that time would have made
my life a lot easier too.
I think that anyway it was the right approach in
terms of "path", if not the (minimal energy) approach
in terms of method. Surely a lower-energy one in
the long run compared to those that started with
basic and never looked at lower levels.
In my case, I started with BASIC. Good old BASIC, with no memory
management to worry about, no pointers, no "concrete" details, just FOR
loops and variables and lots of PRINT statements.
That's good as an appetizer.
A while (some months) later I stumbled across some assembly language and
-- typing it into the computer like a monkey, with no idea what I was
dealing with -- began learning about some of the more concrete aspects
of computers.
That is IMO a very good starting point. Basically it
was the same one I used.
This worked very well in my case, and I strongly doubt I would have
stayed interested in an approach that started with talk of memory
addressing, bits and bytes, registers and opcodes and such.
I think that getting interested in *programming* is
important... it's like building with LEGOs, but at a
logical level. However that is just to get interest...
and a few months with BASIC is IMO probably too much.
But after you have a target (making computers do what
you want) then you have to start placing solid bricks,
and that is IMO assembler. Note that I think that any
simple assembler is OK... even if you'll end up using
a different processor when working in C, it will be
roughly OK. But I see a difference between those that
never (really) saw assembler and those that did.
I won't say that I'm certain about any of this, but I have a very strong
suspicion that the *best* first step in learning programming is a
program very much like the following, which I'm pretty sure was mine:

10 FOR A=1 TO 10: PRINT"Peter is great!": END
Just as motivation. After that, *FORGETTING* it
(the FOR, and the NEXT you missed) is IMO perfectly OK.
More importantly by far, *I made the computer do something*.


Yes, I agree. But starting from BASIC and never looking
lower is quite a different idea.

Andrea
Jul 19 '05 #30
On Sun, 12 Jun 2005 20:22:28 -0400, Roy Smith wrote:
At some point, you need to draw a line in the sand (so to speak) and say,
"I understand everything down to *here* and can do cool stuff with that
knowledge. Below that, I'm willing to take on faith". I suspect you would
agree that's true, even if we don't agree just where the line should be
drawn. You seem to feel that the level of abstraction exposed by a
language like C is the right level. I'm not convinced you need to go that
far down. I'm certainly not convinced you need to start there.


The important question is, what are the consequences of that faith when it
is mistaken?

As a Python developer, I probably won't write better code if I understand
how NAND gates work or the quantum mechanics of electrons in solid
crystals. But I will write better code if I understand how Python
implements string concatenation, implicit conversion from ints to longs,
floating point issues, etc.

It seems that hardly a day goes by without some newbie writing to the
newsgroup complaining that "Python has a bug" because they have discovered
that the floating point representation of 0.1 in decimal is actually more
like 0.10000000000000001. And let's not forget the number of bugs out
there because developers thought that they didn't need to concern
themselves with the implementation details of memory management.
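
(The whole "bug" fits in a two-line interpreter session; the digits come
from IEEE double precision itself, not from anything Python adds, and the
format string just forces enough decimal places to show them:)

    >>> '%.17f' % 0.1
    '0.10000000000000001'
    >>> 0.1 + 0.2 == 0.3
    False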

It makes a difference whether your algorithm runs in constant time,
linear, quadratic, logarithmic or exponential time -- or something even
slower. The implementation details of the language can hide quadratic or
exponential algorithms in something that looks like a linear or constant
algorithm. Premature optimization is a sin... but so is unusably slow
code.
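
A classic, purely illustrative example of a quadratic algorithm hiding
inside linear-looking code is building a big string by repeated
concatenation (whether a particular CPython release happens to optimize
the first form is exactly the kind of implementation detail in question;
the function names are invented):

    def build_report_slow(lines):
        # Each += may copy everything accumulated so far,
        # so total work can grow roughly as O(n**2) in the output size.
        text = ""
        for line in lines:
            text += line + "\n"
        return text

    def build_report_fast(lines):
        # join() assembles the result in a single pass: O(n).
        return "".join(line + "\n" for line in lines)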

--
Steven.
Jul 19 '05 #31
"Andrea Griffini" wrote:
I think that if you don't understand memory,
addresses and allocation and deallocation, or
(roughly) how a hard disk works and what's
the difference between hard disks and RAM, then
you're going to be a horrible programmer.

There's no way you will remember what is O(n),
what is O(1) and what is O(log(n)) among containers
unless you roughly understand how they work.


There's a crucial distinction between these two scenarios though: the
first one has to do with today's hardware and software limitations
while the second one expresses fundamental, independent of technology,
algorithmic properties. In the not-too-far-future, the difference
between RAM and hard disks may be less important than today; hard disks
may be fast enough for most purposes, or the storage policy may be
mainly decided by the OS, the compiler, the runtime system or a library
instead of the programmer (similarly to memory management being
increasingly based on garbage collection). As programmers today don't
have to know or care much about register allocation, future programmers
may not have to care about whether something is stored in memory or in
disk. OTOH, an algorithm or problem with exponential complexity will
always be intractable for sufficiently large input, no matter how fast
processors become. The bottom line is that there is both fundamental
and contemporary knowledge, and although one needs to be good at both
at any given time, it's useful to distinguish between them.

George

Jul 19 '05 #32
Andrea Griffini <ag****@tin.it> writes:
In short, you're going to start in the middle.
I've got "bad" news for you. You're always in the
middle :-D.


That's what I just said.
Is it really justified to confuse them all
by introducing what are really extraneous details early on?


I simply say that you will not be able to avoid
introducing them. If they're going to write software,
those are not "details" that you'll be able to hide
behind a nice and perfect virtual world (this is much
less true of bus cycles... at least for many
programmers).


I disagree. If you're going to make competent programmers of them,
they need to know the *cost* of those details, but not necessarily the
actual details themselves. It's enough to know that malloc may lead to
a context switch; you don't need to know how malloc actually works.
But if you need to introduce them, then IMO it is
way better to do it *first*, because that is the
way our brain works.
That's the way *your* brain works. I'd not agree that mine works that
way. Then again, proving either statement is an interesting
proposition.

You've stated your opinion. Personally, I agree with Abelson, Sussman
and Sussman, whose text "The Structure and Interpretation of Computer
Programs" was the standard text at one of the premiere engineering
schools in the world, and is widely regarded as a classic in the
field: they decided to start with the abstract, and deal with concrete
issues - like assignment(!) later.


Sure. I know that many think that starting from
higher levels is better. However no explanation is
given about *why* this should work better, and I
haven't even seen objective studies about how this
approach pays off. This is of course not a field
that I've investigated a lot.


The explanation has been stated a number of times: because you're
letting them worry about learning how to program, before they worry
about learning how to evaluate the cost of a particular
construct. Especially since the latter depends on implementation
details, which are liable to have to be relearned for every different
platform.
What I know is that every single competent programmer
I know (not many... just *EVERY SINGLE ONE*) started
by placing firmly concrete concepts first, and then
moved on higher abstractions (for example like
structured programming, OOP, functional languages ...).


I don't normally ask how people learned to program, but I will observe
that most of the CS courses I've been involved with put aside concrete
issues - like memory management - until later in the course, when it
was taught as part of an OS internals course. The exception would be
those who were learning programming as part of an engineering (but not
software engineering) curriculum. The least readable code examples
almost uniformly came from the latter group.

<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
Jul 19 '05 #33
On Mon, Jun 13, 2005 at 06:13:13AM +0000, Andrea Griffini wrote:
Andrea Griffini <ag****@tin.it> writes:
So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.

Yep. Probably. Without a basic understanding of hardware design, one cannot
understand many of today's artifacts: like longer pipelines and what they
mean for the relative performance of different solutions.

Or how does one explain that a "stupid and slow" algorithm can in effect
be faster than a "clever and fast" algorithm, without explaining how a
cache works and what kinds of caches there are? (I've seen documented
cases where a stupid search was faster because all the hot data fit into
the L1 cache of the CPU, while more clever algorithms were slower.)

So yes, one needs a basic understanding of hardware, so that one can
understand the design of "assembly". And without knowledge of these
you get C programmers that do not really understand what their
programs do. (Be it related to calling sequences, portability of their
code, etc.) Again, you can sometimes see developers who pose questions
that suggest they do not know about the low level. (Example from a
current project: storing booleans in a struct bit-field so that it's
faster. Obviously such a person has never seen the code needed to
manipulate bit fields on most architectures.)

A good C programmer needs to know about assembly, libc (stuff like
malloc and friends and the kernel API).

Now a good python programmer needs to know at least a bit about the
implementation of python. (Be it CPython or Jython).

So yes, one needs to know the underlying layers, if not by heart, then
at least at an "I-know-which-book-to-consult" level.

Or you get perfect abstract designs, that are horrible when
implemented.
Not really. Long ago I drew a line that starts at
software. I think you can be a reasonable programmer
even without the knowledge of how to design hardware.

Well, IMHO one needs to know at least a bit. But one doesn't need to
know it well enough to be able to design hardware by himself. ;)
I do not think you can be a reasonable programmer if
you never saw assembler.
Admittedly, at some level the details simply stop mattering. But where
that level is depends on what level you're working on. Writing Python,
I really don't need to understand the behavior of hardware


Yes. But, for example, to understand the memory behaviour of Python,
understanding C + malloc + the OS APIs involved is helpful.
gates. Writing horizontal microcode, I'm totally f*cked if I don't
understand the behavior of hardware gates.


But you had better understand how, more or less, your
computer or language works, otherwise your code will
be needlessly a thousand times slower and will require
a thousand times more memory than necessary.
Look at a recent thread where someone was asking why
Python was so slow (and the code contained stuff
like "if x in range(low, high):" in an inner loop
that was itself pointless).
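
To make that concrete, here is a minimal sketch of the kind of rewrite
that fixes it (invented function names; for integer values the two are
equivalent, but in the Python of the day range() built a whole list on
every pass just to answer the membership test):

    def count_in_window_slow(values, low, high):
        # Builds a fresh range (a whole list, back then) on every
        # iteration just to test membership.
        count = 0
        for x in values:
            if x in range(low, high):
                count += 1
        return count

    def count_in_window_fast(values, low, high):
        # A plain comparison answers the same question with no allocation.
        count = 0
        for x in values:
            if low <= x < high:
                count += 1
        return count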


Andreas
Jul 19 '05 #34
Andrea Griffini schrieb:
On Sat, 11 Jun 2005 21:52:57 -0400, Peter Hansen <pe***@engcorp.com>
wrote:

I think new CS students have more than enough to learn with their
*first* language without having to discover the trials and tribulations
of memory management (or those other things that Python hides so well).

I'm not sure that postponing learning what memory
is, what a pointer is and others "bare metal"
problems is a good idea.


I think Peter is right. Proceeding top-down is the natural way of
learning (first learn about plants, then proceed to cells, molecules,
atoms and elementary particles). If you learn a computer language
you have to know about variables, of course. You have to know that
they are stored in memory. It is even useful to know about variable
address and variable contents but this doesn't mean that you have
to know about memory management. MM is a low level problem that has
to do with the internals of a computer system and shouldn't be part
of a first *language* course.

The concepts of memory, data and addresses can easily be demonstrated
in high level languages including python e.g. by using a large string
as a memory model. Proceeding to bare metal will follow driven by
curiosity.

--
-------------------------------------------------------------------
Peter Maas, M+R Infosysteme, D-52070 Aachen, Tel +49-241-93878-0
E-mail 'cGV0ZXIubWFhc0BtcGx1c3IuZGU=\n'.decode('base64')
-------------------------------------------------------------------
Jul 19 '05 #35
On Sun, 12 Jun 2005, Roy Smith wrote:
Andrea Griffini <ag****@tin.it> wrote:
I think that for a programmer skipping the understanding of the
implementation is just impossible: if you don't understand how a
computer works you're going to write pretty silly programs. Note that
I'm not saying that one should understand every possible implementation
down to the bit (that's of course nonsense), but there should be no
room for "magic" in a computer for a professional programmer.


How far down do you have to go? What makes bytes of memory, data busses,
and CPUs the right level of abstraction?

Why shouldn't first-year CS students study "how a computer works" at the
level of individual logic gates? After all, if you don't know how gates
work, things like address bus decoders, ALUs, register files, and the like
are all just magic (which you claim there is no room for).

Digging down a little deeper, a NAND gate is magic if you don't know how a
transistor works or can't do basic circuit analysis. And transistors are
magic until you dig down to the truly magical stuff that's going on with
charge carriers and electric fields inside a semiconductor junction.
That's about where my brain starts to hurt, but it's also where the quantum
mechanics are just getting warmed up.


It's all true - i wouldn't be the shit-hot programmer i am today if i
hadn't done that A-level physics project on semiconductors.

tom

--
Think logical, act incremental
Jul 19 '05 #36
On Sun, 12 Jun 2005, Peter Hansen wrote:
Andrea Griffini wrote:
On Sat, 11 Jun 2005 21:52:57 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
I think new CS students have more than enough to learn with their
*first* language without having to discover the trials and
tribulations of memory management (or those other things that Python
hides so well).
I'm not sure that postponing learning what memory is, what a pointer is
and others "bare metal" problems is a good idea. ... I think that for a
programmer skipping the understanding of the implementation is just
impossible: if you don't understand how a computer works you're going
to write pretty silly programs.


I won't say that I'm certain about any of this, but I have a very strong
suspicion that the *best* first step in learning programming is a program
very much like the following, which I'm pretty sure was mine:

10 FOR A=1 TO 10: PRINT"Peter is great!": END


10 PRINT "TOM IS ACE"
20 GOTO 10

The first line varies, but i suspect the line "20 GOTO 10" figures
prominently in the early history of a great many programmers.
More importantly by far, *I made the computer do something*.


Bingo. When you realise you can make the computer do things, it
fundamentally changes your relationship with it, and that's the beginning
of thinking like a programmer.

tom

--
Think logical, act incremental
Jul 19 '05 #37
On Mon, 13 Jun 2005, Andrea Griffini wrote:
On Sun, 12 Jun 2005 20:22:28 -0400, Roy Smith <ro*@panix.com> wrote:
Also concrete->abstract shows a clear path; starting in the middle and
looking both up (to higher abstractions) and down (to the
implementation details) is IMO much more confusing.


At some point, you need to draw a line in the sand (so to speak) and
say, "I understand everything down to *here* and can do cool stuff with
that knowledge. Below that, I'm willing to take on faith".


I think that if you don't understand memory, addresses and allocation
and deallocation, or (roughly) how a hard disk works and what's the
difference between hard disks and RAM, then you're going to be a horrible
programmer.

There's no way you will remember what is O(n), what is O(1) and what is
O(log(n)) among containers unless you roughly understand how they work.
If those are magic formulas you'll just forget them and you'll end up
writing code that is thousands of times slower than necessary.


I don't buy that. I think there's a world of difference between knowing
what something does and how it does it; a black-box view of the memory
system (allocation + GC) is perfectly sufficient as a basis for
programming using it. That black-box view should include some idea of how
long the various operations take, but it's not necessary to understand how
it works, or even how pointers work, to have this.

tom

--
Think logical, act incremental
Jul 19 '05 #38
Andrea Griffini <ag****@tin.it> wrote:
There's no way you will remember what is O(n),
what is O(1) and what is O(log(n)) among containers
unless you roughly understand how they work.


People were thinking about algorithmic complexity before there was random
access memory. Back in the unit record equipment (i.e. punch card) days,
people were working out the best ways to sort and merge decks of punch
cards with the fewest trips through the sorting machine. Likewise for data
stored on magnetic tape.

I can certainly demonstrate algorithmic complexity without ever going
deeper than the level of abstraction exposed by Python. You can learn
enough Python in an afternoon to write a bubble sort and start learning
about O(2) behavior without even knowing what a memory address is.

Somebody mentioned that string addition in Python leads to O(2) behavior.
Yes it does, but that's more an artifact of how Guido decided he wanted
strings to work than anything fundamental about memory allocation. He
could have taken a different design path and made Python strings more like
STL vectors, in which case string addition would be O(n). Teaching that
"string addition is O(2)" is not only needlessly confusing for somebody
just starting out, it's also wrong (or at best, a specific case valid for
one particular implementation).

And, BTW, I started out programming on a big HP desktop calculator
(http://www.hpmuseum.org/hp9810.htm). Next came BASIC. Then Fortran and
assembler on a pdp-10. Then C, a couple of years later. After that, I've
lost track. Some of the languages that taught me the most were ones that
got very far away from the hardware. NewtonScript was my first
introduction to OOPL, and PostScript showed me that stack languages aren't
just for calculators. Lisp, of course, expanded my mind in ways that only
Lisp can (the same could be said for many things I tried back in those
days). Even quirky HyperCard showed me a different way to think about
programming.

I think it's probably just as important for a CS major to play with those
mind-altering languages as it is to worry about bytes and pointers and
memory locations. But you can't start everywhere, and if you've got to
start someplace, Python let's you concentrate on the real universal
fundamentals of data structures, algorithms, and control flow without
getting bogged down in details.
Jul 19 '05 #39
Le Mon, 13 Jun 2005 07:53:03 -0400, Roy Smith a écrit :
Python lets you concentrate on the real universal
fundamentals of data structures, algorithms, and control flow without
getting bogged down in details.


+1 QOTW
Jul 19 '05 #40
On Mon, 13 Jun 2005, Roy Smith wrote:
O(2) behavior
Um ...
Lisp, of course, expanded my mind in ways that only Lisp can (the same
could be said for many things I tried back in those days).
Surely you're not saying you experimented with ... APL?
I think it's probably just as important for a CS major to play with
those mind-altering languages as it is to worry about bytes and pointers
and memory locations. But you can't start everywhere, and if you've got
to start someplace, Python lets you concentrate on the real universal
fundamentals of data structures, algorithms, and control flow without
getting bogged down in details.


Ah, so you've cleaned yourself up with Guido's Twelve-Step Plan. Amen to
that, brother!

tom

--
Why do we do it? - Exactly!
Jul 19 '05 #41
On Monday 13 June 2005 12:55 am, Andrea Griffini wrote:
On Sun, 12 Jun 2005 20:22:28 -0400, Roy Smith <ro*@panix.com> wrote:
How far down do you have to go? What makes bytes of memory, data busses,
and CPUs the right level of abstraction?


They're things that can IMO be genuinely accepted
as "obvious".


Hah!

Try explaining them to my non-programmer mother or
my 9-year-old son. On the other hand, telling them that
Python attaches a label (or name) to an object (which can
be "anything") was a cinch. Both want to program, but
are currently still struggling with basic concepts.

Interestingly, my son had no problem at all with the "name"
versus "variable" distinction -- that seems to be a case where
my C experience caused me problems, but it's a non-issue
coming from a tabula rasa perspective.

--
Terry Hancock ( hancock at anansispaceworks.com )
Anansi Spaceworks http://www.anansispaceworks.com

Jul 19 '05 #42
there should be no room for "magic" in a computer
for a professional programmer.

well put. sounds like the makings of a good signature...
Jul 19 '05 #43
> So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.
Some very good schools still believe that
Mike Meyer wrote:
Andrea Griffini <ag****@tin.it> writes:
On Sat, 11 Jun 2005 21:52:57 -0400, Peter Hansen <pe***@engcorp.com>
wrote:
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.


So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.

Admittedly, at some level the details simply stop mattering. But where
that level is depends on what level you're working on. Writing Python,
I really don't need to understand the behavior of hardware
gates. Writing horizontal microcode, I'm totally f*cked if I don't
understand the behavior of hardware gates.

In short, you're going to start in the middle. You can avoid looking
down if you avoid certain classes of problems - but not everyone will
be able to do that. Since you can only protect some of the students
from this extra confusion, is it really justified to confuse them all
by introducing what are really extraneous details early on?

You've stated your opinion. Personally, I agree with Abelson, Sussman
and Sussman, whose text "The Structure and Interpretation of Computer
Programs" was the standard text at one of the premiere engineering
schools in the world, and is widely regarded as a classic in the
field: they decided to start with the abstract, and deal with concrete
issues - like assignment(!) later.

<mike

*) "My favorite programming langauge is solder." - Bob Pease


Jul 19 '05 #44
> I don't buy that. I think there's a world of difference between knowing
what something does and how it does it; a black-box view of the memory
system (allocation + GC) is perfectly sufficient as a basis for
programming using it. That black-box view should include some idea of how
long the various operations take, but it's not necessary to understand how
it works, or even how pointers work, to have this.

Maybe you should say programming at the application level. Also, you will
notice from this newsgroup that people are sometimes trying to figure out
how to optimize the speed of their application; you will also notice that
the answers they get usually involve how Python is implemented in their
specific environment.
I like analogies: twice in my career, at two different companies in two
different industries, it was decided that the application should be
prototyped on workstations and then ported to the embedded environment.
Both times, the speed/size of the "ported" code was so bad that it was
virtually unusable. It was then decided to spend some time (it took years)
optimizing the system(s) to make it feasible.

I am convinced that if some of the target constraints had been taken into
consideration from the beginning:
1) much less time would have been spent in the optimization process.
2) the architecture of the final piece of code would have been cleaner.

Assuming I am correct, this implies that the folks working on the initial
prototypes should fully understand the constraints of a realtime embedded
environment (and that includes memory management)

Regards,

Philippe


Tom Anderson wrote:
On Mon, 13 Jun 2005, Andrea Griffini wrote:
On Sun, 12 Jun 2005 20:22:28 -0400, Roy Smith <ro*@panix.com> wrote:
Also concrete->abstract shows a clear path; starting in the middle and
looking both up (to higher abstractions) and down (to the
implementation details) is IMO much more confusing.

At some point, you need to draw a line in the sand (so to speak) and
say, "I understand everything down to *here* and can do cool stuff with
that knowledge. Below that, I'm willing to take on faith".


I think that if you don't understand memory, addresses and allocation
and deallocation, or (roughly) how a hard disk works and what's the
difference between hard disks and RAM, then you're going to be a horrible
programmer.

There's no way you will remember what is O(n), what is O(1) and what is
O(log(n)) among containers unless you roughly understand how they work.
If those are magic formulas you'll just forget them and you'll end up
writing code that is thousands of times slower than necessary.


I don't buy that. I think there's a world of difference between knowing
what something does and how it does it; a black-box view of the memory
system (allocation + GC) is perfectly sufficient as a basis for
programming using it. That black-box view should include some idea of how
long the various operations take, but it's not necessary to understand how
it works, or even how pointers work, to have this.

tom


Jul 19 '05 #45
Andrea Griffini a écrit :
(snip)
What I know is that every single competent programmer
I know (not many... just *EVERY SINGLE ONE*) started
by placing firmly concrete concepts first, and then
moved on higher abstractions (for example like
structured programming, OOP, functional languages ...).


I don't know if I qualify as a "competent programmer" (you'd have to ask
my co-workers), but I started with high-level scripting languages,
event-driven programming, and OO. Only then did I learn lower-level
languages (C, Pascal, and bits of 68k assembly). Being familiar with
fundamental *programming* concepts like vars, branching, looping and
functions proved to be helpful when learning C, since I then only had to
focus on pointers and memory management.
Jul 19 '05 #46
In article <OP***************@newssvr30.news.prodigy.com>,
Philippe C. Martin <ph******@philippecmartin.com> wrote:
So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.


Some very good schools still believe that


Historically, CS departments either grew out of EE departments or Math
departments. That happened back in the 60's and 70's, but I suspect
they still tend to show their roots in how they teach.
Jul 19 '05 #47
Peter Maas wrote:
I think Peter is right. Proceeding top-down is the natural way of
learning (first learn about plants, then proceed to cells, molecules,
atoms and elementary particles).


Why in the world is that way "natural"? I could see how biology
could start from molecular biology - how hereditary and self-regulating
systems work at the simplest level - and using that as the scaffolding
to describe how cells and multi-cellular systems work.

Plant biology was my least favorite part of my biology classes. In
general I didn't like the "learn the names of all these parts" approach
of biology. Physics, with its more directly predictive view of the world,
was much more interesting. It wasn't until college when I read some
Stephen J. Gould books that I began to understand that biology was
different than "'the mitochondria is the powerhouse of the cell', here's
the gall bladder, that plant's a dicot, this is a fossilized trilobite."

Similarly, programming is about developing algorithmic thought.
A beginner oriented programming language should focus on that, and
minimize the other details.

Restating my belief in a homologous line: proceeding from simple to
detailed is the most appropriate way of learning. Of course in some
fields even the simplest form takes a long time to understand, but
programming isn't string theory.

Andrew
da***@dalkescientific.com

Jul 19 '05 #48
On Mon, 13 Jun 2005 09:22:55 +0200, Andreas Kostyrka
<an*****@kostyrka.org> wrote:
Yep. Probably. Without a basic understanding of hardware design, one cannot
understand many of today's artifacts: like longer pipelines and what they
mean for the relative performance of different solutions.
I think that pipeline stalls, CPU/FPU parallel computations
and cache access optimization is the lowest level I ever
had to swim in (it was when I was working in the videogame
industry, on software 3D rendering with early pentiums).
Something simpler but somewhat similar was writing on
floppy disks on the Apple ][ where there was no timer
at all in the computer excluding the CPU clock and the
code for writing was required to output a new nibble for
the writing latch exactly every 40 CPU cycles (on the
Apple ][ the CPU was doing everything, including
controlling the stepper motor for disk seek).

However I do not think that going this low (that is still
IMO just a bit below assembler and still quite a bit higher than
HW design) is very common for programmers.
Or how does one explain that a "stupid and slow" algorithm can in effect
be faster than a "clever and fast" algorithm, without explaining how a
cache works and what kinds of caches there are? (I've seen documented
cases where a stupid search was faster because all the hot data fit into
the L1 cache of the CPU, while more clever algorithms were slower.)
Caching is indeed very important, and sometimes the difference
is huge. I think anyway that it's probably something confined
to a few cases (processing big quantities of data with simple
algorithms, e.g. pixel processing).
It's also a field where, if you care about the details, the
specific architecture plays an important role, and anything
you learned about, say, the Pentium III could be completely
pointless on the Pentium 4.

Except for general locality rules, I would say that everything
else should be checked only if necessary and on a case-by-case
basis. I'm way too timid an investor to throw neurons
at such volatile knowledge.
Or you get perfect abstract designs, that are horrible when
implemented.
The current trend is that you don't even need to do a
clear design. Just draw some bubbles and arrows on
a whiteboard with a marker, throw in some buzzwords
and presto! you've basically completed the new killer app.
Real design and implementation are minutiae for bozos.

Even the mighty Python is incredibly less productive
than PowerPoint ;-)
Yes. But for example to understand the memory behaviour of Python
understanding C + malloc + OS APIs involved is helpful.


This is a key issue. If you have the basics firmly in place,
most of what follows will be obvious. If someone tells
you that inserting an element at the beginning of an array
is O(n) in the number of elements then you think "uh... ok,
sounds reasonable"; if they say that it's amortized O(1)
instead then you say "wow..." and after some thinking
"ok, I think I understand how it could be done", and in
both cases you'll remember it. It's a clear *concrete* fact
that I think just cannot be forgotten.

If O(1) and O(n) and how dynamic arrays could possibly
be implemented are just black magic and a few words in a
text for you, then IMO you'll never be able to remember
what that implies, and sooner or later you'll do something
really, really stupid about it in your programs.
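
Here is a toy sketch of the idea (a made-up class, using a plain Python
list as raw storage and the textbook doubling strategy), showing why
appends can be amortized O(1) while insertion at the front stays O(n):

    class DynArray:
        # Toy dynamic array: grow by doubling, so appends are amortized O(1).
        def __init__(self):
            self._store = [None] * 4   # preallocated slots
            self._size = 0

        def append(self, value):
            if self._size == len(self._store):
                # The occasional O(n) copy into a store twice as big happens
                # rarely enough that the *average* cost per append stays
                # constant; that is what "amortized O(1)" means.
                self._store = self._store + [None] * len(self._store)
            self._store[self._size] = value
            self._size += 1

        def insert_front(self, value):
            # Every existing element has to shift one slot: always O(n).
            self.append(None)
            for i in range(self._size - 1, 0, -1):
                self._store[i] = self._store[i - 1]
            self._store[0] = value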

Andrea
Jul 19 '05 #49
On Mon, 13 Jun 2005 13:35:00 +0200, Peter Maas <pe***@somewhere.com>
wrote:
I think Peter is right. Proceeding top-down is the natural way of
learning.
It depends on whether you want to build or investigate.

For building, top-down is the wrong approach (basically because
there's no top). Top-down is however great for *explaining*
what you already built or know.
(first learn about plants, then proceed to cells, molecules,
atoms and elementary particles).
This is investigating. Programming is more like building
instead (with very few exceptions). CS is not like physics or
chemistry or biology, where you're given a result (the world)
and you're looking for the unknown laws. In programming *we*
are building the world. This is a huge fundamental difference!
If you learn a computer language you have to know about variables,
of course.
There are no user defined variables in assembler.
Registers of a CPU or of a programmable calculator
are easier to understand because they're objectively
simpler concepts. Even things like locality of scope
will be appreciated and understood better once you
try to live with just a global scope for a while.
The concepts of memory, data and addresses can easily be demonstrated
in high level languages including python e.g. by using a large string
as a memory model. Proceeding to bare metal will follow driven by
curiosity.


Hehehe... a large Python string is a nice idea for modelling
memory. This shows clearly what I mean: without a firm
understanding of the basics you can make pretty huge and stupid
mistakes (hint: strings are immutable in Python... ever
wondered what that fancy word means?)
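
Concretely (the exact traceback wording may vary between versions), the
naive "poke a byte into the string" operation simply raises, so every
write would mean building a whole new string; a list of characters is
the mutable model that actually behaves like poke-able memory:

    >>> mem = "\x00" * 16        # 16 "bytes" of memory modelled as a string
    >>> mem[3] = "\xff"          # try to poke one byte...
    Traceback (most recent call last):
      ...
    TypeError: 'str' object does not support item assignment
    >>> mem = list("\x00" * 16)  # a list of characters is mutable
    >>> mem[3] = "\xff"          # now the poke works in place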

Andrea
Jul 19 '05 #50
