Bytes | Software Development & Data Engineering Community
Interesting bug

I just fixed a bug that some of the correctness pedants around here may
find useful as ammunition.
The problem was that some code would, very occasionally, die with a
segmentation violation error. (Not as infrequent as some bugs that
have been discussed here in the past, but maybe once in an overnight
run of the program when it was configured to aggressively exercise the
section that the bug was in.) It was easy enough to trap the error
(using compiler features that are beyond the scope of this newsgroup)
and retry, and the retry would always work, but the glitch was still
annoying to me and one other correctness pedant I work with.

Turns out that the offending code looked something like this:

for(i=0;i<num;i++)
    if(check(array[i]))
        break;
if(check(array[i]))
    continue;

array was a pointer to a malloc'd array of num structs; check() was
an expression that compared a member of the struct to another value.
(If the values matched, we didn't need to do anything more in the
next-outer loop.)

But if the for loop exited normally (without hitting the break), the check
after it would attempt to read the value just past the end of the malloc'd
space.

My best guess is that normally this wasn't a problem: the bogus value was
Highly Unlikely to match what it was being checked against, and the program
was allowed to read that memory (and didn't try to write to it). But
occasionally the malloc'd space would end right at the top of the process's
memory space, and attempting to read a few bytes past it would access memory
that the process didn't own, and the OS would trap it.
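A minimal sketch of the corrected pattern (the struct and check() here are hypothetical stand-ins for the real code): the post-loop test looks at the index, never at array[i].

```c
#include <stddef.h>

struct item { int key; };

/* Hypothetical stand-in for the check() expression in the post. */
static int check(struct item it, int target)
{
    return it.key == target;
}

/* Returns 1 if some element matches, 0 otherwise, without ever
 * reading array[num]: after the loop we test the index i, not
 * array[i]. */
static int any_match(const struct item *array, size_t num, int target)
{
    size_t i;
    for (i = 0; i < num; i++)
        if (check(array[i], target))
            break;
    return i < num;
}
```

If the loop runs to completion, i == num and the function returns 0 without touching the element one past the end.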
This may be useful the next time somebody comes around and tries to
claim that "It works on my system" or something similar.
Comments?
dave

--
Dave Vandervies dj******@csclub.uwaterloo.ca
Welcome to comp.lang.c. People here are picky. That's good - it means you can
(generally) trust the answers you get - or at least, you can trust the answer
that emerges after a couple of days of bickering. --Richard Heathfield in CLC
Nov 14 '05 #1
Dave Vandervies <dj******@csclub.uwaterloo.ca> scribbled the following:
I just fixed a bug that some of the correctness pedants around here may
find useful as ammunition.
(snip)
Turns out that the offending code looked something like this:

for(i=0;i<num;i++)
    if(check(array[i]))
        break;
if(check(array[i]))
    continue;
(snip)
Comments?


As soon as I saw that code (and before I saw your explanation of what it
did) alarm bells went off in my head. "If that for loop exits normally,
array[i] will be out of bounds." I'm amazed none of your colleagues
managed to spot it. It's like a flashing red light and a siren saying
(effectively) "Danger, Will Robinson".

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"The question of copying music from the Internet is like a two-barreled sword."
- Finnish rap artist Ezkimo
Nov 14 '05 #2
[much snippage]
Dave Vandervies <dj******@csclub.uwaterloo.ca> scribbled the following:
for(i=0;i<num;i++)
    if(check(array[i]))
        break;
if(check(array[i]))
    continue;

In article <news:c6**********@oravannahka.helsinki.fi>
Joona I Palaste <pa*****@cc.helsinki.fi> writes:
As soon as I saw that code (and before I saw your explanation of what it
did) alarm bells went off in my head. "If that for loop exits normally,
array[i] will be out of bounds." I'm amazed none of your colleagues
managed to spot it. It's like a flashing red light and a siren saying
(effectively) "Danger, Will Robinson".


Errors are often a great deal easier to spot after all the irrelevant
distractions have been removed. :-)

It is also easy to read what *should* have been written, rather
than what was actually written. This is particularly true if the
debugging is being done by the original programmer.
--
In-Real-Life: Chris Torek, Wind River Systems
Salt Lake City, UT, USA (40°39.22'N, 111°50.29'W) +1 801 277 2603
email: forget about it http://web.torek.net/torek/index.html
Reading email is like searching for food in the garbage, thanks to spammers.
Nov 14 '05 #3
In article <c6**********@oravannahka.helsinki.fi>,
Joona I Palaste <pa*****@cc.helsinki.fi> wrote:
Dave Vandervies <dj******@csclub.uwaterloo.ca> scribbled the following:
I just fixed a bug that some of the correctness pedants around here may
find useful as ammunition.


(snip)
Turns out that the offending code looked something like this:

for(i=0;i<num;i++)
    if(check(array[i]))
        break;
if(check(array[i]))
    continue;


(snip)
Comments?


As soon as I saw that code (and before I saw your explanation of what it
did) alarm bells went off in my head. "If that for loop exits normally,
array[i] will be out of bounds." I'm amazed none of your colleagues
managed to spot it. It's like a flashing red light and a siren saying
(effectively) "Danger, Will Robinson".


So was I, once I found it. (It's incredible how much easier these
problems are to find when you've narrowed them down to a few lines of
code.) The problem was that we had another 2500 lines of code in that
module that were doing all sorts of interesting things with pointers,
so we were looking for problems with losing track of the pointers there,
not looking for things like this one. (But I figured that neither
the newsgroup nor my employer would be terribly impressed if I posted
2500 lines of code in my description of the problem instead of just a
paraphrase of the snippet where the bug actually was.)

Once we had the time (because we were looking for something else in that
part of the code anyways) to add checkpoints every few lines and I saw
which checkpoints the problem was happening between, it took about five
minutes to identify and fix the problem.
dave

--
Dave Vandervies dj******@csclub.uwaterloo.ca
[S]ome of us take a little convincing that our bugs are in fact bugs, which
is why we have such delightfully energetic discussions on occasion.
--Richard Heathfield in comp.lang.c
Nov 14 '05 #4
Chris Torek <no****@torek.net> spoke thus:
Errors are often a great deal easier to spot after all the irrelevant
distractions have been removed. :-)


The most aggravating are stupid typos - I had a recent bug caused by
misspelling "error" as "errror" that had to be pointed out to me
because I could not grasp such a simple mistake ;)

--
Christopher Benson-Manica | I *should* know what I'm talking about - if I
ataru(at)cyberspace.org | don't, I need to know. Flames welcome.
Nov 14 '05 #5
Dave Vandervies wrote:
.... snip ...
Turns out that the offending code looked something like this:

for(i=0;i<num;i++)
    if(check(array[i]))
        break;
if(check(array[i]))
    continue;

array was a pointer to a malloc'd array of num structs; check()
was an expression that compared a member of the struct to another
value. (If the values matched, we didn't need to do anything more
in the next-outer loop.)

But if the for loop exited normally (without hitting the break), the
check after it would attempt to read the value just past the end of the
malloc'd space.


So why didn't the code read:

while (somecondition) {
    for (i = 0; i < num; i++)
        if (check(array[i])) break;
    if (i < num) continue;
    /* check(array[i]) must be false for all i < num */
}

--
A: Because it fouls the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?
Nov 14 '05 #6
In article <c6**********@chessie.cirr.com>,
Christopher Benson-Manica <at***@nospam.cyberspace.org> wrote:
Chris Torek <no****@torek.net> spoke thus:
Errors are often a great deal easier to spot after all the irrelevant
distractions have been removed. :-)


The most aggravating are stupid typos - I had a recent bug caused by
misspelling "error" as "errror" that had to be pointed out to me
because I could not grasp such a simple mistake ;)


I always wonder how many switch statements contain a

defualt:

label. And it is legal C as well.
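A minimal sketch of how the typo slips through (hypothetical function): `defualt:` is parsed as an ordinary statement label inside the switch, not a default case, so values matching no case skip the switch body entirely.

```c
/* "defualt:" compiles because it is an ordinary statement label,
 * not the default case -- so the switch silently has no default. */
static int classify(int x)
{
    switch (x) {
    case 1:
        return 10;
    defualt:              /* typo for "default:", yet legal C */
        return -1;        /* never reached except via goto */
    }
    return 0;             /* reached for every x except 1 */
}
```

The return -1 branch is unreachable through normal switch dispatch, which is exactly the bug.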
Nov 14 '05 #7
In article <40***************@yahoo.com>,
CBFalconer <cb********@yahoo.com> wrote:
Dave Vandervies wrote:

... snip ...

Turns out that the offending code looked something like this:

for(i=0;i<num;i++)
    if(check(array[i]))
        break;
if(check(array[i]))
    continue;

array was a pointer to a malloc'd array of num structs; check()
was an expression that compared a member of the struct to another
value. (If the values matched, we didn't need to do anything more
in the next-outer loop.)

But if the for loop exited normally (without hitting the break), the
check after it would attempt to read the value just past the end of the
malloc'd space.


So why didn't the code read:

while (somecondition) {
    for (i = 0; i < num; i++)
        if (check(array[i])) break;
    if (i < num) continue;
    /* check(array[i]) must be false for all i < num */
}


Sometimes the end condition is not that simple; the posted code seems to
be a condensation of several hundred lines down to the actual error.
But it would be simple to introduce a boolean variable that is set to
TRUE when you break from the loop, and I would probably do that in cases
when the logic of the loop itself gets complicated.
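A minimal sketch of the flag approach described here (hypothetical names; check() stands in for the comparison in the original post):

```c
#include <stdbool.h>
#include <stddef.h>

struct item { int key; };

/* Hypothetical stand-in for the comparison in the original post. */
static bool check(struct item it, int target)
{
    return it.key == target;
}

/* Flag variant: "found" records why the loop ended, so no expression
 * is ever re-evaluated at an out-of-bounds index after the loop. */
static bool find_match(const struct item *array, size_t num, int target)
{
    bool found = false;
    for (size_t i = 0; i < num; i++) {
        if (check(array[i], target)) {
            found = true;
            break;
        }
    }
    return found;
}
```

The flag costs one local variable but makes the exit reason explicit even when the loop body later grows more conditions.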
Nov 14 '05 #8
"Christian Bau" <ch***********@cbau.freeserve.co.uk> wrote in message
news:ch*********************************@slb-newsm1.svr.pol.co.uk...
In article <c6**********@chessie.cirr.com>,
Christopher Benson-Manica <at***@nospam.cyberspace.org> wrote:
Chris Torek <no****@torek.net> spoke thus:
Errors are often a great deal easier to spot after all the irrelevant
distractions have been removed. :-)


The most aggravating are stupid typos - I had a recent bug caused by
misspelling "error" as "errror" that had to be pointed out to me
because I could not grasp such a simple mistake ;)


I always wonder how many switch statements contain a

defualt:

label. And it is legal C as well.


But at least one compiler can pick it up...

% gcc -Wall default.c
default.c: In function `main':
default.c:8: warning: label `defualt' defined but not used

--
Peter
Nov 14 '05 #9
Christian Bau wrote:

In article <40***************@yahoo.com>,
CBFalconer <cb********@yahoo.com> wrote:
So why didn't the code read:

while (somecondition) {
    for (i = 0; i < num; i++)
        if (check(array[i])) break;
    if (i < num) continue;
    /* check(array[i]) must be false for all i < num */
}


Sometimes the end condition is not that simple; the posted code seems to
be several hundred lines of code, condensed down to the actual error.
But it would be simple to introduce a boolean variable that is set to
TRUE when you break from the loop, and I would probably do that in cases
when the logic of the loop itself gets complicated.


This is an irksome weakness of C (and most other languages
I've programmed in): A loop with multiple termination conditions
gives no direct evidence of which condition actually caused it
to terminate. You find yourself at the statement after the loop
with no notion of how you got there, and you usually wind up
either re-testing a condition already tested (ick) or testing
a state flag that summarizes the earlier test (ick again).

Some languages have an "alternate exit" formalism for such
situations, but in most that I have encountered it looked an
awful lot like an unrestricted goto. Some languages have
"exception" mechanisms, but it's usually been a fairly heavy-
weight construct, too expensive for routine use ("exceptional"
shouldn't be "routine").

However, C lacks such a facility. It follows that nearly
every loop with multiple termination conditions should be
followed promptly by a test; if there's no test, there's most
likely a bug.

--
Er*********@sun.com
Nov 14 '05 #10
Eric Sosman wrote:
Christian Bau wrote:
CBFalconer <cb********@yahoo.com> wrote:
So why didn't the code read:

while (somecondition) {
    for (i = 0; i < num; i++)
        if (check(array[i])) break;
    if (i < num) continue;
    /* check(array[i]) must be false for all i < num */
}


Sometimes the end condition is not that simple; the posted code
seems to be several hundred lines of code, condensed down to the
actual error. But it would be simple to introduce a boolean
variable that is set to TRUE when you break from the loop, and I
would probably do that in cases when the logic of the loop
itself gets complicated.


This is an irksome weakness of C (and most other languages
I've programmed in): A loop with multiple termination conditions
gives no direct evidence of which condition actually caused it
to terminate. You find yourself at the statement after the loop
with no notion of how you got there, and you usually wind up
either re-testing a condition already tested (ick) or testing
a state flag that summarizes the earlier test (ick again).

Some languages have an "alternate exit" formalism for such
situations, but in most that I have encountered it looked an
awful lot like an unrestricted goto. Some languages have
"exception" mechanisms, but it's usually been a fairly heavy-
weight construct, too expensive for routine use ("exceptional"
shouldn't be "routine").

However, C lacks such a facility. It follows that nearly
every loop with multiple termination conditions should be
followed promptly by a test; if there's no test, there's most
likely a bug.


The presence of this multiple termination in itself is a strong
hint that a dummy value is needed to simplify the loop condition,
followed by a simple test. In the case of the above it might have
been, with a suitable expansion of array[]:

while (somecondition) {
    array[num] = suitablevalue; i = 0;
    while (!check(array[i])) i++;
    if (i < num) continue;
    /* check(array[i]) must be false for all i < num */
}
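Made concrete, the sentinel idea might look like this minimal sketch (hypothetical names; it assumes the caller allocated room for num+1 elements, per the "suitable expansion of array[]" caveat):

```c
#include <stdbool.h>
#include <stddef.h>

struct item { int key; };

/* Sentinel variant: array must have room for num+1 elements.
 * Planting the target at array[num] guarantees the scan stops,
 * so the inner loop needs only one condition. */
static bool find_sentinel(struct item *array, size_t num, int target)
{
    size_t i = 0;
    array[num].key = target;        /* plant the sentinel */
    while (array[i].key != target)
        i++;
    return i < num;                 /* i == num: only the sentinel matched */
}
```

The trade-off is a write into the extra slot on every call, in exchange for a simpler loop condition.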

--
A: Because it fouls the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?
Nov 14 '05 #11
In <40***************@sun.com> Eric Sosman <Er*********@sun.com> writes:
This is an irksome weakness of C (and most other languages
I've programmed in): A loop with multiple termination conditions
gives no direct evidence of which condition actually caused it
to terminate. You find yourself at the statement after the loop
with no notion of how you got there, and you usually wind up
either re-testing a condition already tested (ick) or testing
a state flag that summarizes the earlier test (ick again).

Some languages have an "alternate exit" formalism for such
situations, but in most that I have encountered it looked an
awful lot like an unrestricted goto. Some languages have
"exception" mechanisms, but it's usually been a fairly heavy-
weight construct, too expensive for routine use ("exceptional"
shouldn't be "routine").

However, C lacks such a facility.
Nope, it's called goto.
It follows that nearly
every loop with multiple termination conditions should be
followed promptly by a test; if there's no test, there's most
likely a bug.


I have yet to understand this irrational fear of using goto where it is
called for (i.e. any place where it helps simplify the code structure).
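One common case where goto arguably earns its keep is escaping a nested loop; a minimal sketch with hypothetical names:

```c
#include <stddef.h>

struct item { int key; };

/* Hypothetical search over a 2-D table: goto gives a direct exit
 * from the nested loops, and which statement you land on tells
 * you exactly how the loops ended. */
static int find_in_table(struct item t[][4], size_t rows, int target)
{
    for (size_t r = 0; r < rows; r++)
        for (size_t c = 0; c < 4; c++)
            if (t[r][c].key == target)
                goto found;
    return 0;     /* fell out of both loops: no match */
found:
    return 1;     /* reached only via the goto */
}
```

Without goto, the same exit needs either a flag checked in both loop conditions or a re-test after the loops.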

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 14 '05 #12
Dan Pop wrote:

In <40***************@sun.com> Eric Sosman <Er*********@sun.com> writes:
This is an irksome weakness of C (and most other languages
I've programmed in): A loop with multiple termination conditions
gives no direct evidence of which condition actually caused it
to terminate. You find yourself at the statement after the loop
with no notion of how you got there, and you usually wind up
either re-testing a condition already tested (ick) or testing
a state flag that summarizes the earlier test (ick again).

Some languages have an "alternate exit" formalism for such
situations, but in most that I have encountered it looked an
awful lot like an unrestricted goto. Some languages have
"exception" mechanisms, but it's usually been a fairly heavy-
weight construct, too expensive for routine use ("exceptional"
shouldn't be "routine").

However, C lacks such a facility.


Nope, it's called goto.


Perhaps my use of "such a facility" was unclear. I meant
it to mean "a facility that solves the problem posed in the
first paragraph without the objections mentioned in the second."
It follows that nearly
every loop with multiple termination conditions should be
followed promptly by a test; if there's no test, there's most
likely a bug.


I have yet to understand this irrational fear of using goto where it is
called for (i.e. any place where it helps simplify the code structure).


My fear of goto is not irrational, but some consider it
unnatural: there is disagreement about whether zero is a
natural number. My first programming language was FORTRAN II,
which would probably have driven me to another field of endeavor
had I suffered from gotophobia.

--
Er*********@sun.com
Nov 14 '05 #13
In <40***************@sun.com> Eric Sosman <Er*********@sun.com> writes:
natural number. My first programming language was FORTRAN II,
which would probably have driven me to another field of endeavor
had I suffered from gotophobia.


Back then, gotophobia had not yet been invented ;-)

I started with FORTRAN IV, followed by several assembly languages: having
gotos with symbolic destinations (instead of F-IV's 1 to 5 digits) was
like being in programmers' heaven... (but I had to do all the assembling
by hand, having no access to any kind of hosted platform...).

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 14 '05 #14
Dan Pop <Da*****@cern.ch> scribbled the following:
In <40***************@sun.com> Eric Sosman <Er*********@sun.com> writes:
natural number. My first programming language was FORTRAN II,
which would probably have driven me to another field of endeavor
had I suffered from gotophobia.
Back then, gotophobia had not yet been invented ;-)

I started with FORTRAN IV, followed by several assembly languages: having
gotos with symbolic destinations (instead of F-IV's 1 to 5 digits) was
like being in programmers' heaven... (but I had to do all the assembling
by hand, having no access to any kind of hosted platform...).


Right on the money, Dan. I started programming on C=64 BASIC V2, which
is an extremely cut-down version of BASIC by 1980s standards. Back then
GOTO with a symbolic label rather than a hard-coded line number as the
destination was like science fiction...

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"'So called' means: 'There is a long explanation for this, but I have no
time to explain it here.'"
- JIPsoft
Nov 14 '05 #15
Joona I Palaste a écrit :
Dan Pop <Da*****@cern.ch> scribbled the following:
In <40***************@sun.com> Eric Sosman <Er*********@sun.com> writes:
natural number. My first programming language was FORTRAN II,
which would probably have driven me to another field of endeavor
had I suffered from gotophobia.


Back then, gotophobia had not yet been invented ;-)


I started with FORTRAN IV, followed by several assembly languages: having
gotos with symbolic destinations (instead of F-IV's 1 to 5 digits) was
like being in programmers' heaven... (but I had to do all the assembling
by hand, having no access to any kind of hosted platform...).

Right on the money, Dan. I started programming on C=64 BASIC V2, which
is an extremely cut-down version of BASIC by 1980s standards. Back then
GOTO with a symbolic label rather than a hard-coded line number as the
destination was like science fiction...


Gotophobia started in the sixties. Dijkstra's paper¹ was published in
1968 and referred to older papers published in 1966. Fortran II and IV
appeared respectively in 1958 and 1962 so their ignorance is forgivable,
whereas C-64 Basic v2 (and many other Basics of this time) appeared when
Dijkstra's paper had irremediably changed minds.

¹ Go To Statement Considered Harmful. Communications of the ACM, Vol.
11, No. 3, March 1968, pp. 147-148. Available here :
http://www.acm.org/classics/oct95/

--
Richard
Nov 14 '05 #16
Richard Delorme <ab****@nospam.fr> scribbled the following:
Joona I Palaste a écrit :
Dan Pop <Da*****@cern.ch> scribbled the following:
In <40***************@sun.com> Eric Sosman <Er*********@sun.com> writes:
natural number. My first programming language was FORTRAN II,
which would probably have driven me to another field of endeavor
had I suffered from gotophobia.
Back then, gotophobia had not yet been invented ;-)

I started with FORTRAN IV, followed by several assembly languages: having
gotos with symbolic destinations (instead of F-IV's 1 to 5 digits) was
like being in programmers' heaven... (but I had to do all the assembling
by hand, having no access to any kind of hosted platform...).


Right on the money, Dan. I started programming on C=64 BASIC V2, which
is an extremely cut-down version of BASIC by 1980s standards. Back then
GOTO with a symbolic label rather than a hard-coded line number as the
destination was like science fiction...

Gotophobia started in the sixties. Dijkstra's paper¹ was published in
1968 and referred to older papers published in 1966. Fortran II and IV
appeared respectively in 1958 and 1962 so their ignorance is forgivable,
whereas C-64 Basic v2 (and many other Basics of this time) appeared when
Dijkstra's paper had irremediably changed minds.

¹ Go To Statement Considered Harmful. Communications of the ACM, Vol.
11, No. 3, March 1968, pp. 147-148. Available here :
http://www.acm.org/classics/oct95/


Yes, but what were you going to do about it? C-64 Basic v2 had pretty
much the bare minimum of control structures. IF... THEN was only for
single lines, not blocks of lines, and there was no ELSE. The only loop
available was FOR... NEXT. Every other kind of loop had to be simulated
with IF... THEN GOTO. And like I said, GOTOs had to have hard-coded line
numbers as destinations. There was GOSUB... RETURN, which I think could
be used as "poor man's subroutines": they behaved otherwise like
subroutines but had no concept of local variable scope.
So it's not like you could write the kind of structured programming
Dijkstra would have wanted in C-64 Basic v2.
I have never programmed in Fortran, but I think it was about the same
situation with the Fortran version Dan Pop grew up with.

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"O pointy birds, O pointy-pointy. Anoint my head, anointy-nointy."
- Dr. Michael Hfuhruhurr
Nov 14 '05 #17
Joona I Palaste a écrit :
Richard Delorme <ab****@nospam.fr> scribbled the following:
Back then, gotophobia had not yet been invented ;-)
[...] I started programming on C=64 BASIC V2, which
is an extremely cut-down version of BASIC by 1980s standards.
[...] what were you going to do about it?


When reading the two previous sentences, it sounds to me as if
"gotophobia" was invented after the 1980s. This is why I tried to
re-establish the chronology.

--
Richard
Nov 14 '05 #18
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
Gotophobia started in the sixties. Dijkstra's paper¹ was published in
1968 and referred to older papers published in 1966. Fortran II and IV
appeared respectively in 1958 and 1962 so their ignorance is forgivable,
whereas C-64 Basic v2 (and many other Basics of this time) appeared when
Dijkstra's paper had irremediably changed minds.


The main purpose of many BASIC implementations was to make the best use
of the *limited* resources of 8-bit micros. One of the most popular
micros of the early eighties had 8k of ROM and 1k of RAM in its standard
configuration. Creative minds managed amazing feats on this
configuration.

By the time BASIC moved to relatively resource-rich PC's, it also acquired
structured programming features (and even dropped the line numbers that
were supposed to replace the need for a "sophisticated" text editor).

Then again, no matter how many minds Dijkstra's paper might have changed,
there is plenty of proof that well structured code can be written with
gotos and badly structured code without. The tool might help, but it
cannot replace the skill of the craftsman.

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 14 '05 #19
In <c6**********@oravannahka.helsinki.fi> Joona I Palaste <pa*****@cc.helsinki.fi> writes:
I have never programmed in Fortran, but I think it was about the same
situation with the Fortran version Dan Pop grew up with.


Nope, FORTRAN IV was much better than BASIC and the "new" logical IF
statements were a great improvement over FORTRAN II's arithmetic IF
statements (which were gotos with 3 destinations).

Each program unit (main unit, functions and subroutines) had its own
local variables and it could be made to "see" only the global variables
it needed to see.

Its main drawbacks, which survived until F90, were the fixed-form line
format with purely numeric labels (characters beyond column 72 were
silently ignored, to provide additional fun during debugging) and blanks
being *completely* ignored in the statement field (starting in column 7),
except inside "string" literals. Hence the (in)famous typo:

DO 5 I = 1. 5

(the period was supposed to be a comma) that led to the statement being
parsed as:

DO5I = 1.5

i.e. a loop was replaced by an assignment to a bogus variable and the body
of the loop was executed once instead of 5 times. This happened in a NASA
program, but the consequences were minimal (a certain entity was evaluated
with less precision than intended).

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 14 '05 #20
Chris Torek wrote:
[...]
Errors are often a great deal easier to spot after all the irrelevant
distractions have been removed. :-)

It is also easy to read what *should* have been written, rather
than what was actually written. This is particularly true if the
debugging is being done by the original programmer.


Your brain knows what the code is supposed to be doing, and so you
"see" code that should be behaving. It's not uncommon for someone
to be debugging their own code for hours, only to call someone else
over who spots the error in under 30 seconds. (I've been on both
sides of that scenario.)

--
+-------------------------+--------------------+-----------------------------+
| Kenneth J. Brody | www.hvcomputer.com | |
| kenbrody at spamcop.net | www.fptech.com | #include <std_disclaimer.h> |
+-------------------------+--------------------+-----------------------------+
Nov 14 '05 #21
Joona I Palaste wrote:
for(i=0;i<num;i++)
    if(check(array[i]))
        break;
if(check(array[i]))
    continue;


As soon as I saw that code alarm bells went off in my head.


Ditto. Bad code. Not interesting. :-|

--
|_ CJSonnack <Ch***@Sonnack.com> _____________| How's my programming? |
|_ http://www.Sonnack.com/ ___________________| Call: 1-800-DEV-NULL |
|_____________________________________________|_______________________|
Nov 14 '05 #22
Dan Pop a écrit :
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:

Gotophobia started in the sixties. Dijkstra's paper¹ was published in
1968 and referred to older papers published in 1966. Fortran II and IV
appeared respectively in 1958 and 1962 so their ignorance is forgivable,
whereas C-64 Basic v2 (and many other Basics of this time) appeared when
Dijkstra's paper had irremediably changed minds.

The main purpose of many BASIC implementations was to make the best use
of the *limited* resources of 8-bit micros. One of the most popular
micros of the early eighties had 8k of ROM and 1k of RAM in its standard
configuration. Creative minds managed amazing feats on this
configuration.


The Commodore 64 had 20K of ROM and 64K of RAM, so memory was not
all that limited.
By the time BASIC moved to relatively resource-rich PC's, it also acquired
structured programming features (and even dropped the line numbers that
were supposed to replace the need for a "sophisticated" text editor).
According to Thomas E. Kurtz (co-inventor of BASIC), line number
suppression and structured programming appeared in their Dartmouth BASIC
around 1975, in order to avoid "spaghetti code":
http://www.truebasic.com/downloads/D2001.pdf
Of course, in 1975, no resource-rich PC was available.
Then again, no matter how many minds Dijkstra's paper might have changed,
there is plenty of proof that well structured code can be written with
gotos and badly structured code without. The tool might help, but it
cannot replace the skill of the craftsman.


Certainly, but it is quite clear to me that Dijkstra's paper changed
Kurtz & Kemeny's minds so that they made their Dartmouth BASIC a fully
structured language, and that was several years before the release of
C-64 BASIC v2.

--
Richard
Nov 14 '05 #23
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Richard Delorme wrote:
Dan Pop a écrit :
In <40**********************@news.club-internet.fr> Richard Delorme
<ab****@nospam.fr> writes:

Gotophobia started in the sixties. Dijkstra's paper¹ was published in
1968 and referred to older papers published in 1966. Fortran II and IV
appeared respectively in 1958 and 1962 so their ignorance is
forgivable, whereas C-64 Basic v2 (and many other Basics of this
time) appeared when Dijkstra's paper had irremediably changed minds.


The main purpose of many BASIC implementations was to make the best use
of the *limited* resources of 8-bit micros. One of the most popular
micros of the early eighties had 8k of ROM and 1k of RAM in its standard
configuration. Creative minds managed amazing feats on this
configuration.

The commodore-64 had 20K of ROM and 64K of RAM, so the memory resource
was not that much limited.
By the time BASIC moved to relatively resource-rich PC's, it also
acquired
structured programming features (and even dropped the line numbers that
were supposed to replace the need for a "sophisticated" text editor).

According to Thomas E. Kurtz (co-inventor of BASIC), line number
suppression and structured programming appeared in their Dartmouth BASIC
around 1975, in order to avoid "spaghetti code":
http://www.truebasic.com/downloads/D2001.pdf
Of course, in 1975, no resource-rich PC was available.
Then again, no matter how many minds Dijkstra's paper might have changed,
there is plenty of proof that well structured code can be written with
gotos and badly structured code without. The tool might help, but it
cannot replace the skill of the craftsman.

Certainly, but it is quite clear to me that Dijkstra's paper changed
Kurtz & Kemeny minds so that they made their Dartmouth BASIC a fully
structured language, and that was several years before the release of
C-64 BASIC v2.


Although Dijkstra and K&K were influential in the field of programming at the
academic level, I believe that Dr. Dobbs was more influential on the formation
of C-64 Basic than they were.

Remember, until the Peoples Computer Company sponsored their "Tiny Basic"
contest (spawning "Doctor Dobb's Journal of Computer Calisthenics and
Orthodontia") there was /no/ BASIC for microcomputers. Since the source code
for about three or four "Tiny Basic"s was published by Dr. Dobbs, with the
license to install them, and enhance them, from their authors, many (I'd dare
say 'most') fuller-function microcomputer BASICs were derived from those
humble and restricted beginnings.
- --
Lew Pitcher
IT Consultant, Enterprise Application Architecture,
Enterprise Technology Solutions, TD Bank Financial Group

(Opinions expressed are my own, not my employers')
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.4 (MingW32)

iD8DBQFAiCSUagVFX4UWr64RAivrAJwMnxssqPjE/CERdnn7oQvZMNSXHgCgowaW
ORZdpx1jUt/ytUQxQIad5gA=
=Tyhx
-----END PGP SIGNATURE-----
Nov 14 '05 #24
Richard Delorme <ab****@nospam.fr> scribbled the following:
Dan Pop a écrit :
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
Gotophobia started in the sixties. Dijkstra's paper¹ was published in
1968 and referred to older papers published in 1966. Fortran II and IV
appeared in 1958 and 1962 respectively, so their ignorance is forgivable,
whereas C-64 Basic v2 (and many other Basics of this time) appeared when
Dijkstra's paper had irremediably changed minds.
The main purpose of many BASIC implementations was to make the best use
of the *limited* resources of 8-bit micros. One of the most popular
micros of the early eighties had 8k of ROM and 1k of RAM in its standard
configuration. Creative minds managed amazing feats on this
configuration.

The commodore-64 had 20K of ROM and 64K of RAM, so the memory resource
was not that much limited.


So what caused the language to be so rudimentary?
For another thing, when the Commodore 64 was released, it had by far
the best graphics and sound capabilities on the market - if only people
could get to actually *use* them. The included BASIC did nothing
whatsoever to support anything beyond simple text output. You couldn't
even change the background colour, only the text colour.
To actually make use of the graphics and sound, you had to have
knowledge of how the VIC and SID chips operated and how they were
interfaced to various hard-coded registers in the 64-kilobyte memory
space. In practice this meant a lot of code consisting solely of
seemingly obscure POKE commands and PEEK functions. This code was
basically a BASICised form of machine code.
What caused this? Why didn't the BASIC have in-built graphics and sound
support? Many other competing machines did have it. Was Commodore in a
rush to get the machine released so they did not have the time?

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"Holy Banana of this, Sacred Coconut of that, Magic Axolotl of the other."
- Guardian in "Jinxter"
Nov 14 '05 #25
Kenneth Brody <ke******@spamcop.net> writes:
Chris Torek wrote:
[...]
Errors are often a great deal easier to spot after all the irrelevant
distractions have been removed. :-)

It is also easy to read what *should* have been written, rather
than what was actually written. This is particularly true if the
debugging is being done by the original programmer.


Your brain knows what the code is supposed to be doing, and so you
"see" code that should be behaving. It's not uncommon for someone
to be debugging their own code for hours, only to call someone else
over who spots the error in under 30 seconds. (I've been on both
sides of that scenario.)


And sometimes explaining the code to someone else can help you find
the bug, even if the person you're explaining it to doesn't understand
a word you're saying.

I've read about a college computer lab that kept a toy stuffed bear
(or something similar). Students trying to figure out why their
programs were failing were expected to explain their code to the bear
before asking one of the lab assistants. Most of the time, talking to
the bear was enough to track down the problem.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
Schroedinger does Shakespeare: "To be *and* not to be"
Nov 14 '05 #26
Joona I Palaste wrote:
...Commodore 64...best graphics and sound capabilities on the market
Used intelligent "peripherals".
The included BASIC did nothing whatsoever to support anything beyond
simple text output. You couldn't even change the background colour,
only the text colour.
Are you sure about that? That doesn't match my memory.
Why didn't the BASIC have in-built graphics and sound support?


Wasn't there a SOUND() function and wasn't there Sprite support
at the BASIC level?

Or am I thinking of the C128 (I used both)?

--
|_ CJSonnack <Ch***@Sonnack.com> _____________| How's my programming? |
|_ http://www.Sonnack.com/ ___________________| Call: 1-800-DEV-NULL |
|_____________________________________________|___ ____________________|
Nov 14 '05 #27
Da*****@cern.ch (Dan Pop) writes:
In <c6**********@oravannahka.helsinki.fi> Joona I Palaste
<pa*****@cc.helsinki.fi> writes:
I have never programmed in Fortran, but I think it was about the same
situation with the Fortran version Dan Pop grew up with.


Nope, FORTRAN IV was much better than BASIC and the "new" logical IF
statements were a great improvement over FORTRAN II's arithmetic IF
statements (which were gotos with 3 destinations).

Each program unit (main unit, functions and subroutines) had its own
local variables and it could be made to "see" only the global variables
it needed to see.

Its main drawbacks, that survived until F90, were the fixed-form line
format with purely numeric labels (characters beyond column 72 were
silently ignored, to provide additional fun during debugging) and blanks
being *completely* ignored in the statement field (starting in column 7),
except inside "string" literals. Hence the (in)famous typo:

DO 5 I = 1. 5

(the period was supposed to be a comma) that led to the statement being
parsed as:

DO5I = 1.5

i.e. a loop was replaced by an assignment to a bogus variable and the body
of the loop was executed once instead of 5 times. This happened in a NASA
program, but the consequences were minimal (a certain entity was evaluated
with less precision than intended).


This is a classic bug story, about which a great deal of
misinformation has been propagated over the years. There's a good
explanation of this bug (and the unrelated programming glitch that led
to the loss of the Mariner 1 Venus probe) in an article posted to
comp.lang.fortran back in 1994.

http://groups.google.com/groups?hl=e...40news.cern.ch

http://tinyurl.com/32fmf

The article was posted by some guy named "Dan Pop".

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
Schroedinger does Shakespeare: "To be *and* not to be"
Nov 14 '05 #28
Joona I Palaste <pa*****@cc.helsinki.fi> wrote:
Right on the money, Dan. I started programming on C=64 BASIC V2, which
is an extremely cut-down version of BASIC by 1980s standards. Back then
GOTO with a symbolic label rather than a hard-coded line number as the
destination was like science fiction...


Yes, but what were you going to do about it? C-64 Basic v2 had pretty
much the bare minimum of control structures. IF... THEN was only for
single lines, not blocks of lines, and there was no ELSE. The only loop
available was FOR... NEXT. Every other kind of loop had to be simulated
with IF... THEN GOTO. And like I said, GOTOs had to have hard-coded line
numbers as destinations. There was GOSUB... RETURN which I think could
be used to have "poor man's subroutines", because they behaved
otherwise like subroutines but had no concept of local variable scope.


Don't forget the BASIC equivalent of "switch":

240 GOTO 250*(i=0) + 350*(i=1) + 380*(i=2) + .... + 3530*(i=51)

(this came from an adventure game where there were 50 or so possible
user actions, each having its own handling "function").
Luckily my micro's version of BASIC included the feature of expressions
evaluating to either 1 or 0 (pretty advanced for those days); the other
micros had to do something more complicated.
I'm sure these program listings had to be generated by someone working
in a real environment with some tool that outputs micro-suitable
BASIC code; in real life it would take forever to re-number a program,
and the programs were usually perfectly numbered.
Nov 14 '05 #29
Chris Sonnack <Ch***@sonnack.com> scribbled the following:
Joona I Palaste wrote:
The included BASIC did nothing whatsoever to support anything beyond
simple text output. You couldn't even change the background colour,
only the text colour.
Are you sure about that? That doesn't match my memory.
You could change the text colour by using PRINT to print various
control characters, which you could easily type by hand. To change the
background colour, you needed to POKE into the VIC's control registers.
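For what it's worth, that POKE/PEEK interface maps naturally onto raw byte
accesses in C. A hedged sketch: the register addresses ($D020 for the border
colour, $D021 for the background colour) are my recollection of common VIC-II
documentation, not something stated in this thread, and the simulated address
space is purely illustrative.

```c
#include <assert.h>

/* Illustrative only: a simulated 64K address space standing in for the
   C64's memory map. */
static unsigned char mem[0x10000];

#define VIC_BORDER_COLOUR     0xD020  /* assumed border colour register */
#define VIC_BACKGROUND_COLOUR 0xD021  /* assumed background colour register */

/* BASIC's POKE addr,val and PEEK(addr) are just byte writes and reads. */
static void poke(unsigned addr, unsigned char val)
{
    mem[addr & 0xFFFF] = val;
}

static unsigned char peek(unsigned addr)
{
    return mem[addr & 0xFFFF];
}
```

On the real machine these would be volatile accesses to memory-mapped chip
registers, not an array; the point is only that the BASIC-level interface to
the hardware was raw memory access, nothing more.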
Why didn't the BASIC have in-built graphics and sound support?

Wasn't there a SOUND() function and wasn't there Sprite support
at the BASIC level?
No.
Or am I thinking of the C128 (I used both)?


You are.

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"To doo bee doo bee doo."
- Frank Sinatra
Nov 14 '05 #30
Old Wolf <ol*****@inspire.net.nz> scribbled the following:
Joona I Palaste <pa*****@cc.helsinki.fi> wrote:
>> Right on the money, Dan. I started programming on C=64 BASIC V2, which
>> is an extremely cut-down version of BASIC by 1980s standards. Back then
>> GOTO with a symbolic label rather than a hard-coded line number as the
>> destination was like science fiction...
Yes, but what were you going to do about it? C-64 Basic v2 had pretty
much the bare minimum of control structures. IF... THEN was only for
single lines, not blocks of lines, and there was no ELSE. The only loop
available was FOR... NEXT. Every other kind of loop had to be simulated
with IF... THEN GOTO. And like I said, GOTOs had to have hard-coded line
numbers as destinations. There was GOSUB... RETURN which I think could
be used to have "poor man's subroutines", because they behaved
otherwise like subroutines but had no concept of local variable scope.

Don't forget the BASIC equivalent of "switch":

240 GOTO 250*(i=0) + 350*(i=1) + 380*(i=2) + .... + 3530*(i=51)

(this came from an adventure game where there were 50 or so possible
user actions, each having its own handling "function").
Luckily my micro's version of BASIC included the feature of expressions
evaluating to either 1 or 0 (pretty advanced for those days); the other
micros had to do something more complicated.
I'm sure these program listings had to be generated by someone working
in a real environment with some tool that outputs micro-suitable
BASIC code, in real life it would take forever to re-number a program
and the programs were usually perfectly-numbered.


On a Commodore 64, that GOTO statement will cause an "undef'd
statement" error if the variable i has any value from 0 to 51. This is
because unlike your micro's BASIC, the Commodore 64 BASIC evaluates
true expressions to -1, and thus if i has some value from 0 to 51,
the above GOTO will attempt to jump to a negative line number. It
should read:

240 GOTO -250*(i=0) - 350*(i=1) - 380*(i=2) - ... - 3530*(i=51)
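Incidentally, the same arithmetic-dispatch trick carries over to C almost
unchanged, since C's relational operators evaluate to 1 and 0 (like the
earlier poster's micro, not like the C64). A minimal sketch, with made-up
line numbers:

```c
#include <assert.h>

/* Dispatch via boolean arithmetic: each (i == n) term is 1 or 0, so at
   most one product survives. Returns 0 when i matches no case, just as
   the BASIC GOTO would compute line number 0. */
static int target_line(int i)
{
    return 250 * (i == 0)
         + 350 * (i == 1)
         + 380 * (i == 2);
}
```

In C you would of course just use a switch statement; the sketch only shows
why the 0/1 convention makes the trick work at all.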

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"I wish someone we knew would die so we could leave them flowers."
- A 6-year-old girl, upon seeing flowers in a cemetery
Nov 14 '05 #31
In article <c6**********@sunnews.cern.ch> Da*****@cern.ch (Dan Pop) writes:
The main purpose of many BASIC implementations was to make the best use
of the *limited* resources of 8-bit micros.


That may have been the purpose of many later BASIC implementations, but it
was not the purpose of the original BASIC. At that time the problem was
not limited resources on 8-bit micros, but limited resources on mainframes.
--
dik t. winter, cwi, kruislaan 413, 1098 sj amsterdam, nederland, +31205924131
home: bovenover 215, 1025 jn amsterdam, nederland; http://www.cwi.nl/~dik/
Nov 14 '05 #32
Chris Sonnack <Ch***@Sonnack.com> wrote:
Joona I Palaste wrote:
...Commodore 64...best graphics and sound capabilities on the market
Used intelligent "peripherals".
The included BASIC did nothing whatsoever to support anything beyond
simple text output. You couldn't even change the background colour,
only the text colour.


Are you sure about that? That doesn't match my memory.
Why didn't the BASIC have in-built graphics and sound support?


Wasn't there a SOUND() function and wasn't there Sprite support
at the BASIC level?


There simply was no space for it in the ROM. The C64 used bank
switching for the memory mapping:

64K RAM
$0000-$7fff 32K
$8000-$9fff 8K - Expansion slot
$a000-$bfff 8K - System ROM (Basic interpreter) - Expansion slot
$c000-$cfff 4K
$d000-$dfff 4K - IO (2xCIA, SID, VIC, 1Kx4 static color RAM)
$e000-$ffff 8K - System ROM (I/O)

The switching was controlled by 3 of the 6 I/O-lines of the 6510 CPU.
The other 3 bits were used for the Datasette tape recorder. 2 of the
CIA's I/O-lines controlled which of the 4 16K-portions of the RAM was
used for text, graphics and sprite data; the VIC-registers controlled
the exact location inside this 16K portion. The space for BASIC
programs ran from $0800-$9fff in the standard configuration (still an
awful lot in comparison to other home computers).

Inside the 16K of ROM there was everything: Datasette control, serial I/O
for peripherals, the complete BASIC interpreter, etc. It was used up
to the last bit. BASIC extensions were usually made in three ways: in
hardware, by using the expansion slot and losing 8K of RAM; by copying
the BASIC ROM into RAM and modifying it; or by modifying one of
the indirect jump pointers heavily used for I/O and interpretation,
repointing it to one's own routines.
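A hedged sketch of the bank-switching logic described above. The bit
assignments (LORAM in bit 0, HIRAM in bit 1, CHAREN in bit 2 of the 6510's
on-chip port at $0001) are my recollection of C64 documentation, not
something stated in this thread:

```c
#include <assert.h>

/* Assumed bit layout of the 6510 CPU port at $0001. */
#define LORAM  0x01   /* 1: BASIC ROM mapped at $a000-$bfff */
#define HIRAM  0x02   /* 1: KERNAL ROM mapped at $e000-$ffff */
#define CHAREN 0x04   /* 1: I/O chips (not character ROM) at $d000-$dfff */

/* Per the commonly documented PLA truth table, the BASIC ROM is only
   visible when both LORAM and HIRAM are set; clearing either banks
   RAM in at $a000 instead. */
static int basic_rom_visible(unsigned char port)
{
    return (port & (LORAM | HIRAM)) == (LORAM | HIRAM);
}

/* The KERNAL ROM follows HIRAM alone. */
static int kernal_rom_visible(unsigned char port)
{
    return (port & HIRAM) != 0;
}
```

This is why "copying the BASIC ROM into RAM and modifying it" worked: with
the same addresses backed by RAM underneath the ROM, flipping one port bit
swapped the patched copy in.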

---snip---------
What caused this? Why didn't the BASIC have in-built graphics and sound
support? Many other competing machines did have it. Was Commodore in a
rush to get the machine released so they did not have the time?

---snap---------

For a rush job, the complete thing was too bug-free. The strength of it was
that nearly *any* possibility given by the hardware was used in the
design, completely offering it to the competent programmer - not to
mention the internal "possibilities"/bugs of the I/O chips, like the
frame sprites, which were discovered only quite late. Not including
graphics and sound support in the Basic was IMHO a good idea - it
would have resulted in too many trade-offs.
Holger
Nov 14 '05 #33
Kenneth Brody <ke******@spamcop.net> wrote:
Chris Torek wrote:
It is also easy to read what *should* have been written, rather
than what was actually written. This is particularly true if the
debugging is being done by the original programmer.


Your brain knows what the code is supposed to be doing, and so you
"see" code that should be behaving. It's not uncommon for someone
to be debugging their own code for hours, only to call someone else
over who spots the error in under 30 seconds. (I've been on both
sides of that scenario.)


This is not just true in computing, btw. I work for a publisher, and in
that business it is a well-known trope that it is all but impossible to
correct your own newspaper articles.

Richard
Nov 14 '05 #34
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
Dan Pop a écrit :
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:

Gotophobia started in the sixties. Dijkstra's paper¹ was published in
1968 and referred to older papers published in 1966. Fortran II and IV
appeared in 1958 and 1962 respectively, so their ignorance is forgivable,
whereas C-64 Basic v2 (and many other Basics of this time) appeared when
Dijkstra's paper had irremediably changed minds.

The main purpose of many BASIC implementations was to make the best use
of the *limited* resources of 8-bit micros. One of the most popular
micros of the early eighties had 8k of ROM and 1k of RAM in its standard
configuration. Creative minds managed amazing feats on this
configuration.


The commodore-64 had 20K of ROM and 64K of RAM, so the memory resource
was not that much limited.


Maybe, maybe not. The 20K of ROM were probably holding plenty of code
that had nothing to do with the BASIC interpreter itself. There is also
the execution speed issue: GOTO lineno is much faster to implement in an
*interpreter* than scanning for the matching ELSE or reverse scanning to
find the beginning of the loop. Sure, such things can be accelerated,
by giving up the pure interpreter model, but this requires even more
resources.

The PC implementations supporting structured programming features
were typically incremental compilers...

Then, there is the marketing issue. The vast majority of C64's were not
supposed to be programmed in BASIC (or any other language), they were
supposed to be used as game machines. Games implemented in assembly.
Due to that, certain game-oriented hardware features of the machine were
not even properly supported by the BASIC interpreter. So, the vendor
correctly (from HIS point of view) decided not to invest too many
resources into a sophisticated BASIC implementation where a simple-minded
one could serve equally well.

And another argument, which I'm sure was ignored by Commodore, was that
the kids learning programming on such machines (the few seeing in the C64
more than a game machine) were better served by a very simple-minded BASIC
along the lines of the original Kemeny and Kurtz design, than by
something more sophisticated and, therefore, more difficult to grasp by
elementary school kids. I'm sure the really talented kids, who found
the limitations of the built-in BASIC annoying, could find better
alternatives.
By the time BASIC moved to relatively resource-rich PC's, it also acquired
structured programming features (and even dropped the line numbers that
were supposed to replace the need for a "sophisticated" text editor).


According to Thomas E. Kurtz (co-inventor of BASIC), line number
suppression and structured programming appeared in their Dartmouth BASIC
around 1975, in order to avoid "spaghetti code":
http://www.truebasic.com/downloads/D2001.pdf
Of course, in 1975, no resource-rich PC was available.


So, they probably used resource-rich mainframes, so I fail to see your
point.
Then again, no matter how many minds Dijkstra's paper might have changed,
there is plenty of proof that well structured code can be written with
gotos and badly structured code without. The tool might help, but it
cannot replace the skill of the craftsman.


Certainly, but it is quite clear to me that Dijkstra's paper changed
Kurtz & Kemeny minds so that they made their Dartmouth BASIC a fully
structured language, and that was several years before the release of
C-64 BASIC v2.


So what? Are you suggesting that the resources of the C64 were comparable
to those of a mainframe from the mid-seventies?

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 14 '05 #35
Dan Pop <Da*****@cern.ch> scribbled the following:
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
Dan Pop a écrit :
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
Gotophobia started in the sixties. Dijkstra's paper¹ was published in
1968 and referred to older papers published in 1966. Fortran II and IV
appeared in 1958 and 1962 respectively, so their ignorance is forgivable,
whereas C-64 Basic v2 (and many other Basics of this time) appeared when
Dijkstra's paper had irremediably changed minds.

The main purpose of many BASIC implementations was to make the best use
of the *limited* resources of 8-bit micros. One of the most popular
micros of the early eighties had 8k of ROM and 1k of RAM in its standard
configuration. Creative minds managed amazing feats on this
configuration.
The commodore-64 had 20K of ROM and 64K of RAM, so the memory resource
was not that much limited.

Maybe, maybe not. The 20K of ROM were probably holding plenty of code
that had nothing to do with the BASIC interpreter itself. There is also
the execution speed issue: GOTO lineno is much faster to implement in an
*interpreter* than scanning for the matching ELSE or reverse scanning to
find the beginning of the loop. Sure, such things can be accelerated,
by giving up the pure interpreter model, but this requires even more
resources.
A fully interpreted BASIC that scanned the actual source code as the
program progressed led to lots of interesting effects. For example,
short line numbers actually performed better than long ones. Also,
variables were faster than literal numbers. This was obvious, really,
as scanning a literal number required a decimal-to-binary conversion
but scanning a variable only required a table lookup. Thus several
magazines actually recommended replacing uses of 0 and 1 (which were
the most common numbers) with uses of variables that were assigned
these values and never reassigned.
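A rough sketch of why that advice worked, assuming a much simplified model
of such an interpreter (the function names and the single-letter variable
table are inventions for illustration, not the real C64 internals): the
interpreter re-scans a line's text every time it executes it, so a literal
pays for digit-by-digit conversion on every pass, while a variable reference
is a cheap lookup.

```c
#include <assert.h>

#define NVARS 26
static double vars[NVARS];   /* single-letter variables A..Z */

/* A literal must be converted from text to a number every time the
   interpreter passes over the line. */
static double scan_literal(const char *s)
{
    double v = 0.0;
    while (*s >= '0' && *s <= '9')
        v = v * 10.0 + (double)(*s++ - '0');
    return v;
}

/* A variable reference is just an index into the variable table. */
static double scan_variable(char name)
{
    return vars[name - 'A'];
}
```

The real interpreter's variable lookup was more involved than an array index
(and its numeric conversion went through full floating-point routines), but
the asymmetry is the same: conversion work per pass versus a lookup per pass.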

The Commodore 64 actually used line numbers in a weird fashion. If we
ignore GOTO and GOSUB, line numbers were pretty much irrelevant.
Execution normally proceeded in the order the lines were in memory. If
you edited the memory directly, you could for example put the lines in
reverse order, or even give every line the same number, and the program
still ran like it originally did. (If it didn't use GOTO and GOSUB, of
course.)
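That storage model can be sketched as follows (a hedged toy model, not the
actual C64 data structures): stored program lines form a linked list in
memory order, so sequential execution never consults the numbers at all;
only GOTO/GOSUB search for them.

```c
#include <assert.h>
#include <stddef.h>

/* Toy model of a stored BASIC program: each line carries its number,
   but execution order is the list order, whatever the numbers say. */
struct line {
    int number;
    struct line *next;
};

/* Sequential execution: walk the links; the numbers are ignored. */
static int run_count(const struct line *p)
{
    int executed = 0;
    for (; p != NULL; p = p->next)
        executed++;
    return executed;
}

/* GOTO: the only operation that cares about the stored numbers. */
static const struct line *find_line(const struct line *p, int number)
{
    for (; p != NULL; p = p->next)
        if (p->number == number)
            return p;
    return NULL;
}
```

With this layout, putting the lines in reverse numeric order (or giving them
all the same number) changes nothing for a GOTO-free program, exactly as
described above.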
The PC implementations supporting structured programming features
were typically incremental compilers...

Then, there is the marketing issue. The vast majority of C64's were not
supposed to be programmed in BASIC (or any other language), they were
supposed to be used as game machines. Games implemented in assembly.
Due to that, certain, game-oriented hadware features of the machine were
not even properly supported by the BASIC interpreter. So, the vendor
correctly (from HIS point of view) decided not to invest too much
resources into a sophisticated BASIC implementation where a simple minded
one could serve equally well.

And another argument, which I'm sure was ignored by Commodore, was that
the kids learning programming on such machines (the few seeing in the C64
more than a game machine) were better served by a very simple minded BASIC
along the lines of the original Kemenyi and Kurtz design, than by
something more sophisticated and, therefore, more difficult to grasp by
elementary school kids. I'm sure the really talented kids, who found
the limitations of the builtin BASIC annoying, could find better
alternatives.


Well, now we don't have that problem. Kids these days won't know a
programming language from a hole in the ground. What they want their
computer to do is access the latest "hip" chat sites and play the
latest action-packed shoot-'em-ups. Nothing that would actually require
creative thinking - heck, thinking at all.

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"I said 'play as you've never played before', not 'play as IF you've never
played before'!"
- Andy Capp
Nov 14 '05 #36
In <c6**********@oravannahka.helsinki.fi> Joona I Palaste <pa*****@cc.helsinki.fi> writes:
What caused this? Why didn't the BASIC have in-built graphics and sound
support? Many other competing machines did have it. Was Commodore in a
rush to get the machine released so they did not have the time?


Most likely, Commodore didn't see BASIC programming as relevant to the
marketing of a machine, which was, obviously, hardware optimised to
be used as a game console. However, since it was sold as a home computer
(another marketing trick: parents objecting to buying a game console
could be convinced to buy a home computer ;-) it had to have a BASIC
interpreter.

Compare to the ZX Spectrum, which was poorly equipped as a game console,
but whose BASIC supported all the hardware features of the machine,
including the high resolution graphics and the sound generator (an I/O
port bit directly turning on and off the membrane of a loudspeaker).
It came with a very nice BASIC manual documenting everything (including
low level hardware and software implementation details) and a cassette
with demo BASIC programs (making minimal use of machine code to improve
the program's user interface). It's obvious that Clive Sinclair seriously
considered the educational usage of the machine, apart from its
recreational one (the machine also came with a very short user guide,
explaining how to connect it to the TV and tape cassette recorder and how
to load "programs" from tape).

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 14 '05 #37
In <Hw********@cwi.nl> "Dik T. Winter" <Di********@cwi.nl> writes:
In article <c6**********@sunnews.cern.ch> Da*****@cern.ch (Dan Pop) writes:
The main purpose of many BASIC implementations was to make the best use
of the *limited* resources of 8-bit micros.
That may have been the purpose of many later BASIC implementations,


BASIC became really popular about 15 years after its original release.
Apparently, Bill Gates is responsible for that to a significant degree.
it
was not the purpose of the original BASIC. At that time the problem was
not limited resources on 8-bit micros, but limited resource on mainframes.


Or even machines lesser than mainframes, that could be used to serve
several terminals used for interactive BASIC programming. FOCAL was
an even more primitive-looking language, used for interactive programming
on the resource starved PDP-8 (4096 12-bit words).

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 14 '05 #38
In article <c6**********@oravannahka.helsinki.fi> Joona I Palaste <pa*****@cc.helsinki.fi> writes:
Dan Pop <Da*****@cern.ch> scribbled the following:

....
The commodore-64 had 20K of ROM and 64K of RAM, so the memory resource
was not that much limited.

Maybe, maybe not. The 20K of ROM were probably holding plenty of code
that had nothing to do with the BASIC interpreter itself. There is also
the execution speed issue: GOTO lineno is much faster to implement in an
*interpreter* than scanning for the matching ELSE or reverse scanning to
find the beginning of the loop. Sure, such things can be accelerated,
by giving up the pure interpreter model, but this requires even more
resources.


A fully interpreted BASIC that scanned the actual source code as the
program progressed caused lots of interesting things.


There is actually such a beast in the obfuscated c contest archives. It
is a C program of (I think) less than 2000 bytes.
--
dik t. winter, cwi, kruislaan 413, 1098 sj amsterdam, nederland, +31205924131
home: bovenover 215, 1025 jn amsterdam, nederland; http://www.cwi.nl/~dik/
Nov 14 '05 #39
In article <c6***********@sunnews.cern.ch> Da*****@cern.ch (Dan Pop) writes:
In <Hw********@cwi.nl> "Dik T. Winter" <Di********@cwi.nl> writes:

....
it
was not the purpose of the original BASIC. At that time the problem was
not limited resources on 8-bit micros, but limited resource on mainframes.


Or even machines lesser than mainframes, that could be used to serve
several terminals used for interactive BASIC programming. FOCAL was
an even more primitive-looking language, used for interactive programming
on the resource starved PDP-8 (4096 12-bit words).


Mainframes also were pretty resource starved. The first mainframe I used
had 32768 27-bit words, of which only one half could be used for code.
And that was the major mainframe serving two universities and the research
centre I worked at (and am still working at). Later this was changed to
a mainframe with 32768 60-bit words that could easily handle close to 100
interactive users, all doing Fortran, Algol 60, Algol 68 and Pascal
compilations and running editors and programs. But of course, your user
space was limited to 16384 words. And, eh, *no* virtual memory.
--
dik t. winter, cwi, kruislaan 413, 1098 sj amsterdam, nederland, +31205924131
home: bovenover 215, 1025 jn amsterdam, nederland; http://www.cwi.nl/~dik/
Nov 14 '05 #40
Joona I Palaste wrote:
And another argument, which I'm sure was ignored by Commodore, was that
the kids learning programming on such machines (the few seeing in the C64
more than a game machine) were better served by a very simple minded BASIC
along the lines of the original Kemenyi and Kurtz design, than by
something more sophisticated...


Well, now we don't have that problem. Kids these days won't know a
programming language from a hole in the ground. What they want their
computer to do is access the latest "hip" chat sites and play the
latest action-packed shoot-'em-ups. Nothing that would actually require
creative thinking - heck, thinking at all.


As a teacher of undergraduates, I'm sorry to say that I can confirm
that. In the DOS days, I had a scattering of students who really
knew about computers. In the XP days, I have none.

Allin Cottrell
Nov 14 '05 #41
Dan Pop a écrit :
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
The commodore-64 had 20K of ROM and 64K of RAM, so the memory resource
was not that much limited.
Maybe, maybe not. The 20K of ROM were probably holding plenty of code
that had nothing to do with the BASIC interpreter itself. There is also
the execution speed issue: GOTO lineno is much faster to implement in an
*interpreter* than scanning for the matching ELSE or reverse scanning to
find the beginning of the loop. Sure, such things can be accelerated,
by giving up the pure interpreter model, but this requires even more
resources.


I don't think that technical considerations were really important.

The PC implementations supporting structured programming features
were typically incremental compilers...

Then, there is the marketing issue. The vast majority of C64's were not
supposed to be programmed in BASIC (or any other language), they were
supposed to be used as game machines. Games implemented in assembly.
Due to that, certain, game-oriented hadware features of the machine were
not even properly supported by the BASIC interpreter. So, the vendor
correctly (from HIS point of view) decided not to invest too much
resources into a sophisticated BASIC implementation where a simple minded
one could serve equally well.
That's the main point. The lack of competition, the availability of a
simple and inexpensive BASIC made by Microsoft for their platforms, and
other marketing issues can well explain the choices made by Commodore.
So, even if standardized specifications for a better BASIC existed
and the machine was powerful enough to handle them, they did not bother
implementing them.
I'm sure the really talented kids, who found
the limitations of the builtin BASIC annoying, could find better
alternatives.


Yes, here is an impressively long list of available languages for
commodore-64:
http://www.npsnet.com/danf/cbm/languages.html
The availability of structured languages (BASIC included) also shows
that the Commodore's resources were quite sufficient to handle them.
According to Thomas E. Kurtz (co-inventor of BASIC), line number
suppression and structured programming appeared in their Dartmouth BASIC
around 1975, in order to avoid "spaghetti code":
http://www.truebasic.com/downloads/D2001.pdf
Of course, in 1975, no resource-rich PC was available.


So, they probably used resource-rich mainframes, so I fail to see your
point.


A GE-635 (1966-1975) followed by a Honeywell 66/40 (in 1976). Their
resources were quite comparable to a commodore-64's, and were shared among
many users (up to 200).
Then again, no matter how many minds Dijkstra's paper might have changed,
there is plenty of proof that well structured code can be written with
gotos and badly structured code without. The tool might help, but it
cannot replace the skill of the craftsman.


Certainly, but it is quite clear to me that Dijkstra's paper changed
Kurtz & Kemeny minds so that they made their Dartmouth BASIC a fully
structured language, and that was several years before the release of
C-64 BASIC v2.


So what? Are you suggesting that the resources of the C64 were comparable
to those of a mainframe from the mid-seventies?


Probably, but that's not the problem. I just want to show that BASIC
could have been a well-structured language on the family and personal
computer of the early 1980s.
I think Kurtz and Kemeny shared a similar opinion when they decided in
1983 to create a company to make "available to everyone a high-quality
BASIC", as demonstrated in this interview with Kurtz:

(from http://www.truebasic.com/downloads/D2006.pdf)
Q: How did Dartmouth BASIC become True BASIC?
A: Teaching at Dartmouth, John Kemeny and I were shielded from some of
the worst implementations of BASIC. For example, we stopped using line
numbers in 1975, just as personal computers were being invented.
Dartmouth BASIC had continued to evolve into a more and more
sophisticated language that was a joy to use. However, in 1983, three
Dartmouth alums challenged us to look at the versions of BASIC that were
out there, all different on the different computers. We were appalled at
how terrible these crude `street' versions were, and what high school
and college students and teachers had to contend with. We knew that
writing papers or delivering talks would have little effect so we
accepted the challenge of forming an independent commercial software
publishing company and making available to everyone a high-quality
BASIC, one that reflected our years of teaching experience.

--
Richard
Nov 14 '05 #42
Da*****@cern.ch (Dan Pop) wrote in message news:<c6***********@sunnews.cern.ch>...
In <c6**********@oravannahka.helsinki.fi> Joona I Palaste <pa*****@cc.helsinki.fi> writes:
What caused this? Why didn't the BASIC have in-built graphics and sound
support? Many other competing machines did have it. Was Commodore in a
rush to get the machine released so they did not have the time?


Most likely, Commodore didn't see BASIC programming as relevant to the
marketing of a machine, which was, obviously, hardware optimised to
be used as a game console. However, since it was sold as a home computer
(another marketing trick: parents objecting to buying a game console
could be convinced to buy a home computer ;-) it had to have a BASIC
interpreter.

Compare to the ZX-Spectrum, which was poorly equipped as a game console,
but whose BASIC supported all the hardware features of the machine,
including the high resolution graphics and the sound generator (an I/O
port bit directly turning on and off the membrane of a loudspeaker).
It came with a very nice BASIC manual documenting everything (including
low level hardware and software implementation details) and a cassette
with demo BASIC programs (making minimal use of machine code to improve
the program's user interface). It's obvious that Clive Sinclair seriously
considered the educational usage of the machine, apart from its
recreational one (the machine also came with a very short user guide,
explaining how to connect it to the TV and tape cassette recorder and how
to load "programs" from tape).

Dan

I grew up with an Amstrad 464, a 64K Z80-based thing no-one really
remembers.

Like the Commodore 64 it was more for games, but the BASIC interpreter
was still good, although slow. It also supported some things that
were rather like threads, strangely enough. The BASIC manual was a
work of genius, especially the graphical programs.
Nov 14 '05 #43
In article <c6**********@sunnews.cern.ch>, Da*****@cern.ch says...
Then again, no matter how many minds Dijkstra's paper might have changed,
there is plenty of proof that well structured code can be written with
gotos and badly structured code without. The tool might help, but it
cannot replace the skill of the craftsman.


I think that if his paper had said something like "harmful for newbies" in
the title, it might have been far more appropriate overall.

--
Randy Howard
2reply remove FOOBAR

Nov 14 '05 #44
In article <c6***********@f1n1.spenet.wfu.edu>, co******@wfu.edu says...
As a teacher of undergraduates, I'm sorry to say that I can confirm
that. In the DOS days, I had a scattering of students who really
knew about computers. In the XP days, I have none.


Hmm, I have a 10-year-old child currently learning to program in C
on an XP system. We'll be working on cross-platform portability to
Linux and Mac systems (the latter primarily for byte-order discussions)
shortly.

--
Randy Howard
2reply remove FOOBAR

Nov 14 '05 #45
In article <MP************************@news.verizon.net>,
Randy Howard <ra*********@FOOverizonBAR.net> wrote:
In article <c6**********@sunnews.cern.ch>, Da*****@cern.ch says...
Then again, no matter how many minds Dijkstra's paper might have changed,
there is plenty of proof that well structured code can be written with
gotos and badly structured code without. The tool might help, but it
cannot replace the skill of the craftsman.


I think that if his paper had said something like "harmful for newbies" in
the title, it might have been far more appropriate overall.


Also remember that this paper was written at a time when many
programmers _had to use_ goto because they used a language that had
nothing better. Just rewrite a switch statement with five cases using
goto only, and you see that the code becomes a mess. So "goto is
harmful" didn't mean primarily that goto should be removed; first of all,
it meant that programming languages should have better constructs than a
plain goto.
Nov 14 '05 #46
Randy Howard <ra*********@fooverizonbar.net> scribbled the following:
In article <c6***********@f1n1.spenet.wfu.edu>, co******@wfu.edu says...
As a teacher of undergraduates, I'm sorry to say that I can confirm
that. In the DOS days, I had a scattering of students who really
knew about computers. In the XP days, I have none.
Hmm, I have a 10-year-old child currently learning to program in C
on an XP system. We'll be working on cross-platform portability to
Linux and Mac systems (the latter primarily for byte-order discussions)
shortly.


But *you're* a computer programmer too. Neither of my parents, nor any
of my elementary school teachers, knew anything about computers when I
started programming. I started programming from natural interest, not
because someone taught me.

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"Outside of a dog, a book is a man's best friend. Inside a dog, it's too dark
to read anyway."
- Groucho Marx
Nov 14 '05 #47
ro***********@antenova.com (Rob Thorpe) wrote:
I grew up with an Amstrad 464, a 64K Z80-based thing no-one really
remembers.
I remember it for the wrong reasons: when Amstrad bought out Sinclair,
they immediately ceased production of Sinclair machines and concentrated
on the 464.
Like the Commodore 64 it was more for games, but the BASIC interpreter
was still good, although slow. It also supported some things that
were rather like threads, strangely enough. The BASIC manual was a
work of genius, especially the graphical programs.


There was a demo of these things at a mall; each PC had a book of
programs with it, and you could go in, type out a program for 2 minutes,
and see all these amazing graphical effects. Good marketing :)
Nov 14 '05 #48
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
Dan Pop a écrit :
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
The commodore-64 had 20K of ROM and 64K of RAM, so the memory resource
was not that much limited.
Maybe, maybe not. The 20K of ROM were probably holding plenty of code
that had nothing to do with the BASIC interpreter itself. There is also
the execution speed issue: GOTO lineno is much faster to implement in an
*interpreter* than scanning for the matching ELSE or reverse scanning to
find the beginning of the loop. Sure, such things can be accelerated,
by giving up the pure interpreter model, but this requires even more
resources.


I don't think that technical considerations were really important.


Hard to say...
The PC implementations supporting structured programming features
were typically incremental compilers...

Then, there is the marketing issue. The vast majority of C64's were not
supposed to be programmed in BASIC (or any other language), they were
supposed to be used as game machines. Games implemented in assembly.
Due to that, certain game-oriented hardware features of the machine were
not even properly supported by the BASIC interpreter. So, the vendor
correctly (from HIS point of view) decided not to invest too much
resources into a sophisticated BASIC implementation where a simple minded
one could serve equally well.


That's the main point. The lack of competition, the availability of a
simple and inexpensive BASIC made by Microsoft for their platforms, and
other marketing issues can well explain the choices made by Commodore.
So, even if standardized specifications for a better BASIC existed
and the machine was powerful enough to handle them, they did not bother
implementing them.


Name one 8-bit home computer implementing the "better" BASIC
specification in its ROM. Even those which didn't go the Microsoft
BASIC way still took the KnK specification from 1964 as their
starting point.
According to Thomas E. Kurtz (co-inventor of BASIC), line number
suppression and structured programming appeared in their Dartmouth BASIC
around 1975, in order to avoid "spaghetti code":
http://www.truebasic.com/downloads/D2001.pdf
Of course, in 1975, no resource-rich PC was available.


So, they probably used resource-rich mainframes, so I fail to see your
point.


A GE-635 (1966-1975) followed by a Honeywell 66/40 (in 1976). Their
resources were quite comparable to a commodore-64, and were shared among
many users (up to 200).


Equating a mainframe and a home computer is ridiculous. Even if they
had comparable amounts of central memory (which I strongly doubt), the
mainframe had a very powerful I/O system that could compensate for the lack of
memory. Now, tell us about the C64 I/O system...
Then again, no matter how many minds Dijkstra's paper might have changed,
there is plenty of proof that well structured code can be written with
gotos and badly structured code without. The tool might help, but it
cannot replace the skill of the craftsman.

Certainly, but it is quite clear to me that Dijkstra's paper changed
Kurtz's and Kemeny's minds so that they made their Dartmouth BASIC a fully
structured language, and that was several years before the release of
C-64 BASIC v2.


So what? Are you suggesting that the resources of the C64 were comparable
to those of a mainframe from the mid-seventies?


Probably, but that's not the problem.


It's pretty much part of the problem.
I just want to show that BASIC
could have been a well-structured language on the family and personal
computer of the early 1980s.


But you have provided exactly zilch arguments supporting this opinion.
The existence of a language specification is far from being enough for
that purpose. Write an implementation that could fit in 16 K, along with
all the device drivers and the rest of the system management software
usually required by an 8-bit home computer and that would deliver decent
performance on a 4 MHz Z80A (or equivalent micro of the time) and THEN
you'd have a *valid* point.

The fact that it took a mainframe or a PC to produce such implementations
might provide a clue....

Dan
--
Dan Pop
DESY Zeuthen, RZ group
Email: Da*****@ifh.de
Nov 14 '05 #49
Dan Pop <Da*****@cern.ch> scribbled the following:
In <40**********************@news.club-internet.fr> Richard Delorme <ab****@nospam.fr> writes:
Dan Pop a écrit :
So, they probably used resource-rich mainframes, so I fail to see your
point.


A GE-635 (1966-1975) followed by an Honeywell 66/40 (in 1976). Their
resources were quite comparable to a commodore-64, and were shared among
many users (up to 200).

Equating a mainframe and a home computer is ridiculous. Even if they
had comparable amounts of central memory (which I strongly doubt), the
mainframe had a very powerful I/O system that could compensate for the lack of
memory. Now, tell us about the C64 I/O system...


Hmm, the old 1541 disk drive used to take 10-15 minutes to load about
20-40 kilobytes of data from disk. And the cassette drives, they were
even slower...
When I switched to an Amiga 500, the disk drive speeds of several tens
of kilobytes per second were like a wonder.

--
/-- Joona Palaste (pa*****@cc.helsinki.fi) ------------- Finland --------\
\-- http://www.helsinki.fi/~palaste --------------------- rules! --------/
"The day Microsoft makes something that doesn't suck is probably the day they
start making vacuum cleaners."
- Ernst Jan Plugge
Nov 14 '05 #50
