
Learning C

I am sorry if this is an inappropriate place to put this post; if so,
please delete it.

I am wondering about a few things. Do you guys recommend learning C as
a second language for someone who already knows Java very well? And
what is the best way to learn C: books, tutorials, or what?

Thanks, any response would be great.

Mar 10 '06 #1
"mf*******@gmail.com" <mf*******@gmail.com> writes:
I am sorry if this is an inappropriate place to put this post; if so,
please delete it.
Well, this is an entirely appropriate place for this post, but if it
had not been, deleting the post would be impossible anyway: once you
post to Usenet, it stays there, for the most part. Even "cancelling" a
post you made may affect some servers but not all.
I am wondering about a few things. Do you guys recommend learning C as
a second language for someone who already knows Java very well? And
what is the best way to learn C: books, tutorials, or what?

Thanks, any response would be great.


I'd be willing to bet that most of the folks on this group began
learning C with a book. Unfortunately, there is a very high ratio of
crappy C books to useful C books. I strongly recommend you buy a copy
of K&R ("The C Programming Language" by Kernighan and Ritchie, 2nd
edition), written by the inventors of C. The language has changed
since then, but the C described there is still the most portable
version of the language, and that book is the most widely recognized
introduction to it.

When you've worked through that, you should probably buy a copy of "C:
A Reference Manual" by Harbison & Steele.

See also http://c-faq.com/resources/books.html, which gives a few
pointers on finding good books on C.

Participating in this newsgroup, or on alt.comp.lang.learn.c-c++, can
also be /very/ beneficial, /provided/:

1. You thoroughly familiarize yourself with the
rules-of-conduct/posting for the group. This can be accomplished
by reading about 2 weeks' worth of posts before posting
yourself. See what kinds of posts get good responses, and which
get poor ones.

2. You have the humility and good grace to receive constructive
criticisms, both of your code and of any errant behavior on the
group.

HTH,
Micah
Mar 10 '06 #2
Thanks a lot for the response; a few minutes after posting I searched
this group and found hundreds of other threads about the same topic. I
apologize as well for double posting; that was a mistake, even though
I'm sure it must look like I tried to bump my thread.

I wanted, as well, to bring up another concern that I have with
programming in general. Many people say that ACTUALLY PROGRAMMING
real-world programs is the best way to get better. That theory makes
sense, but when I sit down to write a program in Java, I often realize
that I have mastered the syntax and the searching and sorting
algorithms, yet I cannot apply them to a real-world program. Are there
any books for C, or websites, that explain the analytical thinking or
other tactics involved in practical uses of C?

Mar 10 '06 #3
mf*******@gmail.com said:
Are there any books
for C or websites that explain the analytical thinking or other tactics
involved in practical uses of C?


Your best starting point is the book Micah mentioned. It has a bunch of
exercises scattered throughout each chapter. Do them. It's astoundingly
good training for practical C use. As for analytical thinking, you could do
worse than stick around in comp.lang.c - some of the regular contributors
are fine exponents of the art. (See also Knuth's "The Art of Computer
Programming" - all three volumes - and don't be put off by the mathemagical
flavour.)

--
Richard Heathfield
"Usenet is a strange place" - dmr 29/7/1999
http://www.cpax.org.uk
email: rjh at above domain (but drop the www, obviously)
Mar 10 '06 #4
I hear it is very important to use a Linux machine for C programming;
is this just a common myth, or is there any truth to it? I have been
considering switching to Linux for this reason, but I'm not sure
whether it's worth it.

Mar 10 '06 #5
"mf*******@gmail.com" wrote:

Thanks a lot for the response; a few minutes after posting I searched
this group and found hundreds of other threads about the same topic. I
apologize as well for double posting; that was a mistake, even though
I'm sure it must look like I tried to bump my thread.

I wanted, as well, to bring up another concern that I have with
programming in general. Many people say that ACTUALLY PROGRAMMING
real-world programs is the best way to get better. That theory makes
sense, but when I sit down to write a program in Java, I often realize
that I have mastered the syntax and the searching and sorting
algorithms, yet I cannot apply them to a real-world program. Are there
any books for C, or websites, that explain the analytical thinking or
other tactics involved in practical uses of C?


Our own Richard Heathfield has published such a book, "C Unleashed".
So have Kernighan & Pike, with "The Practice of Programming". One of
the best is Wirth's "Algorithms + Data Structures = Programs".

Get in the immediate habit of including adequate context in any
replies. Don't let the fact that you are posting through that
abortion of a Usenet interface offered by Google faze you. Read my
sig. below, and read the referenced URLs. Google IS NOT Usenet.

All Usenet articles need to stand by themselves. There is no
guarantee that any particular other article has ever reached, or
ever will reach, the reader. Even if it has, it may have been
deleted, or just be awkward to view. So quote adequately, and snip
quoted material that is irrelevant to your reply. Do not top-post;
your answer belongs after (or intermixed with) the material to
which you reply.

--
"If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers." - Keith Thompson
More details at: <http://cfaj.freeshell.org/google/>
Also see <http://www.safalra.com/special/googlegroupsreply/>

Mar 10 '06 #6
mb1471 said:
I hear it is very important to use a Linux machine for C programming;
is this just a common myth, or is there any truth to it?
It's a common myth - and like many myths, it has a huge amount of truth
behind it.

C is a portable language. Any box you can find a C compiler (or
cross-compiler) for, you can use for C programming. And if you write your
code carefully, very often the only thing you have to do to get your
program working on a different system is to recompile on the new system.
So, for example, you write your Widgetalyser on a Mac, get it working, copy
it over to a mainframe, recompile, and hey, presto! It works on the
mainframe too. And, one recompile later, it works on Linux too. And maybe
even on your mobile phone or microwave oven. (Er - maybe.)
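
To make that concrete, here is the kind of thing I mean (an
illustrative sketch, nothing more): a little word counter that uses
only the standard library, so it should recompile unchanged anywhere a
hosted C implementation exists.

    #include <stdio.h>
    #include <string.h>

    /* count whitespace-separated words on standard input;
       nothing here is platform-specific */
    int main(void)
    {
        char line[256];
        unsigned long words = 0;

        while (fgets(line, sizeof line, stdin) != NULL) {
            char *p = strtok(line, " \t\n");
            while (p != NULL) {
                words++;
                p = strtok(NULL, " \t\n");
            }
        }
        printf("%lu words\n", words);
        return 0;
    }

Mac, mainframe, Linux box, whatever: same source, different compiler.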
I have been
considering switching to Linux for this reason, but I'm not sure
whether it's worth it.


Oh, it's worth it all right. But perhaps you should put in a couple of years
on a Windows box first, just so that you'll be *really grateful* when you
change over. ;-)

Seriously, I don't like Windows (having used it for many years), I do like
Linux (having used it for a few years), but C doesn't care. You can get
good C compilers for Windows easily enough, legally for free in many cases.
The only thing to watch - er, one of a few things to watch is that you need
to be very firm with your typical Windows development environment. Be sure
to save your files with a .c extension, not a .cpp extension. Typical
Windows compilers will see a .cpp extension as an invitation to use C++
rules instead of C rules when compiling - with, as the saying goes, the
usual hilarious results.
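
To make the point concrete, here is a small example (an illustrative
sketch) that is perfectly legal C but that a compiler applying C++
rules will reject, which is exactly what happens if the file gets a
.cpp extension:

    #include <stdlib.h>

    int main(void)
    {
        /* legal C: void * converts implicitly to int *;
           under C++ rules this conversion is an error */
        int *p = malloc(10 * sizeof *p);

        /* legal C: "new" is an ordinary identifier;
           under C++ rules it is a reserved keyword */
        int new = 10;

        if (p != NULL)
            p[0] = new;
        free(p);
        return 0;
    }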

--
Richard Heathfield
"Usenet is a strange place" - dmr 29/7/1999
http://www.cpax.org.uk
email: rjh at above domain (but drop the www, obviously)
Mar 10 '06 #7
Richard G. Riley wrote:
"mf*******@gmail.com"posted the following on 2006-03-10:

Thanks a lot for the response; a few minutes after posting I searched
this group and found hundreds of other threads about the same topic. I
apologize as well for double posting; that was a mistake, even though
I'm sure it must look like I tried to bump my thread.

I wanted, as well, to bring up another concern that I have with
programming in general. Many people say that ACTUALLY PROGRAMMING
real-world programs is the best way to get better. That theory makes
sense, but when I sit down to write a program in Java, I often realize
that I have mastered the syntax and the searching and sorting
algorithms, yet I cannot apply them to a real-world program. Are there
any books for C, or websites, that explain the analytical thinking or
other tactics involved in practical uses of C?


One great way is to analyse existing systems. You mentioned moving to
Linux : this is a great idea because you can compile and step through
existing apps with the debugger.


Could we keep the platform bigotry down to a dull roar? Windows has
perfectly good debuggers available as well. So if you really want to
(I've never had the urge) you can "step through existing apps".

I've developed serious applications on both Linux and Windows. They
are both perfectly adequate. It is completely possible to develop
applications on Windows. For heaven's sake, Windows was *written* in C!

<snip>
--
Nick Keighley

Mar 10 '06 #8
On 2006-03-10, Richard Heathfield <in*****@invalid.invalid> wrote:
mb1471 said:
I hear it is very important to use a linux machine for c programming,
is this just a common myth or is there any truth there.


It's a common myth - and like many myths, it has a huge amount of truth
behind it.


You forgot to mention what that truth is.

Linux, as a "unix-like" system, has a great deal of cultural heritage
that is a good "fit" for C, as C was originally developed for UNIX.
This can be useful [it's probably the type of system on which the
implementation of the standard functions is going to be most
straightforward*], but also dangerous [a lot of things that aren't
really defined by the standard are going to happen to work the way
you'll assume they should].
*Note that I wouldn't actually recommend trying to _read_ the glibc
source, though. If you seriously plan on doing that, get a unixy OS
other than Linux. Say, FreeBSD. Or Mac OS X. Or SCO [well, that doesn't
come with source code, but I'm sure its source code is still more
readable than glibc even so]. Really, anything.
Mar 10 '06 #9

Richard G. Riley wrote:
"Nick"posted the following on 2006-03-10:
Richard G. Riley wrote:
"mf*******@gmail.com"posted the following on 2006-03-10:

> Thanks a lot for the response; a few minutes after posting I searched
> this group and found hundreds of other threads about the same topic. I
> apologize as well for double posting; that was a mistake, even though
> I'm sure it must look like I tried to bump my thread.
>
> I wanted, as well, to bring up another concern that I have with
> programming in general. Many people say that ACTUALLY PROGRAMMING
> real-world programs is the best way to get better. That theory makes
> sense, but when I sit down to write a program in Java, I often realize
> that I have mastered the syntax and the searching and sorting
> algorithms, yet I cannot apply them to a real-world program. Are there
> any books for C, or websites, that explain the analytical thinking or
> other tactics involved in practical uses of C?

One great way is to analyse existing systems. You mentioned moving to
Linux : this is a great idea because you can compile and step through
existing apps with the debugger.


could we keep the platform bigotry down to a dull roar? Windows has
perfectly good debuggers available as well. So if you really want to
(I've never had the urge) you can "step through existing apps".


No you can't. Firstly, I use multiple OSs : Windows, OS/2 and Linux,
so spare me the suggestion that this is OS-war bigotry.

Secondly, what system apps are you aware of in Windows which come with
the C source code? Most are in C++ anyway. Most of the Gnome/Linux
system is in C. The OP was asking about C.


I thought you were implying Linux was better because the *debugger*
was better. I've never stepped through an existing application (that
wasn't broken) with a debugger. If you say it's a good way to learn C,
who am I to argue.

But it *still* sounds bizarre to me.

<snip>
--
Nick Keighley

Mar 10 '06 #10
Richard G. Riley wrote:
"Nick"posted the following on 2006-03-10:
I thought you were implying Linux was better because the *debugger* was
There is no "the debugger" : although gdb is prevalent in Linux -
albeit with several front ends.
better. I've never stepped through an existing application (that wasn't
broken)
with a debugger. If you say it's a good way to learn C, who am I to
argue.


I said it's one way to get used to the structure and flow of
applications, which is what he wants. Also, I do think a debugger can
give real insight into how C works in the real world : results of
operators there for you to see, with no overhead of printfs, which some
favor.


The printfs are portable. The printfs work without manual intervention.
The printfs work without having to understand an additional tool.
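
For what it's worth, the usual way to keep such printfs manageable is
to hide them behind a macro that compiles away in a normal build. A
minimal sketch (C99 variadic macros, names made up):

    #include <stdio.h>

    #ifdef DEBUG_TRACE
    #define TRACE(fmt, ...) \
        fprintf(stderr, "%s:%d: " fmt "\n", __FILE__, __LINE__, __VA_ARGS__)
    #else
    #define TRACE(fmt, ...) ((void)0)
    #endif

    int main(void)
    {
        int x = 42;

        TRACE("x is %d", x);   /* disappears unless built with -DDEBUG_TRACE */
        return 0;
    }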
But it *still* sounds bizarre to me


What does? Are we talking about the same thing? Do you doubt that
watching other, well-written apps work is beneficial to a newbie? It
seems fairly clear to me that it can only help. It's how the entire
Open SW system works : people learning by doing and picking up on
other people's work.


You've conflated "learning by doing and picking up on other people's
work" with "watching ... apps work" and that with stepping through an
application using a debugger. I think that's misleading.

--
Chris "sparqling" Dollin
"Who do you serve, and who do you trust?"
Mar 10 '06 #11

Chris Dollin wrote:
Richard G. Riley wrote:
"Nick"posted the following on 2006-03-10:
I thought you were implying Linux was better because the *debugger* was


There is no "the debugger" : although gdb is prevalent in Linux -
albeit with several front ends.
< snip >
What does? Are we talking about the same thing? Do you doubt that
watching other, well written apps work is beneficial to a newbie? It
seems fairly clear to me that it can only help. Its how the entire
Open SW system works : people learning by doing and picking up on
other peoples work.


You've conflated "learning by doing and picking up on other people's
work" with "watching ... apps work" and that with stepping through an
application using a debugger. I think that's misleading.


You've been trolled!

Especially in the light of the reply by the same person I couldn't
avoid seeing, as I'm forced to use Google from the office.

--
BR, Vladimir

Mar 10 '06 #12

"Nick Keighley" <ni******************@hotmail.com> wrote in message
news:11**********************@j52g2000cwj.googlegroups.com...

Richard G. Riley wrote:
"Nick"posted the following on 2006-03-10:
> Richard G. Riley wrote:
>> "mf*******@gmail.com"posted the following on 2006-03-10:
>
>> > Thanks a lot for the response; a few minutes after posting I searched
>> > this group and found hundreds of other threads about the same topic.
>> > I apologize as well for double posting; that was a mistake, even
>> > though I'm sure it must look like I tried to bump my thread.
>> >
>> > I wanted, as well, to bring up another concern that I have with
>> > programming in general. Many people say that ACTUALLY PROGRAMMING
>> > real-world programs is the best way to get better. That theory makes
>> > sense, but when I sit down to write a program in Java, I often realize
>> > that I have mastered the syntax and the searching and sorting
>> > algorithms, yet I cannot apply them to a real-world program. Are there
>> > any books for C, or websites, that explain the analytical thinking or
>> > other tactics involved in practical uses of C?
>>
>> One great way is to analyse existing systems. You mentioned moving to
>> Linux : this is a great idea because you can compile and step through
>> existing apps with the debugger.
>
> could we keep the platform bigotry down to a dull roar? Windows has
> perfectly good debuggers available as well. So if you really want to
> (I've never had the urge) you can "step through existing apps".
No you cant. Firstly, I use multiple OSs : windows, OS/2 and Linux so
get away with your attempt to suggest this is an OS war bigotry.

Secondly, what system apps are you aware of in windows which come with
the C source code? Most are in C++ anyway. Most Gnome/Linux system is
in C. The OP was asking about C.


I thought you were implying Linux was better because the *debugger* was

better. I've never stepped through an existing application (that wasn't
broken)
with a debugger. If you say it's a good way to learn C, who am I to
argue.

But it *still* sounds bizarre to me


My $.02:

I learned C on MS-DOS systems. I found using a debugger
(even with 'working' code) to be an *extremely* good aid
in understanding the language and how it was implemented
on that system. E.g. when creating a new function, before
integrating it with the main program, I'd run it in a
'test harness' under a debugger, and more often than
not, I was able to find and fix bugs before they got
into the main application.
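
Something along these lines, say (a made-up sketch of the kind of
throwaway harness I mean):

    #include <assert.h>
    #include <string.h>

    /* the function under development (hypothetical example) */
    static size_t count_char(const char *s, char c)
    {
        size_t n = 0;

        while (*s)
            if (*s++ == c)
                n++;
        return n;
    }

    /* throwaway harness: step through it under the debugger
       before wiring count_char into the real program */
    int main(void)
    {
        assert(count_char("hello", 'l') == 2);
        assert(count_char("", 'x') == 0);
        assert(count_char("aaa", 'a') == 3);
        return 0;
    }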

-Mike


<snip>
--
Nick Keighley

Mar 10 '06 #13
Mike Wahler wrote:

I learned C on MS-DOS systems. I found using a debugger
(even with 'working' code) to be an extremely good aid
in understanding the language and how it was implemented
on that system. E.g. when creating a new function, before
integrating it with the main program, I'd run it in a
'test harness' under a debugger, and more often than
not, I was able to find and fix bugs before they got
into the main application.

Sure, but that's not what was claimed. The original statement said
"existing programs", so not the one under development. I don't think
tracing someone else's code is likely to be much of a learning tool.

Personally I've never found that even reading other code was useful for
learning basic programming. It teaches one how to read code, which is a
useful skill in and of itself, but not how to program. Only writing
code teaches you that.

An experienced programmer may well look at existing code for tips on
how to approach a problem, but that's another thing altogether.

--
Please quote enough of the previous message for context. To do so from
Google, click "show options" and use the Reply shown in the expanded
header.
Mar 11 '06 #14

"Default User" <de***********@yahoo.com> wrote in message
news:47************@individual.net...
Mike Wahler wrote:

I learned C on MS-DOS systems. I found using a debugger
(even with 'working' code) to be an extremely good aid
in understanding the language and how it was implemented
on that system. E.g. when creating a new function, before
integrating it with the main program, I'd run it in a
'test harness' under a debugger, and more often than
not, I was able to find and fix bugs before they got
into the main application.

Sure, but that's not what was claimed.


OK, perhaps I misunderstood. I was simply stating
that I found a debugger to be useful while learning
the language.

The original statement said
"existing programs", so not the one under development. I don't think
tracing someone else's code is likely to be much of a learning tool.
I think it can help (provided it's 'decent' code).
Personally I've never found that even reading other code was useful for
learning basic programming.
I have, especially textbook examples. I suppose this is
just a case of YMMV. :-)
It teaches one how to read code, which is a
useful skill in and of itself, but not how to program. Only writing
code teaches you that.
I also think writing code is probably the most useful way to learn.
My remarks about a debugger were in that context. "Write it, then
watch it."

An experienced programmer may well look at existing code for tips on
how to approach a problem, but that's another thing altogether.


-Mike
Mar 11 '06 #15
Mike Wahler wrote:

"Default User" <de***********@yahoo.com> wrote in message
news:47************@individual.net...
Personally I've never found that even reading other code was useful
for learning basic programming.


I have, especially textbook examples. I suppose this is
just a case of YMMV. :-)


I don't mean short illustrative code examples as found in textbooks.
Those have been, hopefully, crafted for teaching purposes. That's not
usually the case with a full-blown program.
I also think writing code is probably the most useful way to
learn. My remarks about a debugger were in that context. "Write it,
then watch it."


I would agree.

Brian
Mar 11 '06 #16
Richard G. Riley wrote:
"Chris"posted the following on 2006-03-10:
Richard G. Riley wrote:
"Nick"posted the following on 2006-03-10: I thought you were implying Linux was better because the *debugger* was
There is no "the debugger" : although gdb is prevalent in Linux -
albeit with several front ends.
"debugger" / "debuggers" whatever. I've used ddd.

better. I've never stepped through an existing application (that wasn't
broken)
with a debugger. If you say it's a good way to learn C, who am I to
argue.
I stress that I am talking about "stepping through an application". I
have *no* objection to examining existing code. One way to learn is to
look at good examples (and sometimes at bad).

I said its one way to get used to the structure and flow of
applications which is what he wants. Also, I do think a debugger can
give real insight into how C works in the real world : results of
operators there for you to see with no overhead of printfs which some
favor.


The printf's are portable. The printf's work without manual
intervention.


No they don't : you have to insert them in the code.


yes, but you don't have to keep on inserting them. Debuggers are
generally manual.

But that's being petty.
I beg to differ.
It's rare that I find someone wanting their printfs to be
portable in a system process or an X GUI or a Win32 winproc : they
don't work. Home-grown or system-supplied logging libraries, possibly :
but can you really analyse them at run time? I can't : I like to step
through and see the flow of the app to get a feel for how the system's
heart is beating.
OK, I'm not saying you are wrong. I'm just pointing out that not
everyone works and learns the way you do. I don't single-step through
other people's applications in a debugger.

The printfs work without having to understand an additional tool.
But it *still* sounds bizzare to me

What does?
using a debugger to examine existing applications

Are we talking about the same thing? Do you doubt that
watching other, well written apps work is beneficial to a newbie?
*examining* existing applications and other examples, yes.
*stepping* existing applications and other examples, no.

It
seems fairly clear to me that it can only help. Its how the entire
Open SW system works : people learning by doing and picking up on
other peoples work.

right. BUT NOT IN A DEBUGGER

You've conflated "learning by doing and picking up on other people's
work" with "watching ... apps work" and that with stepping through an
application using a debugger. I think that's misleading.


Really? Seems pretty straightforward to me and also how close on 100% of
Universities teach coding at some stage or other :


really? *all* universities encourage the use of stepping?
Do you have statistics?

Could you go back and *read* what I and other posters have actually
been saying?

adding modules to existing systems.
yes! absolutely!

Maybe I'm a bit slow today but I'm not seeing the
subtleties of the point you are making here :
well you don't "add to an existing module" by using a debugger.

the user is looking for
a way to learn how to structure applications and build them himself.
yes yes

I fail to see how analysing existing, successful apps can be anything
other than beneficial. It doesn't take away the donkey work of learning
the language basics, but it can make textbook "science" much more
accessible and "real". I can't imagine becoming a programmer without
such practice, guidance and "practical training". It's the same in all
walks of life.
I think I'll give up here. My point wasn't that important. Just trying
to make the point that different people do things in different ways.
Linux is not the only platform. Not everyone uses debuggers the way you
do (some people only use them when they suspect a compiler error). A
newbie should be aware there are different ways to do things and not
lock themselves into a particular approach too early.

Maybe I'll even get my debugger out and step through a program
sometime to see if it brings me any insights. Maybe you should try
a "debugger-free day" and see what you have to do to manage
without. Try reasoning about the program. Consider invariants and
pre/post conditions. Try adding asserts, etc.
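
The sort of thing I mean, as an illustrative sketch (not from any real
project): state the pre/post conditions and the loop invariant as
asserts and comments, and let them do some of the watching for you.

    #include <assert.h>

    /* lower-bound binary search; a[] must be sorted ascending */
    static int find(const int *a, int n, int key)
    {
        int lo = 0, hi = n;  /* invariant: key, if present, is in a[lo..hi-1] */

        assert(n >= 0);      /* precondition */
        while (lo < hi) {
            int mid = lo + (hi - lo) / 2;

            if (a[mid] < key)
                lo = mid + 1;
            else
                hi = mid;
        }
        assert(lo == n || a[lo] >= key);   /* postcondition */
        return (lo < n && a[lo] == key) ? lo : -1;
    }

    int main(void)
    {
        int a[] = { 1, 3, 5, 7 };

        assert(find(a, 4, 5) == 2);
        assert(find(a, 4, 4) == -1);
        return 0;
    }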

no way is the One True Way

"A desk is a dangerous place from which to view the world" - LeCarre.


"there is nothing as practical as a good theory" Lewin
--
Nick Keighley

Mar 11 '06 #17
Richard G. Riley wrote:
"Nick"posted the following on 2006-03-11:
Richard G. Riley wrote:
"Chris"posted the following on 2006-03-10:
> Richard G. Riley wrote:
>> "Nick"posted the following on 2006-03-10: >>> I thought you were implying Linux was better because the *debugger* was
>
>> There is no "the debugger" : although gdb is prevalent in Linux -
>> albeit with several front ends.
"debugger" / "debuggers" whatever. I've used ddd.


ddd is a front end to gdb.


I know

>>> better. I've never stepped through an existing application (that wasn't
>>> broken)
>>> with a debugger. If you say it's a good way to learn C, who am I to
>>> argue.


I stress that I talking about "stepping through an application". I have
*no*
objection to examining existing code. One way to learn is to look at
good
examples (and sometimes at bad).


And how can stepping through an app be bad? Were you never put onto a
new project with a code base of several hundred thousand lines of code
and told to isolate some relatively simple bugs to get you familiar
with the code base?


About a year ago I was put on a 750 kloc application. I had no previous
experience with the application, and limited experience with the
programming language. I did not step through the code with a debugger.
>> I said its one way to get used to the structure and flow of
>> applications which is what he wants. Also, I do think a debugger can
>> give real insight into how C works in the real world : results of
>> operators there for you to see with no overhead of printfs which some
>> favor.
>
> The printf's are portable. The printf's work without manual
>intervention.

No they dont : you have to insert them in the code.


yes, but you don't have to keep on inserting them. Debuggers are
generally manual.
<snip>
I meant me pointing out that they are useless in most server-based apps
or message-driven GUI apps.
There are alternatives to printf(). I usually use some sort of logger,
for both servers and GUIs. That 750 kloc application has both.

Its rare that I find someone wanting their printfs to be
portable in a system process or a an X gui or a Win 32 winproc : they
dont work. Home grown or system supplied logging libraries possibly :
but can you really analyse them at run time? I cant : I like to step
through and see the flow of the app to get a feel for how the systems
heart is beating.


ok I'm not saying you are wrong. I'm just pointing out that not
everyone
works and learns the way you do. I don't single step debuggers to
examine other people's applications


That's fine, Nick : I don't expect you to - but you seem to have strong
reasons for making a point against it, whereas I see *only* benefits.


OK, we disagree. What's wrong with *reading* the code? UML? Source
browsers?

> The printfs work without having to understand an additional tool.
>
>>> But it *still* sounds bizzare to me
>>
>> What does?


using a debugger to examine existing applications


We won't get into the pissing contest of who has worked on more
apps/platforms etc., but I find it a good teaching tool to get people up
to speed with an app and its internal data structures : modify on the
fly, symbol tables etc. Can't do that with a printout or printfs.


I have *never* had any desire to do these things. Find the bug. Fix
the bug. Why would you want to modify things on-the-fly?!

>> Are we talking about the same thing? Do you doubt that
>> watching other, well written apps work is beneficial to a newbie?


*examining* existing applications and other examples, yes.
*stepping* existing applications and other examples, no.


for me "examining" is "stepping" : but obviously with some
strategically placed break points and a few retunr to callers etc :)


"examine", to me, means to study the source code. A good source
browser can be handy.

>> It
>> seems fairly clear to me that it can only help. Its how the entire
>> Open SW system works : people learning by doing and picking up on
>> other peoples work.


right. BUT NOT IN A DEBUGGER


Really : in a debugger. Do you really advocate printfs over a debugger
in a huge code base?


:-)

How do you map data values to their equivalent
constants? To me it sounds incredible. If you hadn't mentioned ddd I
would wonder if we are talking about the same thing.
Some constants get dumped as numbers. Some enums are already wrapped
so as to be able to decode themselves (OK, I admit this is C++... it's
only a *little bit* of a move to the Dark Side. I can stop any time I
want).
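
In plain C you can get much the same effect with a hand-written decode
function; an illustrative sketch (the message names are invented):

    #include <stdio.h>

    enum msg { MSG_OPEN, MSG_CLOSE, MSG_FOCUS };   /* hypothetical constants */

    /* map an enum value back to its symbolic name, for logging
       or for inspection from a debugger */
    static const char *msg_name(enum msg m)
    {
        switch (m) {
        case MSG_OPEN:  return "MSG_OPEN";
        case MSG_CLOSE: return "MSG_CLOSE";
        case MSG_FOCUS: return "MSG_FOCUS";
        default:        return "unknown";
        }
    }

    int main(void)
    {
        enum msg m = MSG_FOCUS;

        printf("got %s (%d)\n", msg_name(m), (int)m);
        return 0;
    }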
> You've conflated "learning by doing and picking up on other people's
> work" with "watching ... apps work" and that with stepping through an
> application using a debugger. I think that's misleading.

Really? Seems pretty straightforward to me and also how close on 100% of
Universities teach coding at some stage or other :


really? *all* universities encourage the use of stepping?
Do you have statistics?


Maybe stepping is the key word here : it does not mean every line. It
means strategically placed break points and data analysis at those
points. It is an art.


and *all* universities teach it? It's a long time since I was at
university.

Yorkshire Man 2:
"Symbolic debuggers! You were lucky we 'ad t'punch t' cards with ar
teeth!"
Could you go back and *read* what I and other posters have actually
been saying?


I did. Some people seem to think that a debugger is just for finding
bugs : it is not. It is also useful for examining runtime trends. Far
more useful than hard-to-decipher printfs, especially in a non-console
mode.


but you have to know where to put the breaks. And that means you must
at least partially understand the application. To do that I'd read the
code...
adding modules to existing systems.


yes! absolutely!
Maybe I'm a bit slow today but I'm not seeing the
subtleties of the point you are making here :


well you don't "add to an existing module" by using a debugger


Of course you do. You use it to examine the data flow between the
system and your module.


Well, I add to an existing module with a text editor. I'm not being
funny here; we *really* seem to be missing each other's point.

I REALLY don't use a debugger to add code to an existing module.
the user is looking for
a way to learn how to structure applications and build them himself.


yes yes
I fail to see how analysing existing, successful apps can be anything
other than beneficial. It doesnt take away the donkey work of learning
the lanugage basics, but it can make text book "science" much more
accessible and "real". I cant imagine becoming a programmer without
such practice, guidance and "practical training". Its the same in all
walks of life.


I think I'll give up here. My point wasn't that important. Just trying
to
make the point that different people do thinks in different ways. Linux

is not the only platform. Not everyone uses debuggers the way you do


Nobody said it was : you came flying in with that.


riight
The OP had
expressed an interest in Linux, and the fact that it comes with free
industry-strength compilers and debuggers, as well as hundreds of
rock-solid C apps complete with build packages, makes it a good bet for
someone to learn how to structure and analyse an application in
C. Windows cannot compete with that IMO.
It's a perfectly good development platform though. Look, Linux is OK.
Windows is OK. But two posters had encouraged the OP to move to
Linux. I'm not sure it's such an open-and-shut decision. You can
write crap non-portable code on Linux as well.
(some people only use them when they suspect a compiler error). A


If there is a compile error, you won't be using a debugger.


Well, compiler bugs are about as rare as hens' teeth. But why not? The
debugger would show you that the program flow or calculation results
did not follow the source code. All my tools are broken if the source
code does not represent what the machine is doing.

To be honest I can't remember the last time I had a bug caused by a
compiler error.
newbie should be aware there are different ways to do things and not
lock themselves into a particular approach too early.


A newbie needs to examine other people's stuff : especially when
adding/fixing bugs. Approaches vary, but it can never hurt to get down
and dirty to be sure.

A newbie certainly shouldn't dedicate his life to printf()s : they
simply do not work in a lot of environments, are inflexible, and only
show the data that you WANT to see, not the data you SHOULD see. This
is where a debugger's "locals" view etc. comes in. You see all local
data. Not what someone thinks they need to see.
Maybe I'll even get my debugger out and step through a program
sometime to see if it brings me any insights. Maybe you should try
a "debugger free day" and try and see what you have to do to manage
without. Try reasoning about the program. Consider invariants and
post/pre conditions. Try adding asserts etc.


If you recall, I did mention that a home-brew logging system is
preferable to printf :


where did the words "home-brew logging system" appear in the paragraph
above? I did write a fairly blunt paragraph here. And then thought
better of it.
Do you know what an invariant is? Design by contract?

I have nothing against them and have
implemented many with various backend report generators to examine the
data.
good!
A debugger isn't the only tool : and I never said it was. What it
can do is allow you to see the flow of an application while watching
live data, and allow you to modify that data
never felt the need. Not since compilers got fast enough to run during
the day.
and to examine and ensure
data typing is consistent through skillful use of register/pointer/memory
examinations : it is why they exist.
you examine registers? On a deeply embedded system, ok. But a server?
no way is the One True Way


And nowhere did I say it was. The whole crux here is you doubting
that stepping through an existing app can help a user understand it :
after many, many and varied projects on various platforms in various
languages, I find it incredible how you could doubt this would be
beneficial. In order to even put in these printfs() you need some
understanding of what's going on : that does not come from a printout
all the time. It does not come from a func spec, which is not always
there. It does not come from a holistic overview : it comes from
stepping through and examining what is going on. From finding out when
and why certain modules are called. From knowing what that little bit
of bit fiddling results in, from forcing a function call at runtime in
order to test a different calling parameter. The list goes on. Of
course none of this is "the only way" : but it's a good way and one
that's been used a lot for years.


I'd say the same went (in spades) for stepping. You've got to know
where to step. There's source code, CASE tools, source browsers, source
code. I've even resorted to grep to find callers of functions.
Do you need a debugger for a ten-line string reversal func? Maybe not :
but I'll tell you a little secret, I'd still use one to test it
rigorously before handing it in to a system for an integration
test. Total extra effort? About 5 minutes.
I'd write a unit test. Also five minutes' work, and more likely to give
proper coverage.
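
Something like this, for example (a minimal sketch of the sort of
five-minute test I mean):

    #include <assert.h>
    #include <string.h>

    /* reverse a string in place (the hypothetical ten-liner under test) */
    static char *reverse(char *s)
    {
        size_t i = 0, j = strlen(s);

        while (i + 1 < j) {
            char tmp = s[i];

            s[i++] = s[j - 1];
            s[--j] = tmp;
        }
        return s;
    }

    int main(void)
    {
        char a[] = "hello", b[] = "", c[] = "x";

        assert(strcmp(reverse(a), "olleh") == 0);
        assert(strcmp(reverse(b), "") == 0);
        assert(strcmp(reverse(c), "x") == 0);
        return 0;
    }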
I realise that there is a core here who seem to think a debugger is
almost evil : I sometimes question if they have used a real
debugger on a real system in the real world where programmers are
cycled on and off projects and people want to optimise their "up to
speed" figures.
Normally I don't get into what I work on, but yes, I do work on
projects where people get cycled on and off. I'm one of the cycled.
It's a real system. In the real world. Real people pay real money for
the facilities it provides. And there is a requirement for more
performance.

I'm not accusing you of it, but sometimes people seem to spend way
too much time hunched over a debugger when a bit of thought might
save them some time.
I realise that according to some posters some projects
get by without a debugger : I've never had the pleasure of such a
project/system.
"A desk is a dangerous place from which to view the world" - LeCarre.


"there is nothing as practical as a good theory" Lewin

--
Nick Keighley

Mar 11 '06 #18
On 2006-03-11, Nick Keighley <ni******************@hotmail.com> wrote:
Richard G. Riley wrote:
"Nick"posted the following on 2006-03-11:
> Richard G. Riley wrote:
>> "Chris"posted the following on 2006-03-10:
>> > Richard G. Riley wrote:
>> >> "Nick"posted the following on 2006-03-10: >> >>> I thought you were implying Linux was better because the *debugger* was
>> >
>> >> There is no "the debugger" : although gdb is prevalent in Linux -
>> >> albeit with several front ends.
>
> "debugger" / "debuggers" whatever. I've used ddd.
ddd is a front end to gdb.


I know

>> >>> better. I've never stepped through an existing application (that wasn't
>> >>> broken)
>> >>> with a debugger. If you say it's a good way to learn C, who am I to
>> >>> argue.
>
> I stress that I talking about "stepping through an application". I have
> *no*
> objection to examining existing code. One way to learn is to look at
> good
> examples (and sometimes at bad).


And how can stepping through an app be bad? Were you never put onto a
new project with a code base of several hundred thousand lines of code
and told to isolate some relatively bugs to get you familiar with the
code base?


about a year ago I was put on 750 kloc application. I had no previous
experience with the application. And limited experience with the
programming language. I did not step through the code with a

debugger.

Your choice. I would have. Especially when I modified the code. But
again you are offering no evidence to suggest that stepping through an
app is "crazy" or "bizarre" as you originally claimed.
>> >> I said its one way to get used to the structure and flow of
>> >> applications which is what he wants. Also, I do think a debugger can
>> >> give real insight into how C works in the real world : results of
>> >> operators there for you to see with no overhead of printfs which some
>> >> favor.
>> >
>> > The printf's are portable. The printf's work without manual
>> >intervention.
>>
>> No they dont : you have to insert them in the code.
>
> yes, but you don't have to keep on inserting them. Debuggers are
> generally manual.
<snip>
I meant me pointing out they are useless in most server based apps or
message driven GUI apps.
there are alternatives to printf(). I usually use some sort of logger.
For both
servers and GUIs. That 750kloc application has both.


Fine. But often loggers affect the app. And again, you only see what
you are told you can see : a debugger lets you see what you want to
see when you want to see it. Being able to see the data greatly
simplifies understanding an application, and I can't expect anyone to
dispute that. But someone will ... :-;

>> Its rare that I find someone wanting their printfs to be
>> portable in a system process or a an X gui or a Win 32 winproc : they
>> dont work. Home grown or system supplied logging libraries possibly :
>> but can you really analyse them at run time? I cant : I like to step
>> through and see the flow of the app to get a feel for how the systems
>> heart is beating.
>
> ok I'm not saying you are wrong. I'm just pointing out that not
> everyone
> works and learns the way you do. I don't single step debuggers to
> examine other people's applications
Thats fine Nick : I dont expect you too - but you seem to have strong
reasons for making a point against it whereas I see *only* benefits.


ok, we disagree. What's wrong with *reading* the code. UML? Source
browsers?


Because not all code is readable? Because not all code can be properly
deciphered from reading? Because you don't know what data is coming up
from those functions 40 deep? Because it's plagued with misplaced
casts?

You don't know what branch conditions are going to be taken based on HW
port values? There are numerous reasons.


>> > The printfs work without having to understand an additional tool.
>> >
>> >>> But it *still* sounds bizzare to me
>> >>
>> >> What does?
>
> using a debugger to examine existing applications
We wont get into the pissing contest of who has worked on more
apps/platforms etc but I find it a good teaching tool to get people up
to speed with an app and its internal data structures : modify on the
fly, symbol tables etc. Cant do that with a print out or printfs.


I have *never* had any desire to do these things. Find the bug. Fix
the bug. Why would you want to modify things on-the-fly?!


Modify data to test or force a state. It's a basic part of debugger
use : you can force your hand to avoid having to wait for a certain
random data suite to trigger the bug.

E.g. someone says "I'm getting a divide by zero somewhere near func()."

Go to func in the debugger, force a zero into the divisor and see why
the checks aren't stopping it. Simple example.
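
Concretely, something like this (an invented example; the commands in
the comment are ordinary gdb ones, but any symbolic debugger will have
equivalents):

    #include <stdio.h>

    /* hypothetical function someone says is dividing by zero:
       set a breakpoint on the "if", then e.g. in gdb
       "set variable den = 0" and single-step to watch
       whether the guard really fires */
    static double ratio(double num, double den)
    {
        if (den != 0.0)
            return num / den;
        return 0.0;
    }

    int main(void)
    {
        printf("%f\n", ratio(1.0, 4.0));
        return 0;
    }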

>> >> Are we talking about the same thing? Do you doubt that
>> >> watching other, well written apps work is beneficial to a newbie?
>
> *examining* existing applications and other examples, yes.
> *stepping* existing applications and other examples, no.
for me "examining" is "stepping" : but obviously with some
strategically placed break points and a few retunr to callers etc :)


"examine", to me, means to study the source code. A good source
browser can be handy.


A good debugger is that too.
>> >> It
>> >> seems fairly clear to me that it can only help. Its how the entire
>> >> Open SW system works : people learning by doing and picking up on
>> >> other peoples work.
>
> right. BUT NOT IN A DEBUGGER
Really : in a debugger. Do you really advocate printfs over a debugger
in a huge code base?


:-)

How do you map data values to their equivalent
constants? To me it sounds incredible. If you hadnt mentioned ddd I
would wonder if we are talking about the same thing.


some constants get dumped as numbers. Some enums are already wrappered
so as to be able to decode themselves (ok I admit this is C++, ...its
only a
*little bit* of a move to the Dark Side. I can stop any time I want).


I'm not good at remembering what 0xefecda means. I prefer something in
the debugger like "WM_GETFOCUS" or some such.
>> > You've conflated "learning by doing and picking up on> other peoples work"
>> > with "watching ... apps work" and that with stepping through an application
>> > using a debugger. I think that's misleading.
>>
>> Really? Seems pretty straightforward to me and also how close on 100% of
>> Universities teach coding at some stage or other :
>
> really? *all* universities encourage the use of stepping?
> Do you have statistics?


Maybe stepping is the key word here : it does not mean every line. It
means strategically placed break points and data analysis at those
points. It is an art.


and *all* universities teach it? Its a long time since I was at
university.


me too.
Yorkshire Man 2:
"Symbolic debuggers! You were lucky we 'ad t'punch t' cards with ar
teeth!"

:)
> Could you go back and *read* what I and other posters have actually
> been saying?


I did. Some people seem to think that a debugger is just for finding
bugs : it is not. It is also useful for examining runtime trends. Far
more useful than hard to decipher printfs, especially in a non console mode.


but you have to know where to put the breaks. And that means you must
at
least partially understand the application. To do that I'd read the
code...


So do I: of course. But I also analyse the flow at the same time : a
debugger does let you read the code, you know :-;
>> adding modules to existing systems.
>
> yes! absoloutly!
>
>> Maybe I'm a bit slow today but I'm not seeing the
>> subtleties of the point you are making here :
>
> well you don't "add to an existing module" by using a debugger
Of course you do. You use it to examine the data flow between the
system and your module.


well I add to an existing module with a Text Editor. I'm not being
funny
here we *really* seem to be missing each others point.


No you don't. You use a text editor to create the symbolic equivalent,
which is then compiled & linked into the bigger app. The debugger can
then be used to force data between your interfaces and the legacy code
very easily for the purposes of quick integration testing. Yes, there
are other ways too.
I REALLY don't use a debugger to add code to an existing module.

No. Nor do I. I use it, frequently, to monitor the addition. No one
here is suggesting the debugger is a compiler & linker & editor. It is
for me part of the process. Not for all, but we're heading back down
that well-covered track again :)
>> the user is looking for
>> a way to learn how to structure applications and build them himself.
>
> yes yes
>
>> I fail to see how analysing existing, successful apps can be anything
>> other than beneficial. It doesnt take away the donkey work of learning
>> the lanugage basics, but it can make text book "science" much more
>> accessible and "real". I cant imagine becoming a programmer without
>> such practice, guidance and "practical training". Its the same in all
>> walks of life.
>
> I think I'll give up here. My point wasn't that important. Just trying
> to
> make the point that different people do thinks in different ways. Linux
>
> is not the only platform. Not everyone uses debuggers the way you do


Nobody said it was : you came flying in with that.


riight


You did. Look at the thread. The OP mentioned Linux : I said it was a
good idea for what he wanted to do : you came in with "hold on, hold
on, let's not start an OS war here".
The OP had
expressed an intrest in Linux and the fact that it comes with free
industry strength compilers and debuggers as well as hundreds of rock
solid C apps complete with build packages makes it a good bet for
someone to learn how to structure and analyse an application in
C. Windows can not compete with that IMO.
it's a perfectly good development platform though. Look Linux is ok.
Windows is ok. But two posters had encouraged the OP to move to
Linux. I'm not sure its such an open and shut decsision. You can
write crap non-portable code on Linux as well.


But you don't have the facilities to support his learning as well IMO :
remember me mentioning the thousands of apps with source code? No one
is saying Linux is "better", but I would say it's better for someone
wanting to get into C programming big time : everything is part of the
base install. For free. And documented.
> (some people only use them when they suspect a compiler error). A
If there is a compile error, you wont be using a debugger.


well compiler bugs are about as rare as hens teeth. But why not? The
debugger would show you that the program flow or calculation results
did not follow the source code. All my tools are broken if the source
code does not represent what the machine is doing.

To be honest I can't remember the last time I had a bug caused by a
compiler error


Different things. I meant compilation error : sorry.
> newbie should be aware there are different ways to do things and not
> lock themselves into a particular approach too early.
A newbie needs to examine other peoples stuff : especially when
adding/fixing bugs. Approaches vary, but it can never hurt to get down
and dirty to be sure.

A newbie certainly shouldnt dedicate his life to printf()'s : they
simply do not work in a lot of environments, are inflexible, and only
show the data that you WANT to see, not the data you SHOULD see. This
is where a debuggers "locals" view etc comes in. You see alllocal
data. Not what someone thinks they need to see.
> Maybe I'll even get my debugger out and step through a program
> sometime to see if it brings me any insights. Maybe you should try
> a "debugger free day" and try and see what you have to do to manage
> without. Try reasoning about the program. Consider invariants and
> post/pre conditions. Try adding asserts etc.


If you recall I did mention that a home brew logging system is
prefential to printf :


where did the words "homebrew logging system" appear in the paragraph
above? I did write a fairly blunt paragraph here. And then thought
better of it.
Do you know what an invarient is? Design by contract?


I used those words. It's when you wrap whatever underlying logging
system is convenient to you in a fairly generic calling interface : so
you could log to files, text consoles, window systems, whatever,
without changing the calling code.
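
A minimal sketch of that kind of wrapper (all names invented): the
callers only ever see app_log(), and swapping the sink function is
what decides where the text ends up.

    #include <stdarg.h>
    #include <stdio.h>

    /* generic logging front end: calling code uses app_log();
       the sink decides whether that means stderr, a file,
       a window, or anything else with the same signature */
    typedef void (*log_sink)(const char *msg);

    static void sink_stderr(const char *msg)
    {
        fputs(msg, stderr);
    }

    static log_sink current_sink = sink_stderr;

    static void app_log(const char *fmt, ...)
    {
        char buf[512];
        va_list ap;

        va_start(ap, fmt);
        vsnprintf(buf, sizeof buf, fmt, ap);
        va_end(ap);
        current_sink(buf);
    }

    int main(void)
    {
        app_log("starting up, answer = %d\n", 42);
        return 0;
    }

(vsnprintf is C99; with C90 you would have to be a bit more careful
about buffer sizes.)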
I have nothing against them and have
implemented many with various backend report generators to examine the
data.
good!
A debugger isnt the only tool : and I never said it was. What it
can do is allow you to see the flow of an application while watching
live data, allow you to modify that data


never felt the need. Not since compilers got fast enough to run during
the day.


I really don't understand this. What has that got to do with a debugger?
and to examine and ensure
data typing is consistant from skillful use of register/pointer/memory
examinations : it is why they exist.


you examine registers? On a deeply embedded system, ok. But a
server?
I also mentioned memory and pointers. And yes I do. Very useful in
debugging big C systems.
> no way is the One True Way
>
And no where did I say it was. The whole crux here is you doubting
that stepping an existing app can help a user understand it : after
many, many and varied projects on various platforms in various
languages I find it incredulous how you could doubt this would be
beneficial. In order to even put in these printfs() you need some
understanding of whats going : that does not come from a print out all
the time. It does not some from a func spec which is not always
there. It does not come from holistic overview : it comes from
stepping through and examining what is going on. From finding out when
and why certain modules are called. From knowing what that little bit
of bit fiddling results in, from forcing a function call at runtime in
order to test a different calling parameter. The list goes on. Of
course none of this is "the only way" : but its a good way and one
thats been used a lot for years.


I'd say the same went (in spades) for stepping. You've got to know
where
to step. There's source code, case tools, source browsers, source code.
I've even resorted to grep to find callers of functions.


grep is OK : if I don't have a decent IDE I use something like Emacs
tags. I haven't used grep in about 15 years :)
Do you need a debugger for a ten line string rversal func? Maybe not :
but I'll tell you a little secret, I'd still use one to test it
rigorously before handing it in to a system for an integration
test. Total extra effort? About 5 minutes.


I'd write unit test. Also five minutes work and more likely to give
proper
coverage.
I realise that there is a core here who seem to think a debugger is
almost evil : I sometimes question if they have used a real
debugger on a real system in the real world where programmers are
cycled on and off projects and people want to optimise their "up to
speed" figures.


normally I don't get into what I work on, but yes I do work on projects

where people get cycled on and off. I'm one of the cycled. It's a real
system. In the real world. Real people pay real money for the
facilities
it provides. And there is a requirement for more performance.

I'm not accusing you of it, but sometimes people seem to spend way
too much time hunched over a debugger when a bit of thought might
save them some time.


No doubt. And equally the opposite applies : in my experience more so.
I realise that according to some posters some projects
get by without a debugger : I've never had the pleasure of such a
project/system.
>> "A desk is a dangerous place from which to view the world" - LeCarre.
>
> "there is nothing as practical as a good theory" Lewin


Mar 11 '06 #19
"Nick Keighley" <ni******************@hotmail.com> writes:
Richard G. Riley wrote:
"Nick"posted the following on 2006-03-11:
> Richard G. Riley wrote:
>> "Chris"posted the following on 2006-03-10:
>> > Richard G. Riley wrote:
>> >> "Nick"posted the following on 2006-03-10: >> >>> I thought you were implying Linux was better because the
>> >>> *debugger* was
>> >
>> >> There is no "the debugger" : although gdb is prevalent in Linux -
>> >> albeit with several front ends.
>
> "debugger" / "debuggers" whatever. I've used ddd.


ddd is a front end to gdb.


I know
>> >>> better. I've never stepped through an existing application
>> >>> (that wasn't broken) with a debugger. If you say it's a good
>> >>> way to learn C, who am I to argue.
>
> I stress that I talking about "stepping through an application". I have
> *no*
> objection to examining existing code. One way to learn is to look at
> good
> examples (and sometimes at bad).


And how can stepping through an app be bad? Were you never put onto a
new project with a code base of several hundred thousand lines of code
and told to isolate some relatively bugs to get you familiar with the
code base?


about a year ago I was put on 750 kloc application. I had no previous
experience with the application. And limited experience with the
programming language. I did not step through the code with a debugger.

[...]

If I might summarize:

Richard G. Riley really likes debuggers. Not everyone else likes them
as much as Richard does.

I believe that covers all the relevant points.

Next?

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Mar 11 '06 #20
On 2006-03-11, Keith Thompson <ks***@mib.org> wrote:

If I might summarize:

Richard G. Riley really likes debuggers. Not everyone else likes them
as much as Richard does.

I believe that covers all the relevant points.

Next?


Very droll.

But that's not at all what the crux of this is about : what it's about
is one way of working which can dramatically speed up one's familiarity
with a system. I find it terrifying that there seems to be a fair
number of programmers here who think a debugger is "only for finding
bugs" : it is a limited and naive view.

Inserting printfs, asserts and so forth is time consuming, messes up
the code, limits one to what the programmer wanted you to see at
"write time", and is a colossal source of heisenbugs. Unless you're
perfect that is : which some here have indeed claimed to be.

And for using such a tool to be called "bizarre" is bizarre in
itself. Yes, yes, I know what Kernighan said : good on him.

The entire thread started with discussing how a new programmer could
familiarise himself with building & architecting apps : I'm afraid that
"look at the source code" simply isn't a credible answer. One learns by
doing, and examining a running system compiled with debug info is one
of the most productive tools in the trainer's arsenal.

Next?
Mar 11 '06 #21
On 11 Mar 2006 23:19:50 GMT, in comp.lang.c , "Richard G. Riley"
<rg****@gmail.com> wrote:
with a system. I find it terrifying that there seems to be a fair
number of programmers here who think a debugger is "only for finding bugs"
: it is a limited and naive view.


Is this thread still going, or has my news provider regurgitated it?
Either way, I find it terrifying that someone thinks a debugger is for
programme analysis and testing. Yike.
Mark McIntyre
--
"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are,
by definition, not smart enough to debug it."
--Brian Kernighan

Mar 12 '06 #22
Keith Thompson wrote:
.... snip ...
If I might summarize:

Richard G. Riley really likes debuggers. Not everyone else likes
them as much as Richard does.

I believe that covers all the relevant points.


Which buggers?

--
"If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers." - Keith Thompson
More details at: <http://cfaj.freeshell.org/google/>
Also see <http://www.safalra.com/special/googlegroupsreply/>
Mar 12 '06 #23
Richard G. Riley wrote:
On 2006-03-11, Keith Thompson <ks***@mib.org> wrote:
If I might summarize:

Richard G. Riley really likes debuggers. Not everyone else likes them
as much as Richard does.

I believe that covers all the relevant points.

Next?

Very droll.

But not at all what the crux of this is about : what it's about is
one way of working which can dramatically speed up one's familiarity
with a system. I find it terrifying that there seems to be a fair
number of programmers here who think a debugger is "only for finding bugs"
: that is a limited and naive view.

There is the middle ground: read the code first and step through the tough
bits you can't suss out in the debugger. A decent source browser (one that
gives you callers and callees) is a better tool for the job; it gives you an
overview of the code, rather than the details of one part of it.

Using a debugger as the main tool for familiarising oneself with a
large body of code is a bit like using a microscope to read a book.

--
Ian Collins.
Mar 12 '06 #24
Richard G. Riley wrote:
On 2006-03-11, Nick Keighley <ni******************@hotmail.com> wrote:
Richard G. Riley wrote:
"Nick" posted the following on 2006-03-11:
> Richard G. Riley wrote:
>> "Chris" posted the following on 2006-03-10:
>> > Richard G. Riley wrote:
>> >> "Nick" posted the following on 2006-03-10:
[ a long debate partly about using a debugger to analyse existing,
working
applications. Richard G. Riley thinks it's a good idea. I (Nick
Keighley) don't]

ok. This is *way* off topic. It has gone on too long. And the post has
grown to gigantic proportions. I'm *really* going to stop now (I think
I said that before...).

We are both probably frustrated that the other doesn't seem to
understand our perfectly reasonable point of view. And everyone else
on clc is probably even more frustrated that we both keep banging on.

What I do find odd is that we have drawn opposite conclusions for the
same reason. Neither of us likes doing things manually that can be
automated (a sign of a good programmer). So Richard objects to using
printf() (or a custom logger) because it is manual, error prone and may
disturb the program, and prefers using a debugger to achieve the same
end. I object to the use of a debugger ***for exactly the same reasons***!

<snip>
I find [a debugger] a good teaching tool to get people up
to speed with an app and its internal data structures : modify on the
fly, symbol tables etc. Can't do that with a printout or printfs.


I have *never* had any desire to do these things. Find the bug. Fix
the bug. Why would you want to modify things on-the-fly?!


Modify data to test or force a state. It's a basic part of debugger use : you
can force the issue to avoid having to wait for a certain random data
set to trigger the bug.

e.g. someone says "I'm getting a divide by zero somewhere near func()".

Go to func() in the debugger, force a zero into the divisor and see why the
checks aren't stopping it. Simple example.


no. Put an assert() in. A modified variable is just for today. An
appropriate assert() lasts forever. Next year when someone casts doubt
on the same function, you simply look at the func() source code and say
"nope, can't be a divide by zero because there's an assert() to check
for that". Fix it once, never touch it again.
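
(For illustration only: a minimal sketch of the kind of guard being argued
for here. The function name and parameters are hypothetical stand-ins for
the func() in the example above.)

    #include <assert.h>

    /* Hypothetical stand-in for func(). The assert documents the
     * precondition and fires in a debug build the moment a zero divisor
     * reaches this point, with no debugger session needed. */
    long func(long numerator, long divisor)
    {
        assert(divisor != 0);
        return numerator / divisor;
    }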

<snip>
>> I fail to see how analysing existing, successful apps can be anything
>> other than beneficial. It doesn't take away the donkey work of learning
>> the language basics, but it can make text book "science" much more
>> accessible and "real". I can't imagine becoming a programmer without
>> such practice, guidance and "practical training". It's the same in all
>> walks of life.
>
> I think I'll give up here. My point wasn't that important. Just trying
> to make the point that different people do things in different ways. Linux
> is not the only platform. Not everyone uses debuggers the way you do

Nobody said it was : you came flying in with that.


riight


You did. Look at the thread. The OP mentioned Linux : I said it was a
good idea for what he wanted to do : you came in with "hold on hold on
let's not start an OS war here".


a debugger war is much more fun...

<snip>
> Maybe I'll even get my debugger out and step through a program
> sometime to see if it brings me any insights. Maybe you should try
> a "debugger free day" and try and see what you have to do to manage
> without. Try reasoning about the program. Consider invariants and
> post/pre conditions. Try adding asserts etc.

If you recall I did mention that a home brew logging system is
preferable to printf :


where did the words "homebrew logging system" appear in the paragraph
above? I did write a fairly blunt paragraph here. And then thought
better of it. Do you know what an invariant is? Design by contract?


I used those words. It's when you wrap whatever underlying logging system is
convenient to you in a fairly generic calling interface : so you can
log to files, text consoles, window systems, whatever, without changing
the calling code.
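
(For illustration only: a minimal sketch of such a wrapper. The names
log_msg(), log_set_sink() and stderr_sink() are hypothetical, not taken
from any particular logging library.)

    #include <stdarg.h>
    #include <stdio.h>

    /* Callers always use log_msg(); where the text actually goes is
     * decided by whichever sink has been installed, so the calling
     * code never changes when the output target does. */
    typedef void (*log_sink)(const char *line);

    static void stderr_sink(const char *line)
    {
        fprintf(stderr, "%s\n", line);
    }

    static log_sink current_sink = stderr_sink;

    void log_set_sink(log_sink sink)
    {
        current_sink = sink;
    }

    void log_msg(const char *fmt, ...)
    {
        char buf[256];
        va_list ap;

        va_start(ap, fmt);
        vsnprintf(buf, sizeof buf, fmt, ap);
        va_end(ap);
        current_sink(buf);
    }

A sink that writes to a file, a console window or a syslog-style daemon
can then be swapped in with a single call to log_set_sink().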


I note you didn't say if you knew what "Design by Contract" meant.

<snip>
A debugger isn't the only tool : and I never said it was. What it
can do is allow you to see the flow of an application while watching
live data, allow you to modify that data


never felt the need. Not since compilers got fast enough to run during
the day.


I really don't understand this. What has that got to do with a debugger?


I've modified variables using the debugger (the "debugger" actually had
toggle switches...) when the run time of the compiler was significant.
If it took hours to recompile your code then machine code patches and
register hacking was acceptable. It is no longer necessary (well, sometimes).

<snip>
and to examine and ensure
data typing is consistent from skillful use of register/pointer/memory
examinations : it is why they exist.


you examine registers? On a deeply embedded system, ok. But a
server?
I also mentioned memory and pointers. And yes I do. Very useful in
debugging big C systems.


wow. Culture Shockville. To my shame I couldn't even tell you what
registers
my platform has without STFWing. Ok that's a project for today.

<snip>
The whole crux here is you doubting
that stepping an existing app can help a user understand it : after
many, many and varied projects on various platforms in various
>> languages I find it incredible that you could doubt this would be
>> beneficial. In order to even put in these printfs() you need some
>> understanding of what's going on :

<snip>
I'd say the same went (in spades) for stepping. You've got to know where
to step. There's source code, case tools, source browsers, source code.
I've even resorted to grep to find callers of functions.


grep is ok : if I don't have a decent IDE I use something like Emacs
tags. I haven't used grep in about 15 years :)


ok, it's a tool of desperation. You *never* use grep?!

<snip>
--
Nick Keighley

Testing can show the presence of bugs, but not their absence.
-- Dijkstra

Mar 12 '06 #25
On 2006-03-12, Ian Collins <ia******@hotmail.com> wrote:
Richard G. Riley wrote:
On 2006-03-11, Keith Thompson <ks***@mib.org> wrote:
If I might summarize:

Richard G. Riley really likes debuggers. Not everyone else likes them
as much as Richard does.

I believe that covers all the relevant points.

Next?
Very droll.

But not at all what the crux of this is about : what it's about is
one way of working which can dramatically speed up one's familiarity
with a system. I find it terrifying that there seems to be a fair
number of programmers here who think a debugger is "only for finding bugs"
: that is a limited and naive view.

There is the middle ground: read the code first and step through the tough
bits you can't suss out in the debugger. A decent source browser (one that
gives you callers and callees) is a better tool for the job; it gives you an
overview of the code, rather than the details of one part of it.

100% of course. *Nothing* is as good as a good clear printout for an
overview and a familiarity. Hell, I print out man pages still.

Using a debugger as the main tool for familiarising oneself with a
large body of code is a bit like using a microscope to read a book.

I'm afraid that due to some of the replies I might have become a bit
exasperated and come across as a bit of a fanatic :(

But to close and to summarise some points which a lot of posters don't
seem to be aware of - or at least it appears that way from their
"debuggers are only occasionally useful" replies -

1) A debugger can give you that high-level overview too
2) It has cross-reference facilities for finding definitions
3) Ability to give a symbolic representation to data
4) Ability to tweak at runtime to force unlikely conditions
5) Ability to see the entire dataset : not just what the original
   programmer deemed useful
6) Watchpoints & HW breakpoints to isolate required conditions.
7) Shows types : reveals bad casts & conversions very quickly in the
   data window.
There are more. A good printout is good if for nothing else than
sitting on the jacks having a read :) Or annotation. The combo of a
good debugger with the experience of how to use it and possibly a
profiler is nirvana for a C programmer working on a large project.

I've seen old systems where the code is simply unreadable :
multiple statements per line, single-letter variable names, cast city :
code like this can be a nightmare to read from a printout - it is
useful to watch the program run. Yes, printfs* can be useful if
you have the knowledge and the facility to insert them.

(*) Or a home-brew, multi-platform/interface-compatible logging system
like Linux's klogd.

One last point about your microscope comment : this is only true if
you *step* every line. You don't. You place strategic breakpoints at
the page or chapter you are interested in : after all, a C program is
hardly a novel - though it is a schematic :)

And to anyone who got this far and isn't totally bored by the whole
thing : consider this. You see a function in a source browser :
are you not interested in HOW it was called? Why it was called? You can
guess from 30000 pages of printout or from a call hierarchy in a source
analysis tool or IDE : but it's two seconds' work to set a breakpoint
there and examine a few call stacks.

Yes, I'm an advocate. And astonished by the claims of some who say
they haven't used a debugger more than twice in over 20 years of C programming.


Mar 12 '06 #26
On 2006-03-12, Nick Keighley <ni******************@hotmail.com> wrote:

no. Put an assert() in. A modified variable is just for today. An
appropriate assert() lasts forever. Next year when someone casts doubt
on the same function, you simply look at the func() source code and say
"nope, can't be a divide by zero because there's an assert() to check
for that". Fix it once, never touch it again.
That's not the example I gave. I gave the example of forcing the divide
by zero error : with no added code which changes the footprint between
the release and debug builds. To expand a bit, I might want to check that a
deeply embedded function can handle an integer limit or a null
pointer. One very important issue here is that I don't do this all the
time - only when it's deep in a system where a quick glance can't
guarantee the absence of such "limits". I might want to call
"convertBitmap()" with option 3 for GIF, since the test data framework
didn't include that option. Or option 6, which is not yet implemented.

The bottom line is also that some systems don't want to exit : they want
to handle the assert condition more generously. assert() is one up from
printf(), mind you. At least you can compile it out without running
around changing all your source. Not that I agree with a code footprint
which differs between "real" and "development" builds.
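
(For illustration only: a minimal sketch of a check that reports and
carries on instead of exiting the way assert() does. The macro name
CHECK and the helper report_failure() are hypothetical.)

    #include <stdio.h>

    /* Hypothetical handler: log the broken condition and keep running,
     * rather than calling abort() as assert() would. */
    static void report_failure(const char *expr, const char *file, int line)
    {
        fprintf(stderr, "check failed: %s (%s:%d)\n", expr, file, line);
    }

    /* Evaluates to 1 when the condition holds, 0 when it does not,
     * so the caller can recover instead of dying. */
    #define CHECK(cond) \
        ((cond) ? 1 : (report_failure(#cond, __FILE__, __LINE__), 0))

A caller can then write something like
if (!CHECK(divisor != 0)) return -1; and handle the failure however the
system prefers.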

Also, how many times have we found

assert(a++);

as opposed to "assert(a);a++" in some remote rarely called routine?

Bad? Yes. But shit happens.

If a permanent check is what you want, then yes, an assert is useful.
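
(To spell out why that form bites: when NDEBUG is defined, the whole
assert() expression is compiled out, so any side effect hidden inside it
silently disappears. A minimal illustration, with bump() hypothetical:)

    #include <assert.h>

    static int a = 1;   /* stand-in for the 'a' in the example above */

    void bump(void)
    {
        /* Wrong: assert(a++); - when NDEBUG is defined the whole
         * expression is compiled out, so 'a' is never incremented
         * in the release build and behaviour silently changes. */
        assert(a);   /* check only - no side effects inside   */
        a++;         /* the increment survives in every build */
    }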

a debugger war is much more fun...

It can be... (PS: this is far from OT : it is about using C language mechanisms
as debugging aids which remove the need for a HW/SW debugger). If it
were in the standards group, then yes.

<snip>
>> > Maybe I'll even get my debugger out and step through a program
>> > sometime to see if it brings me any insights. Maybe you should try
>> > a "debugger free day" and try and see what you have to do to manage
>> > without. Try reasoning about the program. Consider invariants and
>> > post/pre conditions. Try adding asserts etc.
>>
>> If you recall I did mention that a home brew logging system is
>> preferable to printf :
>
> where did the words "homebrew logging system" appear in the paragraph
> above? I did write a fairly blunt paragraph here. And then thought
> better of it. Do you know what an invariant is? Design by contract?
I used those words. It's when you wrap whatever underlying logging system is
convenient to you in a fairly generic calling interface : so you can
log to files, text consoles, window systems, whatever, without changing
the calling code.


I note you didn't say if you knew what "Design by Contract" meant.


It has nothing to do with what I am advocating here. And so I won't get
dragged into a terminology war. I'm not talking about designing
anything here. And if I wanted to bluff, Google is a mere keypress away
:-;

Remember that the crux of this is that I'm saying it can be
advantageous for someone wanting to familiarise themselves with a system
to see it running under a good debugger. You claimed this was "bizarre".
<snip>
>> A debugger isn't the only tool : and I never said it was. What it
>> can do is allow you to see the flow of an application while watching
>> live data, allow you to modify that data
>
> never felt the need. Not since compilers got fast enough to run during
> the day.
I really don't understand this. What has that got to do with a debugger?


I've modified variables using the debugger (the "debugger" actually had
toggle switches...) when the run time of the compiler was significant.
If it took hours to recompile your code then machine code patches and
register hacking was acceptable. It is no longer necessary (well, sometimes).


You must work on very small systems. I would never countenance someone
modifying code here, there and everywhere to *find* a bug when you can
get the same effect in less time without modifying the code.

What are toggle switches? What is a machine code patch in this
context? I would rarely advocate register hacking : the whole point of
a debugger is that it gives symbolic access to your system variables
and messing with registers is not required.
<snip>
>> and to examine and ensure
>> data typing is consistent from skillful use of register/pointer/memory
>> examinations : it is why they exist.
>
> you examine registers? On a deeply embedded system, ok. But a
>> server?
I also mentioned memory and pointers. And yes I do. Very useful in
debugging big C systems.


wow. Culture Shockville. To my shame I couldn't even tell you what
registers
my platform has without STFWing. Ok that's a project for today.

<snip>
>> The whole crux here is you doubting
>> that stepping an existing app can help a user understand it : after
>> many, many and varied projects on various platforms in various
>> languages I find it incredible that you could doubt this would be
>> beneficial. In order to even put in these printfs() you need some
>> understanding of what's going on :
<snip>
> I'd say the same went (in spades) for stepping. You've got to know where
> to step. There's source code, case tools, source browsers, source code.
> I've even resorted to grep to find callers of functions.


grep is ok : if I don't have a decent IDE I use something like Emacs
tags. I haven't used grep in about 15 years :)


ok, it's a tool of desperation. You *never* use grep?!


I used it today to find where a certain daemon was in a boot hierarchy
: nice.

OK, let's call it a day.
<snip>
--
Nick Keighley

Testing can show the presence of bugs, but not their absence.
-- Dijkstra

Mar 12 '06 #27
