Bytes | Software Development & Data Engineering Community

Analysis Paralysis

After I read Kernighan and Ritchie, I realized I knew nothing about C.

After I read Lippman, I realized I knew nothing about C++.

After I read Meyers, I realized I knew nothing about OO.

After I read Gamma, I realized I knew nothing about Patterns.

After I read Stroustrup, I realized I knew nothing about the STL.

Now that I'm reading Alexandrescu, I realize I know nothing about
Templates.

My exasperated question to you then, dear Usenet, is this:

....
Is it safe to start coding now? ;-)


That is to say, at what point can I feel brazen enough to battle this
Wes Craven-ish dread that the next book I pick up is going to very
politely, very eloquently, and very, very sensibly inform me that
everything I've been doing is wrong?

With tongue firmly in cheek,
(yet eyes firmly fixated on responses)

-Jeff

Sep 10 '06 #1
Jeff_Ch wrote:
That is to say, at what point can I feel brazen enough to battle this
Wes Craven-ish dread that the next book I pick up is going to very
politely, very eloquently, and very, very sensibly inform me that
everything I've been doing is wrong?
If you read, and even if you completely understand, all good programming
books in the world, but have not programmed anything, you know nothing.

Start coding now, don't care if you don't use the most recommended, robust,
portable and reusable way in each and all cases or not.

--
Salu2
Sep 10 '06 #2
In article <11**********************@i42g2000cwa.googlegroups.com>,
je*********@cox.net says...
After I read Kernighan and Ritchie, I realized I knew nothing about C.
[ mention of more fine books elided .... ]
Is it safe to start coding now? ;-)
Sure.
That is to say, at what point can I feel brazen enough to battle this
Wes Craven-ish dread that the next book I pick up is going to very
politely, very eloquently, and very, very sensibly inform me that
everything I've been doing is wrong?
If you treat "open to improvement" as meaning wrong, then you might as
well give up now -- programming is basically little more than a precise
expression of thoughts. Given the (huge) difference between how we have
to express our thoughts for the computer to understand and how we
typically express thoughts otherwise, there's still room for drastic
improvement. Given people's inventiveness in this area, I expect that
improvement to continue for a long time to come.

--
Later,
Jerry.

The universe is a figment of its own imagination.
Sep 10 '06 #3
Jeff_Ch wrote:
Is it safe to start coding now? ;-)
Nope. Read Refactoring, Refactoring to Patterns, and Test Driven
Development.

After you do, it's always safe to start coding.

One fallacy of reading all those other books is that they can make you think
that all of OO and design means you must get the design right, the first
time, or you are screwed.

The truth is that OO has always meant you can change a design on the fly, as
you add features that need new design elements. All those old authors
followed practices very similar to refactoring, but they never bothered to
tell you how. (And many other authors actually tell you that only wise
programmers design up-front.)
That is to say, at what point can I feel brazen enough to battle this
Wes Craven-ish dread that the next book I pick up is going to very
politely, very eloquently, and very, very sensibly inform me that
everything I've been doing is wrong?
Why read the f---er otherwise??

--
Phlip
http://www.greencheese.us/ZeekLand <-- NOT a blog!!!
Sep 10 '06 #4
Julián Albo wrote:
If you read, and even if you completely understand, all good programming
books in the world, but have not programmed anything, you know nothing.
Excellent advice! My apologies however, for I'm now scrolling up and
realizing I took way too many poetic liberties in my original post
(note to self: more sleep, less coffee).

I should have phrased my question much more succinctly as:

Given the considerable growth spurts over the last decade or so in
understanding the subtleties of this language, and given the (broad)
list of topics covered above, are there any other areas of current
exploration/research which come to mind that a 21st century coder would
feel he/she shouldn't be without? I'm mostly thinking along the lines
of what Boost/gnu might be toying with, but in the framework of the
current standard, specifically if there may be dangers lurking about
that stdlib0x would set out to tame.
Jerry Coffin wrote:
If you treat "open to improvement" as meaning wrong, then you might as
well give up now -- programming is basically little more than a precise
expression of thoughts. Given the (huge) difference between how we have
to express our thoughts for the computer to understand, and the typical
way of expressing thoughts otherwise, there's still room for drastic
improvement. Given people's inventiveness in this area, I expect to see
that improvement to continue happening for a long time to come.
Later,
Jerry.

Thanks Jerry ... indeed the inventiveness is what has me in such a
frenzy to gain some sort of confidence that I might be 'up to speed',
in some sense! I find it interesting, though, that schools of thought
about C++ seem to have changed dramatically over the years (to me at
least!) -- not in response to a change in the language or an enhanced
capability, but in the way the fundamentals are viewed. Perhaps this
impression is simply my own personal biases getting in the way.

I suppose the tone I'm attempting to take here (given the sleep/coffee
caveat above) is along the lines of Scott Meyers' foreword to the 2nd
ed of Alexandrescu's book, where he describes a hesitation to include
templates in his own book, knowing full well that a revolution in
template understanding was on the horizon.
So I guess all I'm really asking is .... anyone notice any other
revolutions that might be hiding around the next corner?

Thanks again all,
-Jeff

Sep 10 '06 #5

Phlip wrote:
Jeff_Ch wrote:
Is it safe to start coding now? ;-)

Nope. Read Refactoring, Refactoring to Patterns, and Test Driven
Development.
Thanks Phlip, duly noted. I had a sneaking suspicion that
Refactoring was going to pop up in this discussion.

After you do, it's always safe to start coding.
Ah yes, I see I made an utterly poor choice of words up there ...
really I just meant to use "start coding" as a metaphor for some sort
of wishy washy statement like "feel confident my programs are both
robust and resilient". Nevertheless, duly noted as well. :)
One fallacy of reading all those other books is that they can make you think
that all of OO and design means you must get the design right, the first
time, or you are screwed.
Now you're directly addressing a concern of mine ....

The truth is that OO has always meant you can change a design on the fly, as
you add features that need new design elements. All those old authors
followed practices very similar to refactoring, but they never bothered to
tell you how. (And many other authors actually tell you that only wise
programmers design up-front.)
Okay this seems much more reasonable, and comforting in that it's much
closer to what I believed before. I've spent the last few years on a
project in ummm .... < ... lowers voice considerably ... something
seventyseven....>. Now that I'm free again to quit thinking
procedurally, I did a quick topical review of what you good folks have
been up to. However, a Wikipedia search gives me the (I assume safe to
say) 'false' impression that if I don't understand why a particular
squiggly line on an OMT diagram is pointed up vs down on a RUP Use Case
thingy, then I might as well give up and find a job pulling cables.

Certainly these design patterns seem to have quite a bit of merit, I
would guess more so in the apps industry than in M&S where I'm coming
from (although I do wonder if there's anything to gain by viewing the
scientific method through this lens). But more recent literature
cautions strongly against what I gather to be some basic tenets of
patterns, for one the overhead of run-time polymorphism (if I'm
interpreting all this correctly).

So needless to say, before I get to work on the next 50k sloc sim, I'm
pretty damn curious which idioms I can still reliably put my faith
into, and which may have fallen by the wayside.

That is to say, at what point can I feel brazen enough to battle this
Wes Craven-ish dread that the next book I pick up is going to very
politely, very eloquently, and very, very sensibly inform me that
everything I've been doing is wrong?

Why read the f---er otherwise??
Read??? You mean he's produced something other than Freddie?

Sep 10 '06 #6
Jeff_Ch wrote :
Given the considerable growth spurts over the last decade or so in
understanding the subtleties of this language, and given the (broad)
list of topics covered above, are there any other areas of current
exploration/research which come to mind that a 21st century coder would
feel he/she shouldn't be without?
Concurrent programming maybe.
Sep 10 '06 #7
In article <11**********************@d34g2000cwd.googlegroups.com>,
je*********@cox.net says...

[ ... ]
So I guess all I'm really asking is .... anyone notice any other
revolutions that might be hiding around the next corner?
It's hard to say -- but only a question of timing, not whether there
will be revolutions.

For years I've maintained that programming is following much the same
route as database management did years before. We started with more or
less ad-hoc flat-file management of our data (functions, mostly). We
progressed to single-inheritance hierarchies, which are a whole lot like
hierarchical databases. The next step after that in database technology
was network databases, which are quite similar to multiple-inheritance
in programming.

With generic programming, we're working in the general direction of a
relational database -- instead of directly defining the relationships in
terms of inheritance, we can write something generically that simply
requires the right contents/capabilities in the type being manipulated.

Right now in C++, however, we're doing a lot of that based on naming
conventions -- for example, the standard library contains quite a few
utility classes (e.g. std::unary_function) that do virtually nothing but
establish a set of names for specific items, to more or less enforce
those naming conventions. We also have a fair amount of code that uses
(for example) less<T> instead of operator<, because the former provides
us with a uniform name, whereas the latter doesn't really even have a
name when applied to some (e.g. built-in) types.

This leaves a lot of possible problems. One is that it's fairly
difficult to directly express the requirements of a particular template,
but easy to write code that works to the extent of compiling, yet
breaks all sorts of templates. Another is that it simply adds quite a
bit that needs to be learned, much of it oriented far more toward
memorization than any real understanding.

I'm not at all sure anything along that line will really be the next big
thing though. I think we've been moving in that general direction for
quite a while. Intentional Programming was intended (no pun intentional
or intended ;-) to provide something more or less along that line -- but
it doesn't seem to have taken the world by storm, or anything like that.
The same seems to be true of quite a bit of work with things like
explicit categories of iterators.

One big difference is that in C++ (or anything similar), nobody is very
willing to take a big performance hit just because something might fit a
cool new model. OO was around for a long time, but stayed decidedly on
the sidelines until people worked out ways of making it at least
reasonably competitive in terms of runtime speed and such. Even so, as
you've noted, modern C++ often makes heavy use of templates. Part of
this is the run-time cost of purely OO-based designs.

With that in mind, I think we're going to see a lot more work in the
general direction of the relational model, but probably not a wholesale
movement that ignores efficiency almost completely, as happened in
databases (even with advances in hardware, many SQL databases are still
roughly on a par with hierarchical databases running on '60s hardware).

--
Later,
Jerry.

The universe is a figment of its own imagination.
Sep 10 '06 #8

Jeff_Ch wrote:
After I read Kernighan and Ritchie, I realized I knew nothing about C.

After I read Lippman, I realized I knew nothing about C++.

After I read Meyers, I realized I knew nothing about OO.

After I read Gamma, I realized I knew nothing about Patterns.

After I read Stroustrup, I realized I knew nothing about the STL.

Now that I'm reading Alexandrescu, I realize I know nothing about
Templates.
For heaven's sake, stop reading and start writing code! Without feedback
from real use, you'll never know which of all that is useful to you and
for what you are doing.

-- Bjarne Stroustrup; http://www.research.att.com/~bs

Sep 11 '06 #9
Jeff_Ch wrote:
<..snipped and reinstated below..>

bjarne wrote:
For heavens sake, stop reading and start writing code! Without feedback
from real use, you'll never know what of all that is useful to you and
for what you are doing.

-- Bjarne Stroustrup; http://www.research.att.com/~bs

Okay, as I sit here now with a silly grin across my face, I realize
just how much I botched my introduction. I suspect the Jersey sarcasm
doesn't translate well into print ;) Please allow me to start over...
Hey now folks, my name's Jeff. I've been a big fan of usenet since
deja days, but this is my first foray into posting. My background is
in physics and math, so I'm mostly programming from a mod and sim
point-of-view. As you can guess, C and Fortran comprised a major part
of my development, which for me has been the last 15 or so years.
Roughly 7-8 years ago I was working with Ada, and mildly poking around
in 'C w/ Classes', but really didn't start seriously considering
abstraction until '02ish. I've had the misfortune of spending the last
3 years buried firmly in f77, but now I can give C++ the majority of my
attention again.

[Here's the major clarifier I had blatantly disregarded above]
<wait for it...>
Having spent some quality time with books from these various authors
==over the years==, I can loosely couple the texts with the revelations
I experienced while reading them. With amateur poetic flair, they are:
After I read Kernighan and Ritchie, I realized I knew nothing about C.
After I read Lippman, I realized I knew nothing about C++.
After I read Meyers, I realized I knew nothing about OO.
After I read Stroustrup, I realized I knew nothing about the STL.
(now to be fair, the time difference between them should probably be a
log plot...)

Having been tasked to a C++ project, I gleefully surveyed usenet and
the web to see what I've been missing (yay for Wikipedia!). The
overwhelming majority of it seemed to be modeling-language related,
which I'm guessing is being required at most universities these days.
This was a major source of distraction for me, but underneath the
barrage of various flowchart definitions were these design patterns
that I still wasn't very familiar with, and they were obviously drawing
the attention of people interested in much more than classroom semantics.
After I read Gamma, I realized I knew nothing about Patterns.
So I went back to Gamma, had some fun with diagrams, coded up a few
ideas using them, and felt (or thought I felt) comfortable enough to
start blending these notions in with the project code. And here we
are:
Now that I'm reading Alexandrescu, I realize I know nothing about
Templates.
Now if you've stuck with this boring story so far, even after my
unintended initial misdirection, I do hope I can express my question
better, and that it might still have a chance of sparking interesting
discussion. ;-)

The discoveries that Templates have led to (and I'm conjecturing here,
please correct as seen fit!) provide a view of patterns that isn't
particularly congruent with the "textbook" approach taken around the
turn of the century. This is what I've been eagerly wrapping my brain
around most recently.

Now, had I gone ahead and barreled through implementing designs carte
blanche from GoF, at some point I'd imagine I would run into the
frustrations that Templates help overcome, and would end up reverse
engineering, (errr... 'refactoring' I suppose) a large portion of the
run-time polymorphism with the Template methods that I'm learning about
now.

This would have made Jeff very sad. Although probably all the wiser
for it.

Nevertheless, since it's on the customer's dime, I'm naturally led to
wonder,
==here it is, the real question!==

~~~~~~~~~~~~~~~~~~~
What other hidden gems have been discovered that I might want to
consider? Before pouring out many a class/object/template, are there
other items of interest not so easily found on Wikipedia that a
proactive programmer might want to at least assess first?
~~~~~~~~~~~~~~~~~~~

I keep looking in the direction of the Boost library, knowing that
standard, portable language features are brewing in there for a
not-too-distant release. I get the impression that understanding of
the non-trivial behavior in smart pointers is growing rapidly, so I'd
probably prefer to let them continue to incubate before attempting to
include them.
And now that I've thoroughly embarrassed myself (which is both fitting
and proper), I bid you fine folk a good night. :-)

Sep 11 '06 #10
Jeff_Ch wrote:
...With amateur poetic flair, they are:
After I read Kernighan and Ritchie, I realized I knew nothing about C.
After I read Lippman, I realized I knew nothing about C++.
After I read Meyers, I realized I knew nothing about OO.
After I read Stroustrup, I realized I knew nothing about the STL.
I got it. It's one of those Zen-like progression poems.

Nature and her laws lay hid by night.
God said, Let Newton be! and all was light.
The Devil, howling "Ho! Let Einstein be!"
Restored the status quo.
Having been tasked to a C++ project, I gleefully surveyed usenet and
the web to see what I've been missing (yay for Wikipedia!). The
overwhelming majority of it seemed to be modeling language related,
which I'm guessing is being required at most universities these days.
Right. Very little of the practice is modeling, yet you must know how. You
should be able to go from a diagram to code, or from code to a diagram.
Now if you've stuck with this boring story so far, even after my
unintended initial misdirection, I do hope I can express my question
better, and that it might still have a chance of sparking interesting
discussion. ;-)

The discoveries that Templates have led to (and I'm conjecturing here,
please correct as seen fit!) provide a view of patterns that isn't
particularly congruent with the "textbook" approach taken around the
turn of the century. This is what I've been eagerly wrapping my brain
around most recently.
Curiously, the biggest division within OO is between languages that default
to static typing and those that default to dynamic typing. Dynamic typing
has often meant "you can put in just any old type" -- and that's what
generics and templates are bringing to C++.
What other hidden gems have been discovered that I might want to
consider?
Test-Driven Development, Design by Contract, and Aspect Oriented
Programming.
Before pouring out many a class/object/template, are there
other items of interest not so easily found on Wikipedia that a
proactive programmer might want to at least assess first?
Block closures. But you can't get them in C++, yet.

However...

--it's safe to start a program without knowing any of this stuff <--

You should know the basics, like how not to corrupt memory. But after that,
just write a simple design that satisfies your customers, and repeat. Leave
the hard stuff for the next time you get bored.

(BTW per the original answers, I think mine was closest to your intended
question!)

--
Phlip
http://www.greencheese.us/ZeekLand <-- NOT a blog!!!
Sep 11 '06 #11

Phlip wrote:
Nature and her laws lay hid by night.
God said, Let Newton be! and all was light.
The Devil, howling "Ho! Let Einstein be!"
Restored the status quo.
hu hu hu

The coolest thread ever :-)

Sep 11 '06 #12

Julián Albo wrote:
Jeff_Ch wrote:
That is to say, at what point can I feel brazen enough to battle this
Wes Craven-ish dread that the next book I pick up is going to very
politely, very eloquently, and very, very sensibly inform me that
everything I've been doing is wrong?

If you read, and even if you completely understand, all good programming
books in the world, but have not programmed anything, you know nothing.
Sure you do. You know how to write theoretically perfect code to solve
purely hypothetical problems.

Sep 11 '06 #13

Noah Roberts wrote:
Julián Albo wrote:
If you read, and even if you completely understand, all good programming
books in the world, but have not programmed anything, you know nothing.

Sure you do. You know how to write theoretically perfect code to solve
purely hypothetical problems.
Sounds like what I learned when I was in college ;-)

Sep 11 '06 #14
Noah Roberts wrote :
Sure you do. You know how to write theoretically perfect code to solve
purely hypothetical problems.
Which is way more interesting anyway than writing cheap apps for some
random company.

Sep 11 '06 #15

loufoque wrote:
Noah Roberts wrote :
Sure you do. You know how to write theoretically perfect code to solve
purely hypothetical problems.

Which is way more interesting anyway than writing cheap apps for some
random company.
Well, I don't spend a lot of time working for some random company. I
usually work for one, the one that pays me. I think they might object
if I worked for random companies and they certainly wouldn't pay me for
the time spent with the other companies.

Sep 11 '06 #16

Noah Roberts wrote:
loufoque wrote:
Noah Roberts wrote :
Sure you do. You know how to write theoretically perfect code to solve
purely hypothetical problems.
Which is way more interesting anyway than writing cheap apps for some
random company.

Well, I don't spend a lot of time working for some random company. I
usually work for one, the one that pays me. I think they might object
if I worked for random companies and they certainly wouldn't pay me for
the time spent with the other companies.
You are still writing cheap apps for your ONE company :-)

Sep 12 '06 #17

Diego Martins wrote:
Noah Roberts wrote:
loufoque wrote:
Noah Roberts wrote :
>
Sure you do. You know how to write theoretically perfect code to solve
purely hypothetical problems.
>
Which is way more interesting anyway than writing cheap apps for some
random company.
Well, I don't spend a lot of time working for some random company. I
usually work for one, the one that pays me. I think they might object
if I worked for random companies and they certainly wouldn't pay me for
the time spent with the other companies.

You are still writing cheap apps to your ONE company :-)
Hehehe, actually we charge quite a sum for our products.

I better not say any more :P

Sep 12 '06 #18
