Bytes | Software Development & Data Engineering Community

Future reuse of code

Hi I'm developing a program and the client is worried about future
reuse of the code. Say 5, 10, 15 years down the road. This will be a
major factor in selecting the development language. Any comments on
past experience, research articles, comments on the matter would be
much appreciated. I suspect something like C would be the best based
on comments I received from the VB news group.

Thanks for the help in advance

James Cameron
Jul 19 '05
On Thu, 07 Aug 2003 04:44:39 GMT, "jce" <de*********@hotmail.com>
wrote or quoted :
are gullible) actually believed it for a moment :-)

http://www.sdmagazine.com/documents/s=819/sdm0204f/

I think it could be doable within a certain problem domain, for
example setting up simple databases with data validation, report
generation.

The interactions need not be free form English. They could be fill in
the blanks or interactive questioning.
--
Canadian Mind Products, Roedy Green.
Coaching, problem solving, economical contract programming.
See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Jul 19 '05 #101
On Fri, 08 Aug 2003 08:47:14 -0400, Joe Zitzelberger
<jo**************@nospam.com> wrote or quoted :

I prefer assembly language to everything...what does that mean?


The great appeal of writing the core of my Forth/Abundance interpreter
in assembler was that I knew exactly what was going on inside, down to
the bit level. Nothing was happening that I did not know about. This
desire for microcontrol and perfection comes best from writing in
Assembler. The only idiots you have to swear at are the folk who
designed the instruction set.

--
Canadian Mind Products, Roedy Green.
Coaching, problem solving, economical contract programming.
See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Jul 19 '05 #102
Paul Hsieh wrote:

COBOL and Pascal (the other groups you crossposted this message to)
will decrease in usage over time, not increase. There is absolutely
no new serious development being done in either language. In 15
years, Pascal will probably be completely dead, and the COBOL
community will be reduced even from the size of today's community
(human mortality alone will guarantee this.)


This may be true for COBOL, but Pascal is very much alive and kicking,
in the form of Delphi/Kylix. I am currently writing Kylix software; most
of the cutting-edge routines (the ones that do the real work rather than
the user interface) are straight plug-ins of 15-year-old Turbo Pascal
code. Now with Borland going for cross-platform (Windozze/Unix)
compatibility, there is no reason why Pascal should die in the
foreseeable future.
Jul 19 '05 #103
"Howard Brazee" <ho****@brazee.net> wrote in message news:<bh**********@peabody.colorado.edu>...
On 9-Aug-2003, ru**@webmail.co.za (goose) wrote:
who do you think will write a program that will run (recompiled if necessary)
on the greatest number of machines ?
Greatest number of machines.

I don't think someone writing a business application cares about, say
stoplights. But stoplights are machines with computer programs in them.

So the question should be - which language gives me an advantage in reaching
more prospective paying customers for my product with the least cost to me?


the original statement (which was snipped) was
----
Java has another huge advantage - it runs on anything without having to
spend more money.
<howls of laughter> pull the other one sonnyboy, it's got bells on :-)


OK, I exaggerated. But it runs on a lot more platforms than anything else
without costing more.

----

I have already pointed out that this is not true.

If my application runs best on big iron, that may be CoBOL. (Good for me, that
is my native programming language).
If my application is to show me on my hand held which golf club I need for my
next shot (according to my past history, a map of the course, and the GPS
satellite), then CoBOL isn't a good choice.

But if I am wanting to create a program that all of the students in a university
can use to interface with the campus's computers - I can assume most of them can
already run my XML and Java code.


and yet creating a std C program would not only get you that, it would also
get you a fairly snappy application *and* leave you open in the future
to be able to support those people who have machines that are not
capable of running java (certain designer palmtop-types) to *also*
interface with the campus machines.

java doesn't *buy* you anything extra in terms of portability.
The only relatively *portable* way I can think of is when writing
applets for web-pages (note: /relatively/). as long as the browser
has a java runtime environment, of course.

Java does have its advantages. Portability isn't one of them.

hth
goose,
I feel very strongly about the "while" loop. I suggest we take
it hostage to demand the release of the "goto" ;-)
Jul 19 '05 #104
In article <3f********@news.athenanews.com>, Peter E.C. Dashwood wrote:

"Dr Engelbert Buxbaum" <en***************@hotmail.com> wrote in message
news:bh*************@news.t-online.com...
Paul Hsieh wrote:

> COBOL and Pascal (the other groups you crossposted this message to)
> will decrease in usage over time, not increase. There is absolutely
> no new serious development being done in either language. In 15
> years, Pascal will probably be completely dead, and the COBOL
> community will be reduced even from the size of today's community
> (human mortality alone will guarantee this.)
This may be true for COBOL, but Pascal is very much alive and kicking,
in the form of Delphi/Kylix. I am currently writing Kylix software, most
of the cutting edge routines (that do the real work rather than the user
interface) are straight plug-ins of 15 year old Turbo-Pascal code. Now
with Borland going for cross-platform (Windozze/Unix) compatibility
there is no reason why Pascal should die in the foreseeable future.


There are 400,000,000 reasons why ALL procedural languages (including COBOL
and PASCAL) should "die" in the not-too-distant future. (I don't know your
definition of "foreseeable" but mine is around 20 years...)


Really? Please name and discuss them.
They are the number of people who access the internet every day. (For the
sake of this argument, I'll call them the "user base"...) They are not about
to become "computer programmers".
Indeed.
Instead, they will demand better interfaces, smarter software,
True
and MUCH better ways of developing computer systems than sequential Von
Neumann code.
On the contrary, especially for these kinds of users, sequential jobs are a
way of thinking that is normal to them.
Most of them are "smarter" and more "computer literate" than their
predecessors of even 10 years ago.
Yes. They are not scared anymore. OTOH the requirements on them have severely
increased also. I sometimes doubt whether increased computer literacy has
actually kept up with the added computer tasks for the average person.
They are not intimidated by computer technology, will happily interact
with smart software to achieve a result, and are not prepared to rely on
and wait for, remote, faceless, technocrats to provide them with computer
solutions to business problems.
Yes, they want smug, buzzword-talking con-men to take advantage of them?
We may have our own favourite Languages and we can poddle away in a corner
somewhere cutting code for the fun of it, but the real world demands that it
get solutions.
Exactly. So as long as my solution is good, and I can justify using a language,
what is the problem?
By 2015 a new generation of development software will see "programmers"
removed from the loop and end users interacting and iterating with smart
software until they get what they want.
Sure. The telepathic kinds.
Procedural code is already into Götterdämmerung.
It takes too long, requires too much skill,
Programming is what requires the skill. Not the language. If you studied programming
closer, you'd know that.
is too inflexible (the accelerating rate of change in the Marketplace and
in technology is another reason why it is doomed to extinction) and,
overall, costs far too much.
And where are your references for that? You don't even say what it is up
against, except some vague references to software which is going to
emerge as a winner in 2015 (and which I assume is telepathic, at least
going by your description).
skills... Why bother? Why should an Insurance company spend $50,000,000 a
year on in house IT when they could buy the service for $10,000,000?
Ah, but could they, and with the same secondary securities? Price is not the only
point of competition.
The only thing that COULD save procedural coding of solutions would be if
it priced itself back into the market. This MIGHT happen with offshore
outsourcing, but it is unlikely.

Bottom Line: Don't get smug about COBOL dying and PASCAL surviving; they are
on the same parachute and the ground is coming up....


Bottom Line: I think we can safely award you the "troll of the week" award, with
"don't panic" in nice friendly letters.
Jul 19 '05 #105


"Peter E.C. Dashwood" wrote:

"Dr Engelbert Buxbaum" <en***************@hotmail.com> wrote in message
news:bh*************@news.t-online.com...
Paul Hsieh wrote:

COBOL and Pascal (the other groups you crossposted this message to)
will decrease in usage over time, not increase. There is absolutely
no new serious development being done in either language. In 15
years, Pascal will probably be completely dead, and the COBOL
community will be reduced even from the size of today's community
(human mortality alone will guarantee this.)
This may be true for COBOL, but Pascal is very much alive and kicking,
in the form of Delphi/Kylix. I am currently writing Kylix software, most
of the cutting edge routines (that do the real work rather than the user
interface) are straight plug-ins of 15 year old Turbo-Pascal code. Now
with Borland going for cross-platform (Windozze/Unix) compatibility
there is no reason why Pascal should die in the foreseeable future.


There are 400,000,000 reasons why ALL procedural languages (including COBOL
and PASCAL) should "die" in the not-too-distant future. (I don't know your
definition of "foreseeable" but mine is around 20 years...)


.... and replaced by what?

In the early '80s there was hype about PROLOG: "The Japanese are working
with PROLOG and 10 years from now PROLOG will replace traditional procedural
computer languages completely." So, where is PROLOG today, 20 years later?

[snip a lot of interesting thoughts]
Bottom Line: Don't get smug about COBOL dying and PASCAL surviving; they are
on the same parachute and the ground is coming up....


Procedural languages will be there for a long time. The languages may be different,
but still use the same principle. Knowing how to program in this paradigm will still
be the entry key to programming those languages. The rest is syntactic
sugar (simplified).

--
Karl Heinz Buchegger
kb******@gascad.at
Jul 19 '05 #106

"Karl Heinz Buchegger" <kb******@gascad.at> wrote in message
news:3F***************@gascad.at...


"Peter E.C. Dashwood" wrote:

"Dr Engelbert Buxbaum" <en***************@hotmail.com> wrote in message
news:bh*************@news.t-online.com...
Paul Hsieh wrote:
> COBOL and Pascal (the other groups you crossposted this message to)
> will decrease in usage over time, not increase. There is absolutely
> no new serious development being done in either language. In 15
> years, Pascal will probably be completely dead, and the COBOL
> community will be reduced even from the size of today's community
> (human mortality alone will guarantee this.)

This may be true for COBOL, but Pascal is very much alive and kicking,
in the form of Delphi/Kylix. I am currently writing Kylix software, most of the cutting edge routines (that do the real work rather than the user interface) are straight plug-ins of 15 year old Turbo-Pascal code. Now
with Borland going for cross-platform (Windozze/Unix) compatibility
there is no reason why Pascal should die in the foreseeable future.
There are 400,000,000 reasons why ALL procedural languages (including COBOL and PASCAL) should "die" in the not-too-distant future. (I don't know your definition of "foreseeable" but mine is around 20 years...)


... and replaced by what?

In the early '80s there was hype about PROLOG: "The Japanese are working
with PROLOG and 10 years from now PROLOG will replace traditional
procedural computer languages completely." So, where is PROLOG today, 20
years later?

[snip a lot of interesting thoughts]
Yes, I remember the Japanese PROLOG push and the drive to develop the first
AI Operating System.

It certainly failed.

So did attempts to build a lacemaking machine in the late 18th century in
England. The received wisdom was that it was impossible because the process
of making lace was just too intricate.

It took countless attempts, ruined families, suicides, and 30 years, but the
machine is viewable today in the Lace museum in Nottingham.

To answer your very fair question (... and replaced by what?), I believe
that new methodologies for system development will arise in response to the
pressure from the Marketplace. I have already seen interesting departures
from traditional methods that achieved much faster results and were much
more flexible. The key to these approaches is a more RAD-like process with
iteration and interaction by users. Currently, we have programmers and
"Quick Build" tools in the loop, but it is only a matter of time before
smarter software will take on these functions. Eventually, end-users will
interact with smart software to achieve what they want, and there will be no
programmer in the loop at all.

There is far too much on this to go into here (sorry, I know that sounds
like a cop out, but I have been writing on this subject for some years now
and have been using alternative approaches in the real world in industry
with results that are very encouraging.), but I will close by saying that
everything I am saying is simply extrapolation from what is happening NOW. I
claim no psychic powers, just good observation and a lifetime of experience
in IT.
Bottom Line: Don't get smug about COBOL dying and PASCAL surviving; they are on the same parachute and the ground is coming up....
Procedural languages will be there for a long time. The languages may be
different, but still use the same principle. Knowing how to program in this
paradigm will still be the entry key to programming those languages. The
rest is syntactic sugar (simplified).


Well, time will tell...<G>

Pete.
Jul 19 '05 #107

On 12-Aug-2003, "Peter E.C. Dashwood" <da******@enternet.co.nz> wrote:
I wonder why your response is so vitriolic?
Really?
I didn't set out to attack you.
Agreed.
Could you be a little sensitive to the truth of what I'm saying?


Isn't that human nature - when the truth hurts?

-----
By far the most irrational character on Star Trek was the one who always was
amazed when people acted like people.
Jul 19 '05 #108

On 12-Aug-2003, "Peter E.C. Dashwood" <da******@enternet.co.nz> wrote:
Bottom Line: I think we can safely award you the "troll of the week"
award, with "don't panic" in nice friendly letters.


Well, I always enjoyed the Hitchhikers Guide to the Galaxy, but I have never
been a troll. You have no idea who you are dealing with <G>.


The set of trolls includes a large number of trouble makers. So we tend to
deny that what we're doing is trolling, when our purpose is noble.

But a statement designed to gain a response still qualifies. We need more
intelligent, useful trolling.
Jul 19 '05 #109
"Peter E.C. Dashwood" wrote:
There are 400,000,000 reasons why ALL procedural languages (including COBOL
and PASCAL) should "die" in the not-too-distant future. (I don't know your
definition of "foreseeable" but mine is around 20 years...)


Pascal is not any more purely procedural than C++. Last time I checked, C++
still had functions. If you want to insist that Pascal has not evolved since
1973, then you are going to insist on being wrong.

--
For most men, true happiness can only be achieved with a woman.
Also for most men, true happiness can only be achieved without a woman.
Sharp minds have noted that these two rules tend to conflict.....
Jul 19 '05 #110
On Tue, 12 Aug 2003 23:27:07 +1200, "Peter E.C. Dashwood"
<da******@enternet.co.nz> wrote or quoted :
Procedural code is already into Götterdämmerung. It takes too long, requires
too much skill, is too inflexible (the accelerating rate of change in the
Marketplace and in technology is another reason why it is doomed to
extinction) and, overall, costs far too much.


What other options do we have?

1. OO -- also requires considerable skill.

2. FORTH where you create a language for solving problems in a
particular domain. The users of the language just string words
together. Usually only a handful of people understand how it works
under the covers.

3. Spreadsheets, where the emphasis is on relationships, not on
precise order of computation. The complexity is added gradually with
real life data used to test at every stage.

4. wizards, where you configure some generic application into a custom
app.

5. query by example.

6. training neural nets.
Spreadsheet logic is the one with the lowest threshold of technology
required to integrate it into Java. It should be possible to write
generic apps, e.g. a retail sales package, and have the customer or
someone with minimal skill, customise it with bits of spreadsheet
logic.

--
Canadian Mind Products, Roedy Green.
Coaching, problem solving, economical contract programming.
See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Jul 19 '05 #111
On Tue, 12 Aug 2003 12:06:10 +0000 (UTC), Marco van de Voort
<ma****@toad.stack.nl> wrote or quoted :
If you studied programming
closer, you'd know that.


This is an interesting conversation. If you would spare the nasty
barbs it would also be enjoyable. Take your bitterness out on someone
who deserves it. I could suggest some politicians, but that would
start a flame war.
--
Canadian Mind Products, Roedy Green.
Coaching, problem solving, economical contract programming.
See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Jul 19 '05 #112
In article <3f********@news.athenanews.com>, Peter E.C. Dashwood wrote:

On the contrary, specially for these kinds of users, sequential jobs are a
way of thinking that is normal to them.

Well, Marco, I wonder how long it is since you looked?


Well, I think I went to work today.
Software tools are already emerging that substitute iteration and
interaction for sequential processes.
Sure, for certain limited domains, the actual engineering is done,
and there is a nice tool to customize that in several ways.

Useful? Certainly. Timesaver? sure. Potential to be universal? No way.
SQL Server for example has a "drag and drop" tool that allows processing
streams to be built in minutes.
I've used laboratory controlling software which was nice in a lot of ways
too. It was truly useful, productive and, IMHO most important,
it avoided a lot of errors.

But I would never claim that such a thing could be generalized and replace
software engineering. They are problem-domain specific solutions, nothing
more.
These same streams using procedural code would take days.
Sure. But that doesn't spell the end of procedural programming.
What's more, if you get it wrong you can simply go to a graphic interface
and change it.
I do that with sequential programming (Delphi) too. Tools for a specific
domain. It saves time (and equally important) makes the product somewhat
easier to maintain.
I have seen at least one Graphic design package that uses a
similar principle. Non computer literate designers can easily manipulate
these tools, interact with them, iterate their processes, until they
achieve what they want.
Within very simple, limited borders. There is not one such tool that
replaces general programming (regardless of which paradigm you use)
Programming knowledge is NOT a requirement.
It is. Those environments are extended using normal programming, tackling
large projects still needs skill.

Those tools are just that, extra tools. Pretty comparable to fancy
runtime (and later classes-) libraries. None of them spelled the end
of programming either.

Hmm, I think that is a good description. Some extra tools to aid software
development, and allow a _user_ some customization.
Currently, tools like this are in their infancy. In 15 years we can expect
significant improvement.
Well. Then exactly this is our point of disagreement. Please explain
why you think that will happen (and how you envision it): how do you get
from domain-specific solutions (math, laboratory handling, db handling)
to general-purpose programming?

This is also what annoyed me about your previous message. It is a message
of a believer. There is no reasoning behind it. You only "get" it when
you are a believer.
increased also. I sometimes doubt if increased computer literacy actually kept
up with the added computer tasks for the average person.


The computer skills of the Business are rising very rapidly.


IT skills in using their own applications. Not in programming and
customization. In that department it got worse, especially in companies with
highly skilled (non-IT) technical people.

In the old days any exact-sciences student or graduate could do some
general-purpose programming; usually they learned it so they could
program calculations. Nowadays they use Matlab, which is
absolutely great, but combined with faster machines it requires less
programming skill to achieve the same result.
Which is good for them, but not for the level of programming knowledge
in a company.
Yes, they want smug buzzword talking con-men to take advantage of them ?

No, they're getting pretty wise to that one too...in fact, most of us are.


If that were the case, I wouldn't get +/- 50-100 spams a day.

Exactly. So as long as my solution is good, and I can justify using a
language, what is the problem?

It isn't enough just to provide a solution; it has to be an acceptable
solution.

That means using tools and methods the users are comfortable with.
I don't see why that would be the case. It's like a car mechanic who has to
fix a car with the tools an average person has in his house.
In 15 years they WON'T be comfortable with some old academic cobbling code
together for a solution... By then they will have bypassed the need for
coding and will be implementing their own solutions.
Amen:-) Still a little thin on reasons though.
That was the whole point of my argument. They are doing it already... More
and more Business departments are gaining enough computer literacy to
build their own systems using standard solutions like spreadsheets and
databases.
That kind of limited hobbyism has always happened.
The last place I worked (a major utility in the Midlands of England) there
were more people in the Business with Computer Science degrees, than I had
on my IT staff.
Hmm. My former employer, which was IT related, had more chemists (including
me) than people with CS degrees.

There is no problem. I never suggested there was one. You can go ahead and
use procedural code for the rest of your natural life. (I intend to...). You
just won't make a lot of money at it. It'll become a "cottage industry" by
2020...<G>
A well. We have religious freedom :-)
> software until they get what they want.


Sure. The telepathic kinds.


The process of iteration, as you would know if you had ever worked in a RAD
environment, does not require telepathy.


Very interesting. Why wouldn't I have worked in a RAD environment? Telepathy
again?
Your scorn is misplaced. Interaction and iteration enable HUMAN intelligence
to get in the loop, but does NOT require specific technical (i.e.
programming) skill.
While I think in retrospect that my tone might have been misplaced,
your second post confirmed my suspicions. You have a firm belief in
something, and really want to advocate it. However, I don't find
much evidence, not even shallow evidence.

Except maybe that one story of a business department full of people
with CS degrees. Now that's surprising, that they were more likely
to get something working using minimal and standard tools :-)
Programming is what requires the skill. Not the language. If you studied
programming closer, you'd know that.


LOL! While I note that you are at a very reputable University (apparently
learning to use procedural code...)


I'm still an honorary member of my former university's computer club.
I can assure you I have studied programming for the whole of my working
life (some 38 years - I started programming computers in 1965.
What were YOU doing then <G>?) not behind
cloisters but in the real world. Leaving aside your intended slight, I agree
that programming does require skill, but it was you who turned my statement
into a separation between programming as a skill and programming as an art.
I said that "Procedural Coding" is in decline. That includes the Language
and the Art...
Yet, apart from a firm belief that ordinary users with a few standardised
tools will replace them, you don't reveal many reasons.
I wonder why your response is so vitriolic?
I didn't set out to attack you.
I've been on news for over a decade now. And on Fidonet before that. While
trolling and wild speculation presented as "truth" might seem innocent to
you, it doesn't to me. It poisons a group, and creates an unequal position
for discussion.

The problem is that you don't have to justify yourself in 15 years if you
are totally wrong, like you would in a company. It is nearly anonymous, easy
and safe, yet it still does damage.

(and btw, keep in mind that nearly all medium- to long-term IT forecasts
have been wrong till now)

As said before, maybe I was too harsh, yet I still stand behind the original
intentions behind that message.
Could you be a little sensitive to the truth of what I'm saying?
Please don't degrade to amateur psychology. It's seriously flawed enough
already.
And where are your references for that? You don't even say what it is up
against, except some vague references to software which is going to
emerge as a winner in 2015 (and which I assume is telepathic, at least
going by your description)

The typical response of the student.


Again a belittling comment. Try to argue with more substantial
arguments.
Are you saying that, without a
reference, you would question whether there is an accelerating rate of
change in computer technology?
No. I question whether it goes in the direction that you say it does. So not
IF there will be change, just whether it is going to be the change you
proclaim.
OK, Alvin Toffler, Moore's Law, and the fact that I have to get a new
computer every 18 months...
Relates to programming how?
As for my knowledge of the Market place, I have worked in industry IT
services all my life. It is axiomatic to me that the Business needs are
accelerating and greater flexibility in response to changing and new
Markets is required in IT today than was the case even 5 years ago. I
don't need a text book to tell me this; my users are drumming it into me
every day... I can SEE the need for flexibility in system design and
implementation.
Sure, but you practically argue that this will replace software engineering.
Thank goodness there are tools and systems that are addressing this need.
(Client/Server, distributed networks, OOD and OOP are all paradigms that
are much more flexible than the traditional mainframe Waterfall
methodology, and coincidentally, none of them is tied to Procedural
Coding...)
We'll see. I consider OOP to be procedural programming too btw.
My figures are based on a real case. The Company concerned sold their IT and
leased it back. They did this when they had a bad year due to claims for
floods and droughts.
That's organisational detail.
It is interesting that in the "good" years they took no
action. Try telling a Board of Directors faced with a huge cash flow
requirement, that "Price is not the only point of competition". Even if
you're right (and I don't disagree with the statement) you will not help
your career...
Mine is practice too: every time I argue on price, they come back with
"support" (which they never use, and won't get), "security" (better a large
than a small company), etc.
award, with
"don't panic" in nice friendly letters.


Well, I always enjoyed the Hitchhikers Guide to the Galaxy, but I have never
been a troll. You have no idea who you are dealing with <G>.


One of the joys of usenet:-)
Jul 19 '05 #113
jce
"Marco van de Voort" <ma****@stack.nl> wrote in message
news:sl*******************@toad.stack.nl...
In article <3f********@news.athenanews.com>, Peter E.C. Dashwood wrote:
Well, Marco, I wonder how long it is since you looked?

Well, I think I went to work today.

Did you ask people whether "sequential jobs are a way of thinking that is
normal to them" or are you telepathic too?
But I would never claim that such a thing could be generalized and replace
software engineering. They are problem-domain specific solutions, nothing
more.

Maybe not software engineering as it evolves...but as it exists now and in
parts, probably. Software will be around for a while...so there will be
software engineers.....but I was told my grandfather kept saying "machines
will build cars....humbug!".
Within very simple, limited borders. There is not one such tool that
replaces general programming (regardless of which paradigm you use)

But it will replace large sections of general programming....in every
paradigm. I haven't written a math library recently.....I don't write gui
components much...I don't write messaging software...I don't even have to
write my own storage/retrieval system...but I thought we were talking about
Software Engineering, which is not exactly programming, is it?
Currently, tools like this are in their infancy. In 15 years we can expect
significant improvement.

In 1903 they flew for the first time...in 1969 came the 747, and we are still
using the 747....Time brings improvement, but it only grows with demand or
reward in the risk-reward stakes. If there is no reward then no one will do
anything. The world is littered with tools - there are tens of commercial
vendors with profiling, generating, and visual-assistance tools, and yet not
one of them lets you get by without understanding what it is you are
doing...many are listed as pre-reqs for jobs because they aren't just <pick
em up and use em>.
Without uniform acceptance tools will improve but not replace software
skills. The skills may evolve and get better tuned or apt to deal with new
products - be faster, more flexible.....My car is essentially a souped up
model-T ;-)
The computer skills of the Business are rising very rapidly.

I believe in the escalator principle...
Tools get developed to replace a complex manual-type function......
Those doing the manual function are replaced by the more technical
developers who set up the automated process.
Tools get developed to replace the complex automated-type setup with a neat
gui tool.
Those setting up the complex automated-type function are replaced by the gui
experts who had special training....
ad infinitum...

The people at the bottom get off....the higher paid get on at the top and
ride down...
The key is to make the journey last until you're 55.
Yes, they want smug buzzword talking con-men to take advantage of them?

No, they're getting pretty wise to that one too...in fact, most of us are.

If that were the case, I wouldn't get +/- 50-100 spams a day.

And you read them? Or are you pretty wise to that one and delete them?
The last place I worked (a major utility in the Midlands of England) there
were more people in the Business with Computer Science degrees than I had on
my IT staff.

The escalator already started there then ....

While I think in retrospect that my tone might have been misplaced,
your second post confirmed my suspicions. You have a firm belief in
something, and really want to advocate that. However I don't find
much evidence, not even shallow evidence.

I don't find much in the way of evidence that you've presented *against* it
either.
The idea of a generalized tool is way out there (2015 isn't that long).
There are sure to be major inroads into large chunks of the IT industry. If
people latch onto a successful tool and it gains support then it could be
looked at in other areas. It depends on how the rich would benefit.
I've been on news for over a decade now. And on Fidonet before that. While
trolling and wild speculation presented as "truth" might seem innocent to
you, it doesn't to me. It poisons a group, and creates an unequal position
for discussion.

I don't see the Troll here. He provides way too much useful input to groups
to be a "troll". When you crosspost you get input from all the
groups...most people are too busy being useful contributors in *all* groups.
The problem is that you don't have to justify yourself in 15 years if you
are totally wrong, like you would in a company. It is nearly anonymous, easy and safe, yet it still does damage.

What damage does it do? More or less than offshoring...more or less than
war...more or less than Enron...more or less than pension decreases...more
or less than the rising cost of insurance...more or less than the wealthy
becoming more so.....It's an opinion he has. Let him share it. It's
interesting...we can discuss it and decide for ourselves if it's crap. I
don't need you to protect me from anything.
As said before, maybe I was too harsh, yet I still stand behind the original
intentions behind that message. Just be nicer about it...else you get *plonked* and no one hears you then.
Your perfectly valid and salient points become valid and silent.
Could you be a little sensitive to the truth of what I'm saying?

Please don't degrade to amateur psychology. It's seriously flawed enough
already.

He's not degrading to amateur psychology...he's telepathic, remember...:-)
The typical response of the student.

Again a belittling comment. Try to argue with more substantial
arguments.

You started it....ha ha
Are you saying that, without a
reference, you would question whether there is an accelerating rate of
change in computer technology?

No. I question if it goes in the direction that you say it is. So not
IF there will be change, just if it is going to be the change you

proclaim.
Ok - so your defense is that nothing has worked before? What if we put the
date to 2050...does that change anything? The only major flaw I see in his
argument is (a) the scale with which Peter sees this occurring - ubiquitous
and (b) the time frame....I have no evidence or justification for this
position.
As for my knowledge of the Market place, I have worked in industry IT
services all my life. It is axiomatic to me that the Business needs are
accelerating and greater flexibility in response to changing and new
Markets is required in IT today than was the case even 5 years ago. I
don't need a text book to tell me this; my users are drumming it into me
every day... I can SEE the need for flexibility in system design and
implementation.

Sure, but you practically argue that this will replace software

engineering.
It will replace LARGE aspects of software engineering....the need for PMs,
the RA role will change, therefore the SE position will have to go with it
and most of the tasks will be automated.
Mine is practice too, every time I argue on price, they come with "support"
(which they never use, and won't get), "security" (better large than
small company) etc.

That's organizational detail ;-)

JCE
Jul 19 '05 #114
Marco,

a good and fair response.

You have me down as a "believer"; I'm not. Neither am I trying to evangelise
ONE point of view.
(Been in this game too long...seen it all come and go...however, that does
not blind me to emerging trends and the fact that there is no requirement
for the future to be exactly like the past; because something failed in the
past doesn't mean it will not succeed the next time someone tries it (with
more knowledge and experience)).

I really don't mind if people disagree with what I'm saying. (At worst, the
ideas presented will have made them think; at best, they will have enjoyed
my post.)

But I am capable of extrapolating from observation and I have a track record
of being fairly right about it.

My comments are sincere but they are intended to stimulate, not to wound.
And if I am wrong at the end of the day, then I'll be embarrassed and glad
about it.

I am not seeking to "poison" this or any other group. The free exchange of
ideas (even where it is from "Trolls" who are seeking simply to "stir"
things) can only be beneficial to groups of people who have the intelligence
and vision to recognise what is important, and are capable of making their
own judgements on what is posted.

Unfortunately, the reasoning and observation behind my arguments are more
lengthy than can easily be accommodated in this particular forum. Also, my
comments are confined to commercial computer programming and not other
specialised areas of cyber development.(Like Chemical Engineering...<G>)

I had a look at your web site and see you are a proponent of Delphi and
PASCAL. (Both excellent languages and I have programmed in both of them,
although not extensively.) I guess this explains your reaction to my post.
It is not a comforting thought that the Languages we love have a limited
commercial lifetime, but that should not blind us to what is happening in
the Marketplace. (There are many COBOL programmers who are dismayed and
bewildered as they see the erosion of their traditional power base, too. My
advice has been to extend their skill set, but perhaps I should have said:
"Get an Accountancy or Business Management qualification...".)

The fact is that there are forces at work in the Marketplace that are
driving the "traditional" methods of developing commercial computer systems
into the ground. The Market wants computing "de-skilled" to the point where
end users can get the results they need without necessity for detailed
technical expertise. (My bet is that they will get it...). The Business
Functionality and the ability to support it in a rapidly changing
environment are paramount. Tools and Methods are emerging that have the
capability to deliver this within a reasonable (say, 15 years...) timeframe.

I respect your right to disagree, but I maintain my position.

Pete.

TOP POST - nothing further below here.

"Marco van de Voort" <ma****@stack.nl> wrote in message
news:sl*******************@toad.stack.nl...
In article <3f********@news.athenanews.com>, Peter E.C. Dashwood wrote:

On the contrary, especially for these kinds of users, sequential jobs are a way of thinking that is normal to them.

Well, Marco, I wonder how long it is since you looked?


Well, I think I went to work today.
Software tools are already emerging that substitute iteration and
interaction for sequential processes.


Sure, for certain limited domains, the actual engineering is done,
and there is a nice tool to customize that in several ways.

Useful? Certainly. Timesaver? sure. Potential to be universal? No way.
SQL Server for example has a "drag and drop" tool that allows processing
streams to be built in minutes.


I've used laboratory controlling software which was nice in a lot of ways
too. It was truly useful, productive and IMHO the most important,
it avoided a lot of errors.

But I would never claim that such a thing could be generalized and replace
software engineering. They are problem-domain specific solutions, nothing
more.
These same streams using procedural code would take days.


Sure. But that doesn't spell the end of procedural programming.
What's more, if you get it wrong you can simply go to a graphic interface
and change it.


I do that with sequential programming (Delphi) too. Tools for a specific
domain. It saves time (and equally important) makes the product somewhat
easier to maintain.
I have seen at least one Graphic design package that uses a
similar principle. Non computer literate designers can easily manipulate
these tools, interact with them, iterate their processes, until they
achieve what they want.


Within very simple, limited borders. There is not one such tool that
replaces general programming (regardless of which paradigm you use)
Programming knowledge is NOT a requirement.


It is. Those environments are extended using normal programming; tackling
large projects still needs skill.

Those tools are just that, extra tools. Pretty comparable to fancy
runtime (and later classes-) libraries. None of them spelled the end
of programming either.

Hmm, I think that is a good description. Some extra tools to aid software
development, and allow a _user_ some customization.
Currently, tools like this are in their infancy. In 15 years we can expect significant improvement.


Well. Then exactly this is our point of disagreement. Please explain
why you think (and how you envision it will happen) that you can get
from domain-specific solutions (math, laboratory handling, db handling)
to general-purpose programming.

This is also what annoyed me about your previous message. It is a message
of a believer. There is no reasoning behind it. You only "get" it when
you are a believer.
increased also. I sometimes doubt if increased computer literacy actually
kept
up with the added computer tasks for the avg person.


The computer skills of the Business are rising very rapidly.


IT skills in using their own applications. Not in programming and
customization. In that department it got worse, especially in companies

with highly skilled (non-IT) technical people.

In the old days any beta-sciences student or graduate could do some
general purpose programming, usually they learned it so they could
program calculations. Nowadays they use Matlab, which is
absolutely great, but combined with faster machines requires less
programming skill to achieve the same result.
Which is good for them, but not for the level of programming knowledge
in a company.
Yes, they want smug buzzword talking con-men to take advantage of them?
No, they're getting pretty wise to that one too...in fact, most of us are.
If that were the case, I wouldn't get +/- 50-100 spams a day.

Exactly. So as long as my solution is good, and I can justify using a language,
what is the problem?

It isn't enough just to provide a solution; it has to be an acceptable
solution.

That means using tools and methods the users are comfortable with.


I don't see why that would be the case? It's like a car mechanic who has

to fix a car with the tools an average person has in his house.
In 15 years they WON'T be comfortable with some old academic cobbling code together for a solution... By then they will have bypassed the need for
coding and will be implementing their own solutions.
Amen:-) Still a little thin on reasons though.
That was the whole point of my argument. They are doing it already... More and more Business departments are gaining enough computer literacy to
build their own systems using standard solutions like spreadsheets and
databases.


That kind of limited hobbying always has happened.
The last place I worked (a major utility in the Midlands of England) there were more people in the Business with Computer Science degrees, than I had on my IT staff.


Hmm. My former employer, which was IT related, had more chemists

(including me) than people with CS degrees.

There is no problem. I never suggested there was one. You can go ahead and use procedural code for the rest of your natural life. (I intend to...). You just won't make a lot of money at it. It'll become a "cottage industry" by 2020...<G>
A well. We have religious freedom :-)
software until they get what they want.

Sure. The telepathic kinds.


The process of iteration, as you would know if you had ever worked in a RAD
environment, does not require telepathy.


Very interesting. Why wouldn't I have worked in a RAD environment? Telepathy
again?
Your scorn is misplaced. Interaction and iteration enable HUMAN intelligence to get in the loop, but does NOT require specific technical (i.e.
programming) skill.


While I think in retrospect that my tone might have been misplaced,
your second post confirmed my suspicions. You have a firm belief in
something, and really want to advocate that. However I don't find
much evidence, not even shallow evidence.

Except maybe that one story of a business department full of people
with CS degrees. Now that's surprising that they were more likely
to get something working using minimal and standard tools :-)
Programming is what requires the skill. Not the language. If you
studied programming closer, you'd know that.


LOL! While I note that you are at a very reputable University (apparently learning to use procedural code...)


I'm still honorary member of my former universities computer club.
I can assure you I have studied programming for the whole of my working
life (some 38 years - I started programming computers in 1965.
What were YOU doing then <G>?) not behind
cloisters but in the real world.

Leaving aside your intended slight, I agree
that programming does require skill, but it was you who turned my statement into a separation between programming as a skill and programming as an art. I said that "Procedural Coding" is in decline. That includes the Language and the Art...


Yet, except a firm belief that ordinary users with a few standardised
tools will replace them, you don't reveal many reasons.
I wonder why your response is so vitriolic?
I didn't set out to attack you.


I've been on news for over a decade now. And on Fidonet before that. While
trolling and wild speculation presented as "truth" might seem innocent to
you, it doesn't to me. It poisons a group, and creates an unequal position
for discussion.

The problem is that you don't have to justify yourself in 15 years if you
are totally wrong, like you would in a company. It is nearly anonymous,

easy and safe, yet it still does damage.

(and btw, keep in mind nearly all medium-to-long-term IT forecasts have been
wrong till now)

As said before, maybe I was too harsh, yet I still stand behind the original
intentions behind that message.
Could you be a little sensitive to the truth of what I'm saying?
Please don't degrade to amateur psychology. It's seriously flawed enough
already.
And where are your references for that? You don't even say what it is up
against, except some vague references about software which is going to
emerge as a winner in 2015 (and which I assume is telepathic, at least
if I see your description).

The typical response of the student.


Again a belittling comment. Try to argue with more substantial
arguments.
Are you saying that, without a
reference, you would question whether there is an accelerating rate of
change in computer technology?


No. I question if it goes in the direction that you say it is. So not
IF there will be change, just if it is going to be the change you

proclaim.
OK, Alvin Toffler, Moore's Law, and the fact that I have to get a new
computer every 18 months...
Relates to programming how?
As for my knowledge of the Market place, I have worked in industry IT
services all my life. It is axiomatic to me that the Business needs are
accelerating and greater flexibility in response to changing and new
Markets is required in IT today than was the case even 5 years ago. I
don't need a text book to tell me this; my users are drumming it into me
every day... I can SEE the need for flexibility in system design and
implementation.


Sure, but you practically argue that this will replace software

engineering.
Thank goodness there are tools and systems that are addressing this need. (Client/Server, distributed networks, OOD and OOP are all paradigms that
are much more flexible than the traditional mainframe Waterfall
methodology, and coincidentally, none of them is tied to Procedural
Coding...)


We'll see. I consider OOP to be procedural programming too btw.
My figures are based on a real case. The Company concerned sold their IT and leased it back. They did this when they had a bad year due to claims for
floods and droughts.


That's organisational detail.
It is interesting that in the "good" years they took no
action. Try telling a Board of Directors faced with a huge cash flow
requirement, that "Price is not the only point of competition". Even if
you're right (and I don't disagree with the statement) you will not help
your career...


Mine is practice too, every time I argue on price, they come with "support"
(which they never use, and won't get), "security" (better large than
small company) etc.
award, with
"don't panic" in nice friendly letters.


Well, I always enjoyed the Hitchhiker's Guide to the Galaxy, but I have never been a troll. You have no idea who you are dealing with <G>.


One of the joys of usenet:-)

Jul 19 '05 #115
Don't forget Java reflection. It is possible to pull out a function/method
signature from the binary.
Java and C# include reflection and are therefore self-contained.
Both Java and C# binary files are targeted at a virtual machine.

I think for reuse of code, OO based Java or C# code will be the first
option.
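
As a small hedged illustration of the point above (the class name
SignatureDump and the choice of java.lang.String as the inspected class are
my own, not from this post): Java's reflection API can recover a method's
full signature from the compiled class file alone, with no source code at
hand.

```java
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.stream.Collectors;

// Sketch: recover method signatures from a compiled class via reflection.
// No source code is needed -- everything comes from the binary's metadata.
public class SignatureDump {

    // Render one Method as "ReturnType name(ParamType, ParamType)".
    static String describe(Method m) {
        String params = Arrays.stream(m.getParameterTypes())
                .map(Class::getSimpleName)
                .collect(Collectors.joining(", "));
        return m.getReturnType().getSimpleName()
                + " " + m.getName() + "(" + params + ")";
    }

    public static void main(String[] args) throws Exception {
        // java.lang.String is used here only because it is always present;
        // any class name on the classpath would do.
        Class<?> c = Class.forName("java.lang.String");
        for (Method m : c.getMethods()) {
            System.out.println(describe(m));
        }
    }
}
```

Running it lists every public method of String, one signature per line,
which is exactly the kind of self-description that makes reuse of a Java
binary easier than reuse of a bare native object file.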

Thomas

"Roedy Green" <ro***@mindprod.com> wrote in message
news:lu********************************@4ax.com...
On 3 Aug 2003 19:02:14 -0700, ja***********@bindereng.com.au (James
Cameron) wrote or quoted :
I suspect something like C would be the best based
on comments


C is already disappearing. For longevity you want to pick something
that is popular, and that is rising in popularity.

Java is a pretty safe bet. Even if it dies the code is quite vanilla
and should be easy to port to whatever replaces it.

--
Canadian Mind Products, Roedy Green.
Coaching, problem solving, economical contract programming.
See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.

Jul 19 '05 #116
> The fact is that there are forces at work in the Marketplace that are
driving the "traditional" methods of developing commercial computer systems into the ground. The Market wants computing "de-skilled" to the point where end users can get the results they need without necessity for detailed
technical expertise. (My bet is that they will get it...). The Business
Functionality and the ability to support it in a rapidly changing
environment are paramount. Tools and Methods are emerging that have the
capability to deliver this within a reasonable (say, 15 years...) timeframe.
I respect your right to disagree, but I maintain my position.

Pete.

I agree those market forces are busy. But being quite new to the traditional
mainframe COBOL/SE business (6 years), and also programming in VB.NET (1
year), I believe that achieving that goal will be very difficult. It means
users will need to be skilled at their usual job and also able to configure
their IT tools. I've worked in seven companies and I can't imagine them
doing that now. Perhaps in 15 years but it'll require a new approach on
training.
Btw, I've got that accountancy and business degree and moved on to IT (my
hobby since the days of the C64 and the Amiga). My last projects were not
maintenance and/or change projects, but completely new applications
developed in COBOL II running under CICS. The one I'm applying for within 2
hours is also completely new. The company is a world leader in its
business.

Georgie.
Jul 19 '05 #117

"Georgie" <kr*****@yahoo.com> wrote in message
news:3f*********************@reader0.news.skynet.b e...
The fact is that there are forces at work in the Marketplace that are
driving the "traditional" methods of developing commercial computer systems
into the ground. The Market wants computing "de-skilled" to the point

where
end users can get the results they need without necessity for detailed
technical expertise. (My bet is that they will get it...). The Business
Functionality and the ability to support it in a rapidly changing
environment are paramount. Tools and Methods are emerging that have the
capability to deliver this within a reasonable (say, 15 years...)

timeframe.

I respect your right to disagree, but I maintain my position.

Pete.

I agree those market forces are busy. But being quite new to the

traditional Mainframe Cobol, SE business (6 years) and also programming in VB.NET(1
year). I believe that to achieve that goal will be very difficult.
Yes, it will. However, we have been working on it for nearly 50 years now...

(The fundamental goal of commercial computing has been to have computers
that are capable of "understanding" Business needs and meeting them, in a
manner that would enable a Business User (or Users) to identify and design
the system and "explain" what is needed to the computer, in as simple a
manner as possible, without need for in depth technical skills. It is
interesting to me that COBOL was one of the first attempts to achieve this,
with the Conference on Data Systems Languages in 1959 even forgoing
commercial advantage on the part of some of the contributors, for the
greater good of the Business community. This is why it rankles me so much
that the Language has since been hijacked by ANSI for commercial gain,
despite the protestations that this is a non-profit organization...Don't
start me...<G>)

There are indications that it CAN be achieved. However, you are correct that
it will be difficult and many Mainframe COBOL sites will be dragged kicking
and screaming into it (or will find themselves outsourced to India...) We
had some interesting threads recently in comp.lang.cobol where the reasons
for what I call "Fortress COBOL" were explored. Adoption of new technology
is probably hardest on the mainframe sites, where there is a very long
tradition of doing things a certain way. (The fact that this way has NEVER
worked satisfactorily, has left Users disappointed and disheartened with IT,
and that there are now better ways, seems to be lost on some IT
departments...)

If you would like to see the background for my thoughts on this please take
this link:

www.aboutlegacycoding.com/Archives/V3/V30501.asp

It means
users will need to be skilled at their usual job and also able to configure their IT tools.
Well, I see it as our job (as IT professionals) to make sure the tools are
so user friendly, the Users can concentrate on their usual job without
having to become "computer programmers" (It would not be possible for the
expanding User base to all become computer programmers anyway, even if they
had the inclination to, which they don't...)

I guess what I'm saying is that if we do our job properly, we won't have a
job in 15 years...<G>

Actually, the nature of our work will change so that it isn't strictly
true, but the elements of truth are there...

I've worked in seven companies and I can't imagine them
doing that now. Perhaps in 15 years but it'll require a new approach on
training.
Yes, absolutely. There are some very innovative approaches to training and
self-education in the pipeline. The advent of technologies like DVD and its
interactive extensions will certainly change self-education.
Btw, I've got that accountancy and business degree and moved on to IT (my
hobby since the days of the C64 and the Amiga). My last projects were not
maintenance and/or change projects, but completely new applications
developed in COBOL II running under CICS. The one I'm applying for within 2 hours is also completely new. The company is a world leader in it's
business.

Glad to hear you are productively employed and enjoying it, George. Hope it
stays that way for you. It looks like you have a "fall back" position
already established if it comes to it. A wise move...

Pete.
Jul 19 '05 #118
In article <3f********@news.athenanews.com>,
Peter E.C. Dashwood <da******@enternet.co.nz> wrote:

[snip]
(The fundamental goal of commercial computing has been to have computers
that are capable of "understanding" Business needs and meeting them, in a
manner that would enable a Business User (or Users) to identify and design
the system and "explain" what is needed to the computer, in as simple a
manner as possible, without need for in depth technical skills.


This, to me, screams for the implementation of the DWIM (Do What I Mean)
command.

DD
Jul 19 '05 #119
do******@panix.com wrote:
In article <3f********@news.athenanews.com>,
Peter E.C. Dashwood <da******@enternet.co.nz> wrote:

[snip]
(The fundamental goal of commercial computing has been to have computers
that are capable of "understanding" Business needs and meeting them, in a
manner that would enable a Business User (or Users) to identify and design
the system and "explain" what is needed to the computer, in as simple a
manner as possible, without need for in depth technical skills.


This, to me, screams for the implementation of the DWIM (Do What I Mean)
command.


And this immediately leads to the need of the DWI_S_M command. :-)

Jirka

Jul 19 '05 #120

"Karl Heinz Buchegger" <kb******@gascad.at> wrote in message
news:3F***************@gascad.at...


"Peter E.C. Dashwood" wrote:

To answer your very fair question (... and replaced by what?), I believe
that new methodologies for system development will arise in response to the pressure from the Marketplace. I have already seen interesting departures from traditional methods that achieved much faster results and were much
more flexible. The key to these approaches is a more RAD like process with iteration and interaction by users. Currently, we have programmers and
"Quick Build" tools in the loop, but it is only a matter of time before
smarter software will take on these functions. Eventually, end-users will interact with smart software to achieve what they want, and there will be no programmer in the loop at all.

There is far too much on this to go into here (sorry, I know that sounds
like a cop out, but I have been writing on this subject for some years now and have been using alternative approaches in the real world in industry
with results that are very encouraging.), but I will close by saying that everything I am saying is simply extrapolation from what is happening NOW. I claim no psychic powers, just good observation and a lifetime of experience in IT.
Do you have some links on that subject?


Here are just two articles which give some insight into the technology which
I believe will enable the Future I have been describing:

www.aboutlegacycoding.com/Archives/V3/V30501.asp (This is about what's wrong
with what we do now, in terms of development methodology...It was the
attempt to break out of this way of working which suggested to me how things
might work in the future.)

www.aboutlegacycoding.com/Archives/V3/V30201.asp (This is about the
direction I see programming technology going in. The emerging component
based systems will provide the basis for the User interaction I have been
postulating. Components are platform independent, small, with encapsulated
functionality and consistent and robust interfaces. These attributes are
just what is needed to respond to rapid change and provide flexibility.)

I'm sorry, these are links to just two of the articles I have written which
are pertinent. I don't normally promote my own work, but you did ask for
some links and these will at least give you an idea of where I'm coming
from.

I am personally interested in that subject. As you say: the old, traditional textual representations of procedural programming is something which requires skills, skills we cannot expect from ordinary users. I have thought about
some sort of graphical programming. The problem seems to be: It is relatively easy to come up with such a thing for a specific topic (eg. image manipulation), but as I see it, it is hard to generalize this to general programming.

Yes, that's right. There are models where we can achieve a general result if
we use a Human in the loop. (Basically, the Human is providing the
intelligence and discernment to decide whether a given proposed result is
acceptable or not.) By the key processes of ITERATION and INTERACTION, a
better and better solution can be arrived at. (If you like, the solution is
"evolved" towards...this varies markedly from the "traditional" IT approach
where the solution is "designed" from scratch, then built.)

Consider this...

Small Businessman goes to the "computer shop" and purchases a "small
business computer".

(Let's assume he is a very competent Businessman and knows his trade
extremely well, but he isn't big on computer technology, apart from the
basic "computer literacy" that children are growing up with today...).

He gets home, unpacks the computer, uses an interactive DVD (or similar) to
connect everything up, and switches on.

B'man: "Print me an invoice."
Computer: "What's an invoice?".
(This is a contrived example because the machine would certainly "know" what
an invoice is...but bear with me a little longer...)
B'man: "An invoice is a document that records the details of a sale
transaction."
Computer: "Ah, I know about Sales. That is a transaction where a CUSTOMER
purchases a PRODUCT."
(The machine comes equipped with the "concept" of a CUSTOMER and a PRODUCT,
among others. It also recognises the transactions we would expect to be
associated with a small business.)
Computer: "So you will want details of the CUSTOMER and the PRODUCT on
this INVOICE?"
B'Man: "Yes."
Computer: "Like this...?"
(It produces a document...)
B'Man: "Yes, but I need to see how many and what the unit price was.
Then calculate the total and add Sales Tax."
Computer: "How's this...?"
B'Man: "That's right. Put the totals for each product in a separate
column. And move the name and address details to here." (He indicates on the
screen with his finger or a pointing device.) "Print the overall total
payable in blue."
Computer: "Like this..?"
B'Man: "Yes, exactly like that."

There are some assumptions in the above whimsy (the computer has certain
inbuilt "concepts" - [you could think of this as a set of components with
all the attributes and Methods of a CUSTOMER, for instance], a natural
language interface, a "test" database for each of its concepts, however, all
of these things are possible with today's technology, and what is currently
"bleeding edge" will be passe in 15 years...), but I don't think anyone with
a programming background would say it was "impossible" or "infeasible".
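
To make the "inbuilt concepts" assumption a little more concrete, here is a
purely hypothetical sketch in Java: CUSTOMER and PRODUCT as prepackaged
components, with an INVOICE composed from them rather than coded from
scratch. All names and the tax rate are my own illustration, not from any
real product described in this thread.

```java
import java.util.List;
import java.util.Locale;

// Hypothetical "inbuilt concepts": ready-made CUSTOMER and PRODUCT
// components, from which an INVOICE document can be composed on request.
record Customer(String name, String address) {}

record Product(String description, int quantity, double unitPrice) {}

class Invoice {
    static final double SALES_TAX = 0.10; // illustrative rate only

    // Compose the invoice document from the two underlying concepts.
    static String render(Customer c, List<Product> items) {
        StringBuilder sb = new StringBuilder("INVOICE\n")
                .append(c.name()).append(", ").append(c.address()).append('\n');
        double total = 0.0;
        for (Product p : items) {
            double line = p.quantity() * p.unitPrice();
            total += line;
            sb.append(String.format(Locale.US, "%-20s %3d @ %8.2f = %10.2f%n",
                    p.description(), p.quantity(), p.unitPrice(), line));
        }
        sb.append(String.format(Locale.US,
                "TOTAL payable (incl. tax): %10.2f%n", total * (1 + SALES_TAX)));
        return sb.toString();
    }
}
```

The point of the sketch is only that the businessman's requests ("show the
quantity and unit price, then add Sales Tax") map onto reshuffling and
reformatting prebuilt components, not onto writing procedural code.
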

Of course this demonstrates a solution to just one class of problem. There
are many others. But there are many other solutions also...

The bottom line is that "general" solutions ARE obtainable (note that in the
example above, we never know what particular "business" our "Businessman" is
in...the solution works for ALL small businesses, or can be "tailored"
easily to accommodate exceptions to the "norm".)

The keys to success in this are Interaction and Iteration. The Human
provides the "intelligence".(It is really "discernment"...)

But a time will come when the software will be capable of evaluating its own
results, matching them against stated requirements, rebuilding an entire
system in seconds to ensure that what is required is delivered. (Then doing
it all again, without complaint, when the User changes his mind <G>).

"Evolved" systems show every indication of being at least as good as
designed ones.

And they don't require computer skills on the part of the User.

Pete.
Jul 19 '05 #121

<do******@panix.com> wrote in message news:bh**********@panix1.panix.com...
In article <3f********@news.athenanews.com>,
Peter E.C. Dashwood <da******@enternet.co.nz> wrote:

[snip]
(The fundamental goal of commercial computing has been to have computers
that are capable of "understanding" Business needs and meeting them, in a
manner that would enable a Business User (or Users) to identify and design the system and "explain" what is needed to the computer, in as simple a
manner as possible, without need for in depth technical skills.


This, to me, screams for the implementation for the DWIM (Do What I Mean)
command.


<G> It certainly would, Doc, were it not for the Interaction and Iteration
which is fundamental to achieving it.
(note the inclusion of the word "explain"...this is an iterative and
interactive process, rather than a "Do what I want/mean" process...)

Pete.
Jul 19 '05 #122


"Peter E.C. Dashwood" wrote:
Do you have some links on that subject?
www.aboutlegacycoding.com/Archives/V3/V30501.asp
www.aboutlegacycoding.com/Archives/V3/V30201.asp

I'm sorry, these are links to just two of the articles I have written which
are pertinent. I don't normally promote my own work, but you did ask for
some links and these will at least give you an idea of where I'm coming
from.


No problem.
The links are fine.
[snip example]
There are some assumptions in the above whimsy (the computer has certain
inbuilt "concepts" - [you could think of this as a set of components with
all the attributes and Methods of a CUSTOMER, for instance], a natural
language interface, a "test" database for each of its concepts, however, all
of these things are possible with today's technology, and what is currently
"bleeding edge" will be passe in 15 years...), but I don't think anyone with
a programming background would say it was "impossible" or "infeasible".
That reminds me of 'SHRDLU' :-)

Of course this demonstrates a solution to just one class of problem. There
are many others. But there are many other solutions also...


Ever read D. Hofstadter - Goedel, Escher, Bach (I guess you did)
The augmented semantic network and its size seems to be one of the
big problems with this.

Anyway: We have wandered way off topic, but you gave me a hint to leave
my small little programming world (which concentrates around 3D graphics)
and start looking over the fence again.

Thanks for sharing your thoughts.

--
Karl Heinz Buchegger
kb******@gascad.at
Jul 19 '05 #123
Joe Zitzelberger <jo**************@nospam.com> wrote in message news:<jo************************************@corp.supernews.com>...
In article <ff**************************@posting.google.com >,
ru**@webmail.co.za (goose) wrote:
and yet creating a std C program would not only get you that, it would also
get you a fairly snappy application *and* leave you open in the future
to be able to support those people who have machines that are not
capable of running java (certain designer palmtop-types) to *also*
interface with the campus machines.

java doesn't *buy* you anything extra in terms of portability.
The only relatively *portable* way I can think of is when writing
applets for web-pages (note: /relatively/). as long as the browser
has a java runtime environment, of course.

Java does have its advantages. Portability isn't one of them.
???Huh???

Which sort of Java isn't portable? I get binary compatibility on all
desktop/server/enterprise machines and many embedded as well.


a couple of things:
1. my std C programs have portability on *ALL* the platforms
that java has *AND* *MORE*.
2. name the embedded devices.

C compilers are available for all the platforms that java runs on,
but java is only available for platforms that are big enough to run
it.

bottom line: code written in java runs on less platforms than code
written in C.

next time, try thinking *before* you post !
If that
fails (it never has) I get source level compatibility (the compiler is
written in java after all...) across all the platforms.
no you dont, you only got a few platforms. you can count the number
of java platforms without going into a 4 digit number ... minuscule.
C is ubiquitous enough to assume that if you got a digital machine,
you can get a C compiler for it.

if all you have is java experience, you have to look at each machine
and decide whether or not to use it for your project *before* you
even decide if it meets your *other* requirements.

Now it might not make any sense for me to try and open a dialog box on a
stoplight, and the stoplight manufacturer might well leave those
libraries out, but that hardly makes it non-portable.
tell me, how does one get as stupid as you did ?
write me a crc8 algorithm *in* *java* that I can reuse for the
stoplight!!!

I can write the same thing in C and use it on everything from
a stoplight to a cray.
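To make that concrete, here is a minimal CRC-8 sketch, assuming the common polynomial 0x07 with zero initial value (the post names no particular variant). It uses only plain C integer operations, so the same source should compile for an 8-bit microcontroller or a Cray:

```c
/* Hypothetical portable CRC-8 sketch: polynomial 0x07, init 0x00,
   MSB first, no reflection. Plain C89 integer ops only. */
unsigned char crc8(const unsigned char *data, unsigned long len)
{
    unsigned char crc = 0x00;
    unsigned long i;
    int bit;

    for (i = 0; i < len; i++) {
        crc ^= data[i];                       /* fold in next byte */
        for (bit = 0; bit < 8; bit++) {
            if (crc & 0x80)
                crc = (unsigned char)(((crc << 1) ^ 0x07) & 0xFF);
            else
                crc = (unsigned char)((crc << 1) & 0xFF);
        }
    }
    return crc;
}
```

With these parameters the standard check input "123456789" yields 0xF4.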

All of this talks of applications, not applets which were a cute, but
useless toy.
sigh.

Have you ever actually tried porting an application to another
hardware/os using std C?
not only have I *done* (not "try") this before, I found that the
number of changes /needed/ to be done to get it going on the new platform
were few.

now, if you wanted to run java code on one of the many platforms that
cannot run java, you have to rewrite.
It is not just a recompile, there are plenty
of issues the original programmers must have planned for -- and they
usually don't.


so you *assume* they dont ???

your answer to me pointing out that java is rather limited
is "C programmers usually write badly" ???

you might double your IQ if you grew a second brain cell

goose,
insulting? surely you're too stupid to notice?
Jul 19 '05 #124
Just to split English hairs...

goose wrote:
<snip>

C compilers are available for all the platforms that java runs on,
but java is only available for platforms that are big enough to run
it.

bottom line: code written in java runs on less platforms than code
written in C.
Because of its demands, Java runs on /fewer/ platforms than code written
in C. Because of C's lower requirements it can run on lesser platforms
(less CPU and memory).

I think my 12th grade English teacher would be proud. ;-)


--
..tom
remove email address' dashes for replies
opensource middleware at <http://isectd.sourceforge.net>
http://gagne.homedns.org
Jul 19 '05 #125
goose wrote:

but java is only available for platforms that are big enough to run
it.


big like my mobile phone?

Jul 19 '05 #126
One of the comments that I have frequently made in response to Peter's posts
in comp.lang.cobol (and which I will now share with the other newsgroups on
this list) is that IF (and I have mixed opinions on this) the "future" of
business (particularly) computer programming moves more and more into
"component - mix-and-build / drag-and-drop type" end user application design
and DEVELOPMENT, that it will STILL require "someone" at the other end of
the new (and improved) tools doing the "lower-level" programming that
provides such tools (and components and libraries).

I can well imagine the current "trend" away from every medium-to-large
shop/business/company from having their own (semi-)large programming staff
to using a combination of off-the-shelf (but customizable) applications and
end-user available "tools". However, both of these require SOMEONE
somewhere designing/programming the tools themselves.

Whether this programming is done in OO, "procedural", waterfall, or whatever
type environments, is not something that my "crystal ball" tells me (yet).

If anything, my GUESS is that the "end users" of such tools will be LESS
tolerant of "iterative find a mistake and fix it" applications than users of
in-house developed applications. This means that I can see the "return" of
more strictly enforced "sign-off" of design, testing, development phases of
the TOOLS and components that are delivered to "purchasers".

--
Bill Klein
wmklein <at> ix.netcom.com
"Peter E.C. Dashwood" <da******@enternet.co.nz> wrote in message
news:3f********@news.athenanews.com...
Marco,

a good and fair response.

<much snippage>
Jul 19 '05 #127
In message <bh**********@panix1.panix.com>, do******@panix.com writes
In article <3f********@news.athenanews.com>,
Peter E.C. Dashwood <da******@enternet.co.nz> wrote:

[snip]
(The fundamental goal of commercial computing has been to have computers
that are capable of "understanding" Business needs and meeting them, in a
manner that would enable a Business User (or Users) to identify and design
the system and "explain" what is needed to the computer, in as simple a
manner as possible, without need for in depth technical skills.


This, to me, screams for the implementation for the DWIM (Do What I Mean)
command.

DD


Unfortunately Doc, the DWIM command would not be adequate. I have worked
for users where they complained about what they were given because,
although it met their stated requirements and was exactly as they had
asked, it was not what they needed to do their jobs. How about a GWIN
command (gimme wot i need)?

--
Alistair Maclean
Jul 19 '05 #128
In article <vj************@corp.supernews.com>, Bat Guano wrote:
goose wrote:

but java is only available for platforms that are big enough to run
it.

Or that have hardware support
big like my mobile phone?


Is it 1 MHz with 2k RAM?
Jul 19 '05 #129
Marco van de Voort wrote:
In article <vj************@corp.supernews.com>, Bat Guano wrote:
goose wrote:
but java is only available for platforms that are big enough to run
it.

Or that have hardware support

big like my mobile phone?

Is it 1 MHz with 2k RAM?


I dont know, but it slips into my pocket easily

Jul 19 '05 #130
In article <3f********@news.athenanews.com>,
Peter E.C. Dashwood <da******@enternet.co.nz> wrote:

<do******@panix.com> wrote in message news:bh**********@panix1.panix.com...
In article <3f********@news.athenanews.com>,
Peter E.C. Dashwood <da******@enternet.co.nz> wrote:

[snip]
>(The fundamental goal of commercial computing has been to have computers
>that are capable of "understanding" Business needs and meeting them, in a
>manner that would enable a Business User (or Users) to identify and design
>the system and "explain" what is needed to the computer, in as simple a
>manner as possible, without need for in depth technical skills.


This, to me, screams for the implementation for the DWIM (Do What I Mean)
command.


<G> It certainly would, Doc, were it not for the Interaction and Iteration
which is fundamental to achieving it.
(note the inclusion of the word "explain"...this is an iterative and
interactive process, rather than a "Do what I want/mean" process...)


It would appear that if Meaning could be translated into action by a
single, simple DWIM command then that single command would constitute the
Interaction and no further Iterations would be necessary as what was Meant
would be Done.

DD

Jul 19 '05 #131
<do******@panix.com> wrote in message news:bh**********@panix1.panix.com...
In article <3f********@news.athenanews.com>,

<snip>
<G> It certainly would, Doc, were it not for the Interaction and Iteration
which is fundamental to achieving it.
(note the inclusion of the word "explain"...this is an iterative and
interactive process, rather than a "Do what I want/mean" process...)


It would appear that if Meaning could be translated into action by a
single, simple DWIM command then that single command would constitute the
Interaction and no further Iterations would be necessary as what was Meant
would be Done.

DD


And on the 8th day, it was said,

"Let there be profits,
and lo,

all the business were profitable,
and all the applications worked as desired,
and all the user support personnel were helpful
and ..."

--
Bill Klein
wmklein <at> ix.netcom.com
Jul 19 '05 #132
In article <vj************@corp.supernews.com>, Bat Guano wrote:
goose wrote:

but java is only available for platforms that are big enough to run
it.


There is Waba, which is a JVM for smaller machines such as Palm and CE PDAs.
There is even a version for MS-DOS, one for GameBoy, and a version for TI
Calculators.

(see http://www.wabasoft.com )

Jul 19 '05 #133

"William M. Klein" <wm*****@nospam.netcom.com> wrote in message
news:pv*******************@newsread1.news.atl.earthlink.net...
One of the comments that I have frequently made in response to Peter's posts in comp.lang.cobol (and which I will now share with the other newsgroups on this list) is that IF (and I have mixed opinions on this) the "future" of
business (particularly) computer programming moves more and more into
"component - mix-and-build / drag-and-drop type" end user application design and DEVELOPMENT, that it will STILL require "someone" at the other end of
the new (and improved) tools doing the "lower-level" programming that
provides such tools (and components and libraries).

It is normal and natural for programmers to see things from a programming
perspective.

What I am suggesting is bigger than that.

I foresee a time when there will be no need for the "lower level"
programming you are talking about, Bill.

The job of building the tools will be complete. And "smart software" will do
the enhancements to it.

There are already specialised tools and wizards to do specific jobs.
(Network management, for instance). Programmers are not redeveloping these
tools; you could argue that they are "complete". They work and do the job
they are designed for. In fact, it would not be desirable to have
programmers "fiddling" with them.

One of the points often overlooked about component based systems is that
these components encapsulate EXPERTISE as well as functionality. I have
purchased components to provide specific functionality, initially to save me
the time of writing it myself, then found that the component provided
Methods I would never have dreamed of (in addition to the ones I needed),
because the programmer who wrote it had many years of EXPERTISE in that
particular area and was aware of things I couldn't be aware of unless I had
also spent years in a particular niche.

I only have one lifetime and I cannot be an expert in everything. Sooner or
later it is necessary to trust someone else's experience.

Your assumption that there will ALWAYS be a requirement for "low level"
programmers to keep maintaining tools is, at best, arguable, at worst, just
dead wrong.

I can well imagine the current "trend" away from every medium-to-large
shop/business/company from having their own (semi-)large programming staff
to using a combination of off-the-shelf (but customizable) applications and end-user available "tools". However, both of these require SOMEONE
somewhere designing/programming the tools themselves.

But not INDEFINITELY.
Whether this programming is done in OO, "procedural", waterfall, or whatever type environments, is not something that my "crystal ball" tells me (yet).

It really doesn't matter. Sooner or later, the job will be "done"...
If anything, my GUESS is that the "end users" of such tools will be LESS
tolerant of "iterative find a mistake and fix it" applications than users of in-house developed applications. This means that I can see the "return" of more strictly enforced "sign-off" of design, testing, development phases of the TOOLS and components that are delivered to "purchasers".


Well, your guess is wide of the mark, Bill. The iteration is required to get
closer to the goals, within the time allocated to do so (timebox). These ARE
"in-house developed applications". I have managed projects where we did this
(no guessing involved). It is foreign to the way you may have been used to
working, but it is VERY successful, PARTICULARLY with end users. (Why
wouldn't it be? They are involved in the process throughout and feel that it
is as much theirs as IT's.) Users and IT people work together to achieve
specific goals within a specified time period. Programmers write code (or
select and drop components) towards the achievement of the common goal.
Users can see something taking shape as the process unfolds, and they are
able to ensure it meets CURRENT (today's) business requirements and balance
the priorities according to the Business needs.

On such a project, users and programmers discuss requirements together and
the programmers cut code. It is not such a big leap to have smart software
cut the code. I believe that is what will happen within the next 15 years.
It is just too expensive to maintain "multi-lingual" programming departments
in-house. Smart software will take on this role. Even if it isn't
"intelligent", this lack can be compensated for by keeping Humans in the
loop. These Humans will be end Users, not technicians.

Pete.
Jul 19 '05 #134

"Karl Heinz Buchegger" <kb******@gascad.at> wrote in message
news:3F**************@gascad.at...


"Peter E.C. Dashwood" wrote:
Do you have some links on that subject?
www.aboutlegacycoding.com/Archives/V3/V30501.asp
www.aboutlegacycoding.com/Archives/V3/V30201.asp

I'm sorry, these are links to just two of the articles I have written which are pertinent. I don't normally promote my own work, but you did ask for
some links and these will at least give you an idea of where I'm coming
from.


No problem.
The links are fine.

[snip example]

There are some assumptions in the above whimsy (the computer has certain
inbuilt "concepts" - [you could think of this as a set of components with all the attributes and Methods of a CUSTOMER, for instance], a natural
language interface, a "test" database for each of its concepts, however, all of these things are possible with today's technology, and what is currently "bleeding edge" will be passe in 15 years...), but I don't think anyone with a programming background would say it was "impossible" or "infeasible".


That reminds me of 'SHRDLU' :-)


Sorry, you lost me ..."SHRDLU"?

Of course this demonstrates a solution to just one class of problem. There are many others. But there are many other solutions also...


Ever read D. Hofstadter - Goedel, Escher, Bach (I guess you did)


Nope, none of the above.

Everything I know, I learned on the shop floor. (In 38 years, even if I was
a slow learner (which I'm not...<G>), I'd have to pick up SOMETHING,
wouldn't I?)

I realise that there is a wealth of excellent work available now and I would
encourage young people to read it. (It's a bit late for me...<G>).

Try and understand that in 1965 not a lot was known about the theory of
computing, and what was known was not proliferated because it could mean
competitive advantage. Try to imagine a world WITHOUT standard "best
practices", installation standards, computing science courses, standard
algorithms even... We did things and got programs working. Next time we
wrote a program, we tried to make it "better" than the last one we wrote. (I
still do that to this day, but I realise there are things I could do better
that I am not going to change now...).
The augmented semantic network and its size seems to be one of the
big problems with this.
Anyway: We have wandered way off topic, but you gave me a hint to leave
my small little programming world (which concentrates around 3D graphics)
and start looking over the fence again.

Thanks for sharing your thoughts.


It is always a pleasure, and thank you for being interested.

Pete.
Jul 19 '05 #135
jce
"Peter E.C. Dashwood" <da******@enternet.co.nz> wrote in message
news:3f********@news.athenanews.com...
It is normal and natural for programmers to see things from a programming
perspective.

What I am suggesting is bigger than that.
I foresee a time when there will be no need for the "lower level"
programming you are talking about, Bill.
What is a "lower level" programmer? The boundaries will shift but there
will still always be the lower level...maybe it's not a bit and byte
programmer but a trainer. It's still a form of programming. Even in a
sophisticated piece of adaptive software someone would have to provide the
learning environment - initially, until that role gets replaced and we move
along the escalator again. I still see this as a role of a "programmer"
more than an "end user" - though the lines may become blurry.
The job of building the tools will be complete. And "smart software" will do
the enhancements to it.
Don't forget the cost factor. It's still cheaper to pay a labourer 80c a
day to strip bark from wood for 10 hours than it is to maintain the
machinery to do it. Money will be the determining factor of what will and
won't happen - and soon after the class struggle. I assume the same to be
more true in IT - it already is happening....free overtime is cheaper than a
good toolset.
There are already specialised tools and wizards to do specific jobs.
(Network management, for instance). Programmers are not redeveloping these
tools; you could argue that they are "complete". They work and do the job
they are designed for. In fact, it would not be desirable to have
programmers "fiddling" with them.
Will the same TCP/IP still be used in 15 years? Can we still use the
same network management tools when routers are obsolete, switches aren't
what they used to be.

Hardware *still* drives software. Plug n Play never saved the day....I
don't see tools fixing themselves to work with the hardware unless there is
another operating environment layer...made by...

Another important thing you don't consider is the power of the hobbyist and
the yearning that some people have to do things.
People could all drive automatic cars. But they still opt for manual.
People could ride in buses that drop them off at the exact destination, but
people like to drive and do. The business world isn't shielded from this.
Success of tools is still dependent on their usage. Unless people all jump
high onto a new set I think 15 years is too short for anything dominant to
come along and shape the IT world you envision. Linux didn't happen because
of Redhat or IBM but because of the underlying support from regular workers.
I only have one lifetime and I cannot be an expert in everything. Sooner or later it is necessary to trust someone else's experience.
Your assumption that there will ALWAYS be a requirement for "low level"
programmers to keep maintaining tools is, at best, arguable, at worst, just
dead wrong.
Not talking for Bill....maybe your view of "low level" is too narrow ;-)
Smart software will take on this role. Even if it isn't
"intelligent", this lack can be compensated for by keeping Humans in the
loop. These Humans will be end Users, not technicians.

What happens when the end User cannot get the results he wants...who's he
gonna call?

User: "Give me X"..
I can give you a kinda X
User: "Give me X"
I can give you a kinda X
User: "Give me X"
I can give you a kinda X
User: "Hey tech geek..can you get this thing to give me X?"
User: "Hey, where'd you go....hey tech geek...."
User: "Hellooo?..is anyone there...."
I can give you a kinda X, you still want it?

If I had to guess....Bill is too narrow and you're too wide....we'll get
something in between I'm sure (in 15 years!)

Last post on the matter - though I will no doubt keep reading :-)

JCE

btw: I hope you're a little wrong...because I still got to ride this out for
another few decades whilst you are relaxing on the shore somewhere.
Jul 19 '05 #136
Are you saying that in 20 years, a programmer wont have the tools to make
his own programming language, his own OS should he or she decide to? And
they call that progress? I call it going backwards here. If this is what
I'm gonna face in 20 years, I'll be making endless copies of DOS Linux and
maybe OS/2 so that I have the choice to do what I want. (note that I didn't
mention Windows ;-)

Here's my view of things, from my point of view, so you can't sue me for
saying this...hehehe.

We haven't even begun to touch the tip of the iceberg of what we can do as
far as software development goes. And while microsoft seems to be amazed by
its Windows, where it's been, where it is now, and where it's going to
be, since about 3 to 4 thousand programmers went into the making of Windows,
I'm not impressed by those results. This having been said, so far, all
we've done for all these decades, is make the computer do what we don't want
to do. (Hefty calculations, any repetitive tasks, games (not for the same
reasons of course :-). But we haven't even begun to tap into the potential
that's ahead of us.

To me what you are suggesting is that we let the others come up with the new
stuff, give the users the ability to adjust/change what the user did through
the use of somewhat flexible modules, and that's it for the programmer? I'm
thinking much longer term than that. After this step of yours happens, do
you really think that everything will have been made that can be made in the
whole computer industry? I beg to differ, as this approach to the future
of computing is one of many thousands of avenues, and I'm not saying there's
only that way out of it, even if this ever gets made, it wont close the door
to the rest of the potentials that still are, to date, untouched.

But that's my vision of it, once your implementation exists and is stable,
do you think the users, ever so evolving as you say (which I do have to
agree that they are) will stay contented with this? That they won't want
more? Give a man an inch, he'll take a foot, etc etc etc....I don't see that
human behavior stopping anytime soon. To stop that human behavior, we might
as well stop populating since after 5 billion people we can safely assume
we've conceived every possible kind of human being? not at all :-). far
from it. And the same goes for programming. Your view is one of many
parallel views, views that will all equally evolve, each in their own
specific ways, each towards very specific and unique goals. And as long as
they are computers, there will be programmers. And programming languages
that will range from low level to high level. The way Pascal is adjusting
to the current reality of development, I don't doubt that it can adapt to any
new programming concept we can throw at it. It's been doing great at
adapting thus far.

Remember, software development is not a user only oriented concept. :-) at
least not in my book.

And that's my 0.02 cents worth :-)....(ok maybe there's a couple dollars in
there instead :-).

--
Stéphane Richard
Senior Software and Technology Supervisor
http://www.totalweb-inc.com
For all your hosting and related needs
"Peter E.C. Dashwood" <da******@enternet.co.nz> wrote in message
news:3f********@news.athenanews.com...

"Dr Engelbert Buxbaum" <en***************@hotmail.com> wrote in message
news:bh*************@news.t-online.com...
Paul Hsieh wrote:

COBOL and Pascal (the other groups you crossposted this message to)
will decrease in usage over time, not increase. There is absolutely
no new serious development being done in either language. In 15
years, Pascal will probably be completely dead, and the COBOL
community will be reduced even from the size of today's community
(human mortality alone will guarantee this.)
This may be true for COBOL, but Pascal is very much alive and kicking,
in the form of Delphi/Kylix. I am currently writing Kylix software, most
of the cutting edge routines (that do the real work rather than the user
interface) are straight plug-ins of 15 year old Turbo-Pascal code. Now
with Borland going for cross-platform (Windozze/Unix) compatibility
there is no reason why Pascal should die in the foreseeable future.


There are 400,000,000 reasons why ALL procedural languages (including

COBOL and PASCAL) should "die" in the not-too-distant future. (I don't know your
definition of "foreseeable" but mine is around 20 years...)

They are the number of people who access the internet every day. (For the
sake of this argument, I'll call them the "user base"...) They are not about to become "computer programmers". Instead, they will demand better
interfaces, smarter software, and MUCH better ways of developing computer
systems than sequential Von Neumann code. Most of them are "smarter" and
more "computer literate" than their predecessors of even 10 years ago. They
are not intimidated by computer technology, will happily interact with smart software to achieve a result, and are not prepared to rely on and wait for, remote, faceless, technocrats to provide them with computer solutions to
business problems.

The Marketplace calls the shots.

We may have our own favourite Languages and we can poddle away in a corner
somewhere cutting code for the fun of it, but the real world demands that it get solutions. By 2015 a new generation of development software will see
"programmers" removed from the loop and end users interacting and iterating with smart software until they get what they want.

History will show that "computer programming" was a phenomenon of the latter half of the 20th Century. (OK, I'll concede a couple of decades of this
Century, MAYBE, it is early days yet... <G>)

Procedural code is already into Gotterdammerung. It takes too long, requires too much skill, is too inflexible (the accelerating rate of change in the
Marketplace and in technology is another reason why it is doomed to
extinction) and, overall, costs far too much.

For the last 5 decades Commerce and Industry (Scientific computing is a
horse of different colour, and is excluded from this discussion...) have had to put up with it because there was no other alternative. Now they don't.
The advent of packages that actually work, with re-usable components that
enable them to be easily tailored and deployed across different platforms
are making Corporate Board members query the amounts being spent on in-house IT. As fast as they get people trained they either leave or need new
skills... Why bother? Why should an Insurance company spend $50,000,000 a
year on in house IT when they could buy the service for $10,000,000? (Their Business is Insurance, not IT...expenditure of the scale we have seen
traditionally, can no longer be justified. You have to sell a lot of
policies to make $40,000,000...)

The only thing that COULD save procedural coding of solutions would be if
it priced itself back into the market. This MIGHT happen with offshore
outsourcing, but it is unlikely.

Bottom Line: Don't get smug about COBOL dying and PASCAL surviving; they are on the same parachute and the ground is coming up....

Pete.

Jul 19 '05 #137
Quoted: " SQL Server for example has a "drag and drop" tool that allows
processing streams to be built in minutes. These same streams using
procedural code would take days."

funny, me in 15 years, I dont see microsoft in the picture. ;-)

--
Stéphane Richard
Senior Software and Technology Supervisor
http://www.totalweb-inc.com
For all your hosting and related needs
"Peter E.C. Dashwood" <da******@enternet.co.nz> wrote in message
news:3f********@news.athenanews.com...

"Marco van de Voort" <ma****@toad.stack.nl> wrote in message
news:slrnbjhm1i.2gg0.ma****@toad.stack.nl...
In article <3f********@news.athenanews.com>, Peter E.C. Dashwood wrote:

"Dr Engelbert Buxbaum" <en***************@hotmail.com> wrote in message news:bh*************@news.t-online.com...
> Paul Hsieh wrote:
>
>
> > COBOL and Pascal (the other groups you crossposted this message to)
> > will decrease in usage over time, not increase. There is absolutely
> > no new serious development being done in either language. In 15
> > years, Pascal will probably be completely dead, and the COBOL
> > community will be reduced even from the size of today's community
> > (human mortality alone will guarantee this.)
>
> This may be true for COBOL, but Pascal is very much alive and kicking,
> in the form of Delphi/Kylix. I am currently writing Kylix software, most
> of the cutting edge routines (that do the real work rather than the user
> interface) are straight plug-ins of 15 year old Turbo-Pascal code. Now
> with Borland going for cross-platform (Windozze/Unix) compatibility
> there is no reason why Pascal should die in the foreseeable future.

There are 400,000,000 reasons why ALL procedural languages (including COBOL and PASCAL) should "die" in the not-too-distant future. (I don't know your definition of "foreseeable" but mine is around 20 years...)
Really? Please name and discuss them.

<G> I'm sure you know some of them personally...
They are the number of people who access the internet every day. (For the sake of this argument, I'll call them the "user base"...) They are not about to become "computer programmers".


Indeed.
Instead, they will demand better interfaces, smarter software,


True
and MUCH better ways of developing computer systems than sequential Von Neumann code.


On the contrary, especially for these kinds of users, sequential jobs are a
way of thinking that is normal to them.


Well, Marco, I wonder how long it is since you looked? Software tools are
already emerging that substitute iteration and interaction for sequential
processes. SQL Server for example has a "drag and drop" tool that allows
processing streams to be built in minutes. These same streams using
procedural code would take days. What's more, if you get it wrong you can
simply go to a graphic interface and change it. I have seen at least one
Graphic design package that uses a similar principle. Non computer

literate designers can easily manipulate these tools, interact with them, iterate
their processes, until they achieve what they want. Programming knowledge is NOT a requirement. Currently, tools like this are in their infancy. In 15
years we can expect significant improvement.

Most of them are "smarter" and more "computer literate" than their
predecessors of even 10 years ago.

Yes. They are not scared anymore. OTOH the requirements on them have
severely increased also. I sometimes doubt if increased computer literacy
actually kept up with the added computer tasks for the avg person.

The computer skills of the Business are rising very rapidly.

They are not intimidated by computer technology, will happily interact
with smart software to achieve a result, and are not prepared to rely on and wait for, remote, faceless, technocrats to provide them with computer solutions to business problems.


Yes, they want smug buzzword talking con-men to take advantage of them ?

No, they're getting pretty wise to that one too...in fact, most of us are.
We may have our own favourite Languages and we can poddle away in a corner somewhere cutting code for the fun of it, but the real world demands that it get solutions.


Exactly. So as long as my solution is good, and I can justify using a
language, what is the problem?

It isn't enough just to provide a solution; it has to be an acceptable
solution. That means using tools and methods the users are comfortable

with. In 15 years they WON'T be comfortable with some old academic cobbling code
together for a solution... By then they will have bypassed the need for
coding and will be implementing their own solutions. That was the whole
point of my argument. They are doing it already... More and more Business
departments are gaining enough computer literacy to build their own systems using standard solutions like spreadsheets and databases. The last place I
worked (a major utility in the Midlands of England) there were more people
in the Business with Computer Science degrees, than I had on my IT staff. My guys often hit problems incorporating user developed backyard systems into
the corporate IT strategy. But it improved, and with some guidance from us
they were doing pretty good stuff when I left in April.

There is no problem. I never suggested there was one. You can go ahead and
use procedural code for the rest of your natural life. (I intend to...). You just won't make a lot of money at it. It'll become a "cottage industry" by
2020...<G> The "boom" was in the late 70s to the early 80s. We wrote our own tickets. The advent of the PC and the dissemination of computer skill to
"everyone" killed the Golden Goose. Personally, I'm glad to see computer
technology being utilised and within the reach of most people, but I'd be
stupid to pretend it hasn't hit my pocket.
By 2015 a new generation of development software will see "programmers" removed from the loop and end users interacting and iterating with smart software until they get what they want.
Sure. The telepathic kinds.


The process of iteration, as you would know if you had ever worked in a

RAD environment, does not require telepathy. It does require good communication. Your scorn is misplaced. Interaction and iteration enable HUMAN intelligence to get in the loop, but does NOT require specific technical (i.e.
programming) skill.

Procedural code is already into Gotterdammerung.
It takes too long, requires too much skill,


Programming is what requires the skill. Not the language. If you studied

programming
closer, you'd know that.


LOL! While I note that you are at a very reputable University (apparently
learning to use procedural code...), I can assure you I have studied
programming for the whole of my working life (some 38 years - I started
programming computers in 1965 - What were YOU doing then <G>?), not behind
cloisters but in the real world. Leaving aside your intended slight, I

agree that programming does require skill, but it was you who turned my statement into a separation between programming as a skill and programming as an art. I said that "Procedural Coding" is in decline. That includes the Language
and the Art...

I wonder why your response is so vitriolic? I didn't set out to attack you. Could you be a little sensitive to the truth of what I'm saying?
is too inflexible (the accelerating rate of change in the Marketplace and in technology is another reason why it is doomed to extinction) and,
overall, costs far too much.
And where are your references for that? You don't even say what it is up
against, except some vague references about software which is going to
emerge as a winner in 2015 (and which I assume is telepathic, at least if I
see your description)
The typical response of the student. Are you saying that, without a
reference, you would question whether there is an accelerating rate of
change in computer technology?

OK, Alvin Toffler, Moore's Law, and the fact that I have to get a new
computer every 18 months...

As for my knowledge of the Market place, I have worked in industry IT
services all my life. It is axiomatic to me that the Business needs are
accelerating and greater flexibility in response to changing and new

Markets is required in IT today than was the case even 5 years ago. I don't need a
text book to tell me this; my users are drumming it into me every day... I
can SEE the need for flexibility in system design and implementation. Thank goodness there are tools and systems that are addressing this need.
(Client/Server, distributed networks, OOD and OOP are all paradigms that are much more flexible than the traditional mainframe Waterfall methodology, and coincidentally, none of them is tied to Procedural Coding...)

skills... Why bother? Why should an Insurance company spend
$50,000,000
a year on in house IT when they could buy the service for $10,000,000?
Ah, but could they, and with the same secondary securities? Price is not

the only
point of competition.


My figures are based on a real case. The Company concerned sold their IT

and leased it back. They did this when they had a bad year due to claims for
floods and droughts. It is interesting that in the "good" years they took no action. Try telling a Board of Directors faced with a huge cash flow
requirement, that "Price is not the only point of competition". Even if
you're right (and I don't disagree with the statement) you will not help
your career...
The only thing that COULD save procedural coding of solutions would
be
if it priced itself back into the market. This MIGHT happen with offshore
outsourcing, but it is unlikely.

Bottom Line: Don't get smug about COBOL dying and PASCAL surviving;
they
are on the same parachute and the ground is coming up....
Bottom Line: I think we can safely award you the "troll of the week"

award, with
"don't panic" in nice friendly letters.


Well, I always enjoyed the Hitchhikers Guide to the Galaxy, but I have

never been a troll. You have no idea who you are dealing with <G>.

Pete.

Jul 19 '05 #138
On 13 Aug 2003 09:22:24 -0700, ru**@webmail.co.za (goose) wrote or
quoted :
bottom line: code written in java runs on less platforms than code
written in C.


Bottom line is the odds of a Java app running correctly without
modifications are much higher than for a C program. C programs don't
specify the desired behaviour nearly strictly enough. To make code
that runs on many platforms you have to create a quite extensive set
of macros, one library for each platform.

Even the size of an int is not nailed down for heaven sake.

--
Canadian Mind Products, Roedy Green.
Coaching, problem solving, economical contract programming.
See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Jul 19 '05 #139


Roedy Green wrote:
On 13 Aug 2003 09:22:24 -0700, ru**@webmail.co.za (goose) wrote or
quoted :
bottom line: code written in java runs on less platforms than code
written in C.


Bottom line is the odds of a Java app running correctly without
modifications are much higher than for a C program. C programs don't
specify the desired behaviour nearly strictly enough. To make code
that runs on many platforms you have to create a quite extensive set
of macros, one library for each platform.

Even the size of an int is not nailed down for heaven sake.


Roedy,

Haven't been there but lived real close in Guildford at one stage -
reading you C and Java people, I feel like a Wimbledon spectator with my
head zinging from left to right as the opponents take a swipe at the
ball !

For the uninitiated it really is difficult to balance the truth between
your opposing camps. Is there anywhere, but anywhere, where the observer
can get a reasonably unbiased balanced view of pros and cons per
language ?

Of course I could suggest if you have a real problem, use OO COBOL <G>.

BTW - took a very quick look at your Java Glossary, and noted your
reference to lack of FIFO and LIFO in Java lists. Surely that can't be a
big deal, possibly cloning your own list class. Although
collections(lists) are included in both Fujitsu and Micro Focus versions
of OO COBOL - our J4 Standards Committee currently has collections as an
on-going topic at the moment. I doubt we'll finish up with a collection
specifically geared to FIFO/LIFO. I can handle it quite easily at
present from either an Ordered or SortedCollection :-

*>FIFO
move 1 to MyIndex
invoke MyCollection "at" using MyIndex returning anElement

*>LIFO
invoke MyCollection "size" returning lsSize
*> above gives total elements
*> then I can do either of the following :-

invoke MyCollection "at" using lsSize returning anElement
*> OR
move lsSize to MyIndex
invoke MyCollection "at" using MyIndex returning anElement

If you haven't got what you want - James Gosling's fault. (He was born
in Calgary).
Guess he should have checked the Smalltalk hierarchy more closely before
he sat down to re-invent the wheel <G>.

I might add I can invoke both C and Java, with COBOL classes written to
support invoking Java. I have no need at the moment as I have rich
support of collections and GUIs built into the product.

One comment that came up here in this thread early on was "Use the right
tools for the job", not necessarily those exact words, but a point made
often in the COBOL group. Somebody of course nodded sagely at this pearl
of wisdom. Not always, but more often than not, that phrase translates
to "Use the free or cheapest tools you can get to do the job". Can't
knock people for that attitude, but I do wish they would come on in an
'honest' mode.

"Using the right tool" - here's one that came up recently from Brazil
in my Tech Support group. "How can I emulate an on-line Help file where
you key in some characters and then the entry is highlighted in the
Listbox ?".

Quite naturally a support person suggested, "Go to this site
www.xxxxx.com and check out their OCX". I thought, "I betcha that's
possible in COBOL". It is. It was a piece of cake. Micro Focus has
values for Windows events, and it looked like some four were
possibilities. All I had to do was a quick test of the four to get the
one which would immediately trigger an event based on a single
keystroke.

Problem solved ! Having done that as an interest exercise, I can already
see where it can be RE-USED in real applications.

With so many COBOLers using old, effective and established (mainframe)
compilers, without any OO, naturally there's a whole daffy of people who
automatically address problems through C or Java, or whatever.

Note, none of the above has anything to do with the proselytizing of
components by Pete. Dashwood - I'm talking about REALLY using OO COBOL !

Jimmy, Calgary AB
Jul 19 '05 #140
Bat Guano <bat.guano@talk21dotcom> wrote in message news:<vj************@corp.supernews.com>...
goose wrote:

but java is only available for platforms that are big enough to run
it.


big like my mobile phone?


what's your point ? that java runs on your mobile phone ?

<NEWS FLASH> C probably targets that too </NEWS FLASH>

and it also targets many that java does not run on ?

so what exactly *is* your point ? java runs on a *fraction*
of platforms that C targets.
does your mobile have under a K of ram ?

thought not

goose,
java code isn't as portable as C code.
Jul 19 '05 #141
Richard Plinston <ri****@Azonic.co.nz> wrote in message news:<bh**********@aklobs.kc.net.nz>...
In article <vj************@corp.supernews.com>, Bat Guano wrote:
goose wrote:

but java is only available for platforms that are big enough to run
it.


There is Waba which is a JVM for smaller machines such as Palm and CE PDAs.
There is even a version for MS-DOS, one for GameBoy, and a version for TI
Calculators.

(see http://www.wabasoft.com )


you miss the point. not only does C also target all the platforms
that java does, it targets a whole lot more as well.

goose,
following the one true brace style :-)
Jul 19 '05 #142
On Thu, 14 Aug 2003 05:33:07 GMT, "James J. Gavan" <jj*****@shaw.ca>
wrote or quoted :

Haven't been there but lived real close in Guildford at one stage -
reading you C and Java people, I feel like a Wimbledon spectator with my
head zinging from left to right as the opponents take a swipe at the
ball !


If you want to write cross platform code in C++ you have to buy a
rather expensive cross platform library, something like Rogue Wave.
C++ or C out the box is far from being cross platform. What you can
do is write code that can be tweaked to run on another platform, or
you can create a set of macros to make the same code base run on one
or two platforms. However the C and C++ are not naturally
multiplatform. They are lower level languages. That is why they are
used for JNI when you want to get personal with the OS.

C is much more cross platform in the Unix world, but when you throw in
Windows and the Mac, you pretty well have to start from scratch. With
C and C++ you interface directly with the OS. With Java there is a
buffering layer of standard class libraries.

The language wars are for the most part kid nya nya games. You need
both C++ and Java for different things. There are not many places
where you would use pure C now instead of C++, perhaps a
device driver or some code to fit in an embedded device.

It is like arguing which is better a wrench or a screwdriver.

--
Canadian Mind Products, Roedy Green.
Coaching, problem solving, economical contract programming.
See http://mindprod.com/jgloss/jgloss.html for The Java Glossary.
Jul 19 '05 #143
Roedy Green <ro***@mindprod.com> wrote in message news:<ro********************************@4ax.com>. ..
On Thu, 14 Aug 2003 05:33:07 GMT, "James J. Gavan" <jj*****@shaw.ca>
wrote or quoted :

Haven't been there but lived real close in Guildford at one stage -
reading you C and Java people, I feel like a Wimbledon spectator with my
head zinging from left to right as the opponents take a swipe at the
ball !
If you want to write cross platform code in C++ you have to buy a
rather expensive cross platform library,


that's a lie, gtk is cross-platform *and* free.
something like Rogue Wave.
C++ or C out the box is far from being cross platform.
that's a lie. I can write a non-trivial program that will compile
with any C (where the hell did I mention C++?) compiler out-the-box,
and compile on all c89/90 compilers.
What you can
do is write code that can be tweaked to run on another platform, or
you can create a set of macros to make the same code base run on one
or two platforms.

you can do that *also*, but its not your only option.
However the C and C++ are not naturally
multiplatform.
that's another lie, C was standardised, afaik, *because*
of its ubiquity.
They are lower level languages. That is why they are
used for JNI when you want to get personal with the OS.
and break what little portability that java offers ?

C is much more cross platform in the Unix world,
that's a lie again. C is cross-platform.
period.
but when you throw in
Windows and the Mac, you pretty well have to start from scratch.
yet another lie. gtk compiles on both platforms, afaik.
With
C and C++ you interface directly with the OS.
yet another lie. show me even *one* std C function that
binds your code directly to the OS.

just one.
With Java there is a
buffering layer of standard class libraries.
it's strange that you know so little yet you post so much.
what do you do for a living anyway ?
lawyer ?

The language wars are for the most part kid nya nya games.
I certainly agree. however it does irk me that you so blatantly *lie*
so *many* times in a single post.
You need
both C++ and Java for different things.
I have not mentioned C++
There are not many places
where you would you would use pure C now instead of C++, perhaps a
device driver or some code to fit in an embedded device.
true, but if you wanted to write a crc algorithm that worked
*everywhere*, you'd write it in pure C.

It is like arguing which is better a wrench or a screwdriver.

no no, it is me pointing out that
a) java is not platform independent, it depends on the java
platform.
b) the java platform only runs on a fraction (i dunno, say 1%?)
of computing machines out there.
c) There is a C compiler for just about every computing
machine in the world. I'd certainly be interested to hear
which machine supports java but not C.

You are destroying your credibility with so many lies in a single
post.

If you *really* thought those things, well, then ... now you
know more (e.g. you know that std C is not tied to the OS).

but if you were merely ignorant and neglected to look up the
facts, then I'd have to say that you should do your bloody
homework instead of making an ass of yourself.

goose,
at least now you are more informed, are you not ? no more *myths*
about C ?
Jul 19 '05 #144
goose wrote:
Even the size of an int is not nailed down for heaven sake.


yes it is, read the std.


"At least 16 bits" doesn't count as "nailed down" for me.

--
Chris "electric hedgehog" Dollin
C FAQs at: http://www.faqs.org/faqs/by-newsgrou...mp.lang.c.html
C welcome: http://www.angelfire.com/ms3/bchambl...me_to_clc.html
Jul 19 '05 #145
WB
Karl Heinz Buchegger wrote:
Do you have some links on that subject?


http://mindstorms.lego.com/eng/products/ris/rissoft.asp

You use a GUI to drag/drop programming elements. They stick together to
create a behaviour for your robot. If/Else/Loops are all supported.

Oh, just to throw fuel on the fire, there IS a Java VM available. Do a
Google search on: lego robot java

There is also a C like language called NQC (Not Quite C).

:-))
Jul 19 '05 #146
jce
><NEWS FLASH> C probably targets that too </NEWS FLASH>

<SIGH>We *get* your point </SIGH>
goose,
java code isn't as portable as C code.


By the way, I would say that Java on a wireless phone or PDA is more
portable than a stoplight.
I don't believe I could get the latter through customs.

JCE
Jul 19 '05 #147
In article <bh**********@panix1.panix.com>, <do******@panix.com> wrote:

If the statements in question are the result of mere ignorance then they
lack the intentionality which is required of a lie... or am I missing
something?


You seem to be missing the fact that it was a flame. Logic and correct
use of vocabulary are non-issues in that context.

Cheers
Bent D
--
Bent Dalager - bc*@pvv.org - http://www.pvv.org/~bcd
powered by emacs
Jul 19 '05 #148

On 14-Aug-2003, ru**@webmail.co.za (goose) wrote:
the bottom line is that
(now read this slowly, so that it sinks in)
*code* *written* *in* *java* *will* *be* *portable* *to*
*fewer* *platforms* *than* *code* *written* *in* *C*.
Programs as written will run without change when moved to a different platform?
do you argue that ??? you'd be incredibly stupid and/or
brain damaged to argue that point.


There are other options besides your insulting ones. But when someone is into
that type of insulting, whether he has anything useful to contribute gets
overlooked.
Jul 19 '05 #149

On 13-Aug-2003, "Peter E.C. Dashwood" <da******@enternet.co.nz> wrote:
I foresee a time when there will be no need for the "lower level"
programming you are talking about, Bill.


We will no longer need to program by moving wires around?

Or we will no longer need to program in machine language?

Or we will no longer need to program using assembler?
Been there, done that.
Jul 19 '05 #150
