Bytes | Software Development & Data Engineering Community

Future reuse of code

Hi I'm developing a program and the client is worried about future
reuse of the code. Say 5, 10, 15 years down the road. This will be a
major factor in selecting the development language. Any comments on
past experience, research articles, comments on the matter would be
much appreciated. I suspect something like C would be the best based
on comments I received from the VB news group.

Thanks for the help in advance

James Cameron
Jul 19 '05
Tor Iver Wilhelmsen wrote:
GCC the compiler is pointless without libraries,
No, that is not true. On Unix, things like open() and read() are _system_ calls,
not library routines. It is possible to write an application in C without
any libraries at all. In fact, most of the libraries are themselves written in
C, so the same functionality can simply be coded directly in the program.
and libraries in C/C++ differ between platforms.
Well of course the standard libraries are _implemented_ differently, but
they provide the same programmer interface and act as a layer between the C
program and the 'platform' specifics.
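A minimal sketch of that point (POSIX-specific; the function name is mine, not from the thread): output can go through the write() system call with no stdio at all.

```c
/* POSIX-only sketch: talk to the kernel directly via the write(2)
   system call, without pulling in any of the stdio library.
   strlen() is itself a thin C library routine, used here only to
   measure the string. */
#include <unistd.h>   /* declares write() and ssize_t */
#include <string.h>   /* declares strlen() */

/* Write a string to standard output (fd 1);
   returns bytes written, or -1 on error. */
ssize_t say(const char *s)
{
    return write(1, s, strlen(s));
}
/* Usage: say("hello, world\n"); */
```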

For gcc the libraries give the same implementation on all platforms they
are implemented on - in fact gcc and its libraries probably run on nearly
every platform that full Java does, and a couple that don't have a full
Java, such as MS-DOS/EMX.

One could also argue that the Java VM 'differs between platforms' in that
the VM for one environment won't work on another. In that case the Java VM
is acting in the same way as the C libraries.
Java's libraries let you write complex applications without a million
#ifdef _LINUX_ or whatever.


You're not very widely read on this, are you? Never let facts get in the way
of a good prejudice.

It may be true that _most_ implementations of Sun's JVM at the same version
will run the same program unchanged, but it is not true that, say, a JVM
1.2 will run all Java 1.4 programs, or that a servlet will run as an applet.
There are also dozens of JVMs out there which have subtle differences which
may need to be coded around: MS J+, Kaffe, Latte, JanosVM, Alta, even
Blackdown.

These provide specialised environments or enhancements or are for platforms
not catered for by Sun.

All that you are saying is that if one only works within a very limited set
of Java, say the standard VM from Sun of a given version then all programs
will run unchanged.

One could do that for C/C++ too. One never needs an #ifdef if one only
uses those environments that never need an #ifdef.
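To illustrate what such a conditional typically looks like in practice, a hypothetical sketch: the platform dependence is confined to one small spot, and the rest of the program stays plain ISO C. (Which of these predefined macros exists is implementation-specific.)

```c
/* Hypothetical sketch: the #ifdef lives in exactly one function; the
   rest of the program can be standard C with no conditionals at all.
   Which of these macros is predefined is implementation-specific. */
const char *platform_name(void)
{
#if defined(_WIN32)
    return "windows";
#elif defined(__unix__) || defined(__APPLE__)
    return "unix";
#else
    return "unknown";
#endif
}
```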

In any case many of the JVMs are written mostly in C.

Jul 19 '05 #201
"Roedy Green" <ro***@mindprod.com> wrote in message
news:3v********************************@4ax.com...
On Fri, 15 Aug 2003 01:21:55 GMT, Kevin Easton
<kevin@-nospam-pcug.org.au> wrote or quoted :
It is impossible to
write incorrect Java code?


The difference is that an equivalent program in Java would either work
or not work. It would give the same results on all platforms. With
the C version, you don't know if it works on other platforms until you
test it.


This is getting silly. You do not know if *any* program in *any* language
on *any* computer works until you test it.

Donald
Jul 19 '05 #202
In comp.lang.c Roedy Green <ro***@mindprod.com> wrote:
On Fri, 15 Aug 2003 01:21:55 GMT, Kevin Easton
<kevin@-nospam-pcug.org.au> wrote or quoted :
It is impossible to
write incorrect Java code?

The difference is that an equivalent program in Java would either work
or not work. It would give the same results on all platforms. With
the C version, you don't know if it works on other platforms until you
test it.


Subject to the run-time environment being thoroughly tested to work
the same on all the potential platforms. Not unlike a conforming C
compiler, wouldn't you say?

The bottom line is, you cannot expect 1-to-1 conformance between
implementations. You should, therefore, always test your programs
on all target platforms. This applies to Java just as much as it
does to C.

Alex
Jul 19 '05 #203
On Fri, 15 Aug 2003 00:59:41 GMT, in comp.lang.c , Roedy Green
<ro***@mindprod.com> wrote:
On Thu, 14 Aug 2003 23:02:16 GMT, Kevin Easton
<kevin@-nospam-pcug.org.au> wrote or quoted :
People who *think* they need an exact-width type, rather than an
at-least width type, are usually wrong.
But then people write code thinking of only their own platform where
int is say 32 bits, and hand it to someone else whose int is 16 bits.
It does not work.


Well yes, but that's hardly surprising, since they're bad programmers.
It requires foresight
no, it requires an understanding of the limits of the size of an int.
and a macro to make that
code work on both platforms. It is thus foolish to claim C or C++
works naturally multiplatform. It requires extra effort.


it requires knowledge.
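That knowledge can be expressed directly in the code. A C99 sketch using <stdint.h>: the programmer asks for "at least 32 bits" instead of assuming plain int happens to have them.

```c
#include <stdint.h>

/* C99 sketch: the standard only guarantees int a 16-bit range
   (-32767..32767), so code that needs 32 bits should say so with an
   at-least-width type rather than assume the developer's 32-bit int
   travels everywhere. */
int_least32_t scale(int_least32_t x)
{
    /* the intermediate 700000 would overflow a 16-bit int,
       but always fits in int_least32_t */
    return x * 100000L / 7;
}
```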
--
Mark McIntyre
CLC FAQ <http://www.eskimo.com/~scs/C-faq/top.html>
CLC readme: <http://www.angelfire.com/ms3/bchambless0/welcome_to_clc.html>
Jul 19 '05 #204
On Fri, 15 Aug 2003 01:49:11 GMT, in comp.lang.c , Roedy Green
<ro***@mindprod.com> wrote:
On Fri, 15 Aug 2003 01:43:18 GMT, Kevin Easton
<kevin@-nospam-pcug.org.au> wrote or quoted :
Is it possible to write non-portable code in C? Yes.

Does it follow from this that it is impossible to write portable code in
C? No.
Straw man argument Kevin.

You made the silly assertion that ALL C code would run unmodified on
any platform correctly. That is nonsense.


If you insert the word conforming, it's correct:
"All conforming C code will run on any conforming platform correctly".

I need hardly point out that *precisely* the same is true of Java,
C++ or, for that matter, DCL. If you rely on nonstandard features of
your specific implementation, such as the size of an int, or some
class that's not part of the standard, then you're screwed.
I merely stated that writing cross platform code in C or C++ takes
considerable effort.
This remark is flat out wrong.

I've written hundreds of thousands of lines of C and a few tens of Ks
of C++, all of which was completely portable between VMS, Solaris and
NT, and it required NO special effort, merely sticking to the language
spec. The only time I had a problem was when faced with a pre-ansi
compiler on the Sun box.
You need to find third party
libraries supported on all your platforms.
Only if you want to do things not part of the standard implementation.
Just as, in fact, you would need to do with Java, if you wanted, say,
to access vacuum tubes or drive a stepping motor.
You need to generate separate executables.
You do with Java too - each implementation has a separate executable
called the Java Runtime.
You need to test separately, and you need a ton of compile time
macros to make the magic work.


Bollocks. Utter and total bollocks. Stop trolling.

--
Mark McIntyre
Jul 19 '05 #205
On Fri, 15 Aug 2003 20:28:48 GMT
WB <su*****@bossi.com> wrote:
Kevin Easton wrote:
In comp.lang.c WB <su*****@bossi.com> wrote:
Roedy Green wrote:

Don't be silly. Look at any C code designed to run on many
platforms. It is riddled with macros to pull off the feat.

You can do a few simple things like parse the command line, open a
flat file in a platform independent way, but not even the size of
int is guaranteed unless you play some games with macros.

Well, no you can't open a file in a generic way. Opening a file on a
PC is different than opening a file on a mainframe. You need a
special library to do this:

#ifdef I370
#include <lcio.h>
#endif

Which works with the SAS C compiler.

Plus, all your variables and function names need to be 8 characters
or less, AND mixed case is ignored, so VarA and VARA are the same,
thanks to the mainframe linker.

At least it was a few years ago. Things may have changed since
then....

If it doesn't support fopen, then it's not a hosted C
implementation.


Who said anything about fopen? You still use fopen, but MUST have the
additional include file, which adds support for the mainframe way of
doing things.


In which case it is *not* a conforming hosted C implementation. To be
conforming the inclusion of stdio.h to provide a definition of the FILE
type is entirely sufficient. Of course, you might need to use something
else to deal with directories etc.
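For reference, what a hosted implementation is required to provide already covers ordinary file access; a sketch (the function is my own example) using nothing outside <stdio.h>:

```c
#include <stdio.h>

/* Sketch: any hosted C implementation must supply FILE, fopen, fgetc
   and fclose via <stdio.h> alone - no platform-specific header needed. */
long count_lines(const char *path)
{
    FILE *f = fopen(path, "r");
    if (!f)
        return -1;            /* could not open */
    long n = 0;
    int c;
    while ((c = fgetc(f)) != EOF)
        if (c == '\n')
            n++;
    fclose(f);
    return n;
}
```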
> If it doesn't handle identifiers case-sensitively, then it's not a
> conforming C implementation at all.


The mainframe C compiler is quite happy with long var names and mixed
case. It is the linker which falls over. Actually it only "sees" up to
8 characters, but does not complain at all. This leads to very
interesting behaviour :-((


From memory I think the 8 character limit is legal but I'm not sure
about the case sensitivity. I always avoid having names only
distinguished by case, so it has never been an issue for me.
Oh yes, I also forgot about trigraphs:
[ becomes ??(
] becomes ??)
^ becomes ??'
~ becomes ??-
Trigraphs are the other way around. You put "??(" in the source and it
is replaced by "[".
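The mapping can be shown in standard C itself; a sketch (the function name is mine). Note the "?\?" escapes, which exist precisely to stop a trigraph-processing compiler from rewriting the string literals. Modern compilers often need a flag (e.g. gcc's -trigraphs) before they process trigraphs at all.

```c
#include <string.h>

/* Trigraph lookup sketch: "??(" in source text is replaced by "[" in
   translation phase 1.  The "?\?" escape keeps the literals below from
   being rewritten by a trigraph-processing compiler. */
const char *from_trigraph(const char *tri)
{
    if (strcmp(tri, "?\?(") == 0) return "[";
    if (strcmp(tri, "?\?)") == 0) return "]";
    if (strcmp(tri, "?\?'") == 0) return "^";
    if (strcmp(tri, "?\?-") == 0) return "~";
    return tri;  /* not one of the four shown here */
}
```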
I ended up writing a converter which takes the original C code, and
converts it to something that will run on the mainframe. Yuck.


Sounds like the mainframe did not have a conforming C implementation to
me.
--
Mark Gordon
Jul 19 '05 #206
On Fri, 15 Aug 2003 14:27:31 +0000 (UTC)
Marco van de Voort <ma****@stack.nl> wrote:
In article <20******************************@flash-gordon.me.uk>, Mark
Gordon wrote:
Marco van de Voort <ma****@stack.nl> wrote:

Definitely.
Platform independent code happens naturally with almost no effort in
Java. It requires considerable effort in C++. You can't
convince me otherwise because I have done both.

Me too for the opposite :-)


I've not done C++,


Only a bit. (Pretty much the amount needed for Delphi compatibility -
basic classes and such.)
but I have done Pascal (had to use non-portable
features)


Like?


Trapping the "STOP" key on an HP workstation. Only used this on the HP.

Separately compiled files (has that been added to the standard?) Handled
differently on the different Pascal implementations I used.

A non-standard way to access third party libraries. Handled differently
on the different Pascal implementations I used.

Probably a number of other things that I can't remember now.

I think the Delphi diverged even further, although I helped people with
Delphi rather than using it myself.
--
Mark Gordon
Jul 19 '05 #207
[Please observe followup to comp.lang.c; thanks.]

Mark McIntyre wrote:
It's impossible to write a "strictly conforming" programme.


int main(void)
{
return 0;
}

--
Richard Heathfield : bi****@eton.powernet.co.uk
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html
K&R answers, C books, etc: http://users.powernet.co.uk/eton
Jul 19 '05 #208
[Please observe followups to comp.lang.c; thanks.]

Mark McIntyre wrote:

<snip>

"All conforming C code will run on any conforming platform correctly".


Counter-example:

void main(void)
{
}

This program is conforming (even though it invokes undefined behaviour),
because at least one conforming implementation accepts it. On the other
hand, it's not difficult to find at least one conforming implementation
that rejects it.

--
Richard Heathfield : bi****@eton.powernet.co.uk
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html
K&R answers, C books, etc: http://users.powernet.co.uk/eton
Jul 19 '05 #209
Apologies in advance guys, but this is a long-winded one - to do with the
application of OO COBOL, though some of it certainly applies to the other
languages.
YOU'VE BEEN WARNED - SO QUIT HERE IF NOT INTERESTED!:-
--------------------------------------------------------------------------------

"Peter E.C. Dashwood" wrote:
"James J. Gavan" <jj*****@shaw.ca> wrote in message
news:3F***************@shaw.ca...

Jimmy,

your post raised a few questions.

Perhaps you could elucidate...?

How can a queue (FIFO) and a stack (LIFO) be implemented with an ORDERED or
SORTED Collection?

By definition, these collections have to be UNORDERED or UNSORTED.

I certainly hope J4 are not going to overturn the accepted concepts of queue
and stack in order to implement these collection types. But then, nothing
they did would surprise me...

Trying to trip me up with superior knowledge <G>. Mentioning Sorted Collections
was definitely a slip of the pen! They would be a non sequitur - either
elements are added according to the default sort or you sub-class to get your
own customized sort!

However Ordered Collections - Yes. Recall I once suggested Sequential
Collections - but you didn't like that definition. To clarify from my on-line
help :-
--------------------------------------------
OrderedCollection
An OrderedCollection manages the storage of elements in the order in which they
were added to the collection. One of the main advantages of OrderedCollections
over Arrays is that they can grow dynamically as more elements are added. You
can use OrderedCollections to implement stacks and queues.
----------------------------------------------------------

Probably the most common use being to read in data and the latest record is
added as the last element (the COBOL file sequential approach). However, there
is nothing to stop you inserting at an index position of your choice - when you
are aware of 'positioning'. The sequential approach implies "add" (at the
bottom) with no reference to an index. Naturally with callbacks you can iterate
against all the elements in the collection, plus when a condition fails you can
get out of the callback with "quitIteration".

My example was an attempt to show retrieval in its simplest form by index.
Supporting the Ordered Collection is the PRIVATE class Sequenced Collection. Be
assured there are methods to achieve the points you made, remove one (element)
or block (of elements), replace one or block etc., even "reverse" the order of
the collection. You may not be thrilled with J4, but they aren't a bunch of
dummies - and I'm sure they will cover all points. As to FIFO/LIFO - difficult
to see that I would have a use unless I ever got into Simulation.
If you haven't got what you want - James Gosling's fault. (He was born
in Calgary).
Guess he should have checked the Smalltalk hierarchy more closely before
he sat down to re-invent the wheel <G>.

Is he "REALLY using OO COBOL"...?


And you so often accuse me of mis-reading you! How the hell did you arrive at
the above crummy statement? The two-volume tome 'Core Java' is quite candid that
Java had an initial set of lists - subsequently they had to back-track and add
on features which had been missed. Not my words, but 'Core Java' published by
Sun !

Using the free or cheapest tools you can get to do a job is quite distinct
from using the best tools available to do the job.

If you can't afford to use the "right" tools for the job I don't see any
dishonesty in saying so.

Eiter way, it doesn't alter the fact that using the right tools for the job
(rather than making whatever tools you have fit the job) is a preferred
strategy.

Agreed. When originally writing I was thinking of including the words 'best
tool', but let them slip by. However that is a very subjective thing, 'best'.
For the most part I would suggest that most language users in this thread would
*prefer* to facilitate features in their 'parent' language - the one they are
most comfortable with. If all else fails, then they look at other languages,
particularly if they can find a pre canned routine/function. And it is not a
question of making 'your tools fit the job'. That is ridiculous - first call
does your language handle it, (may require a bit of research for methods
available), if not, then go the 'outside' route.
As you don't quote the OCX site, we can't judge for ourselves whether the
suggested component was appropriate or not.
Without back-tracking to look up the site - be assured it DOES have rich
features, (but from the marketing blurbs, possibly some overloading - see
below). BUT without getting into OCX the point was it was easily achievable in
COBOL - and I might add, by RE-USING a large chunk of classes which I had
already written.

Trapping a Windows or keyboard event is feasible in any language that deals
with GUI development. What makes COBOL (even OO COBOL) MORE "right" for this
than VB, C, C++, or Java?
Again your misinterpretation of what I wrote. If your preference is COBOL then
COBOL is MORE 'right' if it lets you do something easily. If your 'parent' is VB
why would you go looking at C/Java for the solution - if it is easily attainable
in VB ?

People who used the recommended OCX component wouldn't care what language
they embedded it in, and they have the same re-use advantages that you
realised by doing it in COBOL.

The point here is that components transcend languages.

(That is one reason why I personally consider them a better solution...)
As stated above, I think the general tendency will probably be language of
preference. (Don't really see too many of them invoking COBOL - do you ?). If
however, and to quote your phrase, "It's only programming', and you believe
something is more viable in another language, then go for a potpourri - the
latter approach implying you carry around more baggage with your application
because you are now adding libraries for both languages.

I suggested baggage, accompanying components could be a problem - you summarily
dismissed that as irrelevant. It can come to haunt you. Recall Jerry's problem
with that very large file - and the attendant problem of getting extracts over
to user sites. He concluded he would drop the project and take up pig farming;
as an alternative he suggested another solution - along with the data give each
user site a free disk <G>. Difficult we know with an existing and functioning
database - but the true answer goes back to design - that file design should
probably be seriously revamped. Baggage or volumes are not something that can
be dismissed lightly. The frugality on storage when this game started some 40-45
years ago, is just as valid today, providing it doesn't impede performance.

This is a total non sequitur. What has my support of components got to do
with anything you posted? The implication seems to be that by opting for a
component based approach, I am not "REALLY" using OO COBOL.

That is unfair and uncalled for.

I don't really mind whether you think I do so or not, and it is just one of
the tools I use, but I have been using it long enough that I was able to
post an analysis and explanation of it to an ASP based Web site, at a time
when you were assuring the COBOL community that "nobody understands OO
COBOL".
A deliberate misquote on your part. Certainly I had initial difficulties getting
a handle on OO and a certain gentleman PECD is on record as saying the same
thing ! I am now a reasonably happy camper - but in view of a quote below
coming from you - ????????

(Here's the link, in case you've forgotten...:
http://www.aspalliance.com/aldotnet/...boldotnet.aspx)
The same link also contains the advice regarding "best tools for the job"
which you were so dismissive of above...)
This is the first time I read it, but I do recall a COBOLer jumping into clc and
asking about this MS .NET example. He was somewhat pissed off when I declined,
along the lines of any neighbour you might have up the road: "Hey, you program
all day long. Why can't you tell me why my disk doesn't work or my screen
doesn't come on?" To describe it with some detail, I think, required knowledge
of the interplay between MS and Fujitsu NET COBOL - I was not prepared to trip
over my feet!

Not our Pete, this was a challenge not to be refused. Sorry but your quotes in
RED are not much more helpful than a Dick and Jane classic ! Spot didn't even
get a look in. To be absolutely blunt - why did you even bother ?

Let's educate the masses here who may not be familiar with COBOL :-

COBOL has four Divisions :-
-------------------------------------
IDENTIFICATION DIVISION.
ENVIRONMENT DIVISION.
DATA DIVISION.
PROCEDURE DIVISION.
------------------------------------
OK guys get cracking and start coding in COBOL. You want more ? You want jam on
it ?.

Now your understanding has advanced to the point where you know how to
handle some Windows events in it...

Keep going...eventually you may learn how to build components with it...

If that happens, maybe we can talk.
Typically snotty wind up from you - although this time you haven't attempted to
wrap your venom with a "<G>", which of course is a caveat which translates to "
Hey ! I was only kidding", if the recipient feels offended.. (That comment on
your style was initially observed by somebody else, not me !).

Maybe we can talk ? Not likely - unless you are considerably more forthcoming in
the future. While I accept that Components ARE viable, (in certain areas), - I
don't buy into your approach. Forgive me, but it appears that you've arrived at
a set of conclusions, have cast your thoughts in concrete and then made the
problems fit the solution.. In a recent message Robert used the phrase "Your
approach was myopic". I'm afraid I agree. I said a while back he is no dummy,
but his diplomatic skills definitely need a brush-up <G> Any criticism of your
perceived and 'correct' approach results in a snarl - as in your recent snotty
reply to Donald about components.

I'll mention it seeing as you've plugged your 38 years in IT. As of this June
I've been 40 years in the game, starting in my beloved Zummerzett (Somerset). I
still prefer to call it EDP. The first 12 years spent in design - that's the
stage before wee programmers started doing chicken scratches on coding pads.
Without question you can probably lose me in programming techniques - but don't
even try when it comes to design, bearing in mind I was using random access from
day one. Get the design wrong and you are on a hiding to nothing, no matter how
much clever coding you put into it..

I've certainly been bewildered over the years by your view of OO COBOL.
Clarification has only recently surfaced. Let's take some points in progression
:

1. You used an early version of Fujitsu OO COBOL which DIDN'T include
collections. You could get those coupled with a fast Sort package for about
another $1,000 (?). Surprise, surprise. I wonder how many didn't buy that
extension - so they programmed ignoring collections(lists). I assume at the time
you fitted into that category. Since then collections may be a part of later
versions of F/J OO - but I've never seen you make reference to them.

You may have your viewpoint - but it's my firm belief that lists/collections are
an inherently important feature in the OO family, and working without them is
like having one arm held behind your back. Lists have always been a part of
design, even using procedural COBOL - I recall mention of Robert designing
chained lists. Similarly, I think Michael did the same with his good ole
favourite - BASIC.

2. Some 5 or 7 of us started chatting privately about OO COBOL. Out of the blue
you came on like the converted. You had discovered Java, but found the language
'underwhelming'. You evangelized how good Java was and started distributing Java
articles to us. Huh ? Not just me, others must have felt the same - 'We are here
to discuss OO COBOL, what has Java got to do with it ?".

3. Then the Java mode went silent. Quietly you had back-tracked to Fujitsu to do
your components - sure I know you will dispute that, but the first indications
you ever gave were that you were using Fujitsu, plus the follow-up, that Fujitsu
generates your intermediate code, so you really don't have to worry too much
about OO.

4. In the C.L.C News Group I have from time to time posted sample code, on one
occasion explaining for Warren about instantiation. This particular message was
followed up by a snide remark from you about the irrelevance. "What gives ?", I
thought, "Where's his hang up ?".

5. Then I made the mistake of saying that before you even start coding you first
have to design your class hierarchies. (I bet you missed it - Roedy Green (Java)
made the same point in one of his first messages in this thread). Back you came
- No! You don't have to do it that way. "Huh?", I'm saying again to myself.

6. Within clc we got into some friendly discussion on Components and you
attempted to spell out your String2Num - but it was never specific code and your
quotes were, "You could do it this way or this way.....". I countered by
suggesting that using a generic dialog, and creating instances of it I could
achieve the same for a series of dialogs. I posted the code specific to an
entryfield. Your response, "It may work for you, but it appears to contain too
much maintenance". I could arrive at two conclusions from that, (1) You didn't
have a bloody clue what I was describing, or (2) Having settled on Components as
your religion of choice - no deviations allowed, no prisoners taken ! My guess -
# 2,with a slight element of #1 - because you probably didn't even read it !
Note I am NOT claiming to have found the answer, just an alternative approach,
which at the moment works - as I've said before - down the road I might fall
flat on my face. In your case that couldn't possibly happen could it ?

7. I'm sorry but your mindset is often illustrated in two of your pet phrases,
"I am not convinced.....", or "I am not persuaded....". See - I do read what you
write, and although I don't go in for copious cross-reference of messages, I do
have a bloody good memory. I'll be SOL if Alzheimer's ever kicks in !

8. Having run through the above, which conveys my bewilderment as to your
thinking, for me we now arrive at the real kicker :-
8th August
"OO for me is only important insofar as it provides the key to component
building. There are too many disadvantages in pure OO...lack of cross platform
Classes, difficulty in calling one set of classes from a language based on a

different Foundation, and so on. Simply by wrapping the objects as
components, all these disadvantages disappear and they can be dropped

anywhere. My fervour is for components, not OO, although I also believe that
OO is a superior model for programming than procedural.".


The first sentence implies OO is a non-starter - it doesn't fit your thinking.

Your list of disadvantages - "lack of cross platform classes" - I assume here
you are referring to portability - the tug-o-war our C and Java friends are not
winning (or as Howard said, 'Neither are they convincing anybody'). Certainly of
great significance if you are MARKETING components. But check those component
sites - so many modules have been written to address specific languages -
excluding COBOL ! Get into a master and servant relationship, the boss legally
owns the components you design and you aren't going to sell them anywhere - not
unless the boss wants to market them.

"Difficulty in calling one set of classes from a language based on a
different Foundation". Not in my book. I can happily invoke Java from my COBOL
compiler, coupled with COBOL classes written to specifically invoke Java. Same
goes for C with specific data types to handle inter language operability - a
point stressed in the Standard COBOL 2002. (Don't go asking for samples - I
don't use either feature. Disbelieve me if you wish - or go take a peep at
microfocus.com home page and see entries for Java, plus examples).

Note carefully, I am not rejecting components, but the argument that they are
the panacea or cure-all for the future of IT. And in the process your
conclusions have led you to dismiss major OO features. We now have OO-COBOL and
a sub-product called Dashwood-OO COBOL - just leave out the bits you think don't
matter.

You referred us to the following article :-

http://www.aboutlegacycoding.com/

I thought I'd check, as I didn't recall it had any source code. Correction it
does :-
------------------------------------------
Environment Division.

Repository.
Class OLE AS "*OLE".

Working-Storage Section.
01 S2Nobj-Name pic x(128) value "STRING2NUM.String2Num.1".
01 S2Nobj OBJECT REFERENCE OLE.

Procedure Division.

invoke OLE "CREATE-OBJECT"
using S2Nobj-Name
returning S2Nobj.

Now I can get the component to display its Properties by:

Invoke S2Nobj "ShowStuff" using strInput

Cf. the JavaScript above.

S2N.ShowStuff(strInput)

It isn't SO different, is it? (And you thought this OO stuff was hard <G>)
-------------------------------------------------------------------------------

Again something you probably missed from Roedy - he also commented that OO
developers needed to attain a skill set.

I didn't write the following - I think I got it from clc :-
-------------------------------------------------------------------------------
$set ooctrl(+P)
class-control.
word is class "$OLE$word.application".
working-storage section.
01 theWord object reference.
procedure division.
invoke word "new" returning theWord
invoke theWord "setVisible" using by value 1
invoke theWord "Quit"
invoke theWord "finalize" returning theWord
stop run.
-------------------------------------------------------

We *know* they both work, because you are going to tell me the first one
definitely works. And I'm going to tell you the second works because I compiled
and ran it.
BUT - to describe a concept - they are both as useful as teats on a bull !

Yours was an enjoyable article - But the above - ZOWEE ! That really gives me a
lot of information if I want to get into Components. Sorry, but I'm the Doubting
Thomas from the Twelve. Until I can see and touch the source - I ain't buying. I
want to see a concept liberally supported by source before I make my *own*
judgement, before I go down a different path. (If I want to follow through on
Components, what better than the Aussie academic, Brian Henderson-Sellers -
oodles of articles and diagrams, plus I can buy an armful of books from him, (he
is the author of course), on Component generalizations - but no source !).

Now here's the real problem I have with Components, because although you have
supplied the weapon you haven't illustrated the ammunition.

Let's talk something basic like dates. Please produce me a component. Well
certainly in COBOL we are already well armed with at least some eight Date
intrinsic functions which can be CALLED in traditional fashion from a COBOL
program. I don't use them all, but if I did would put them in a super class
"Dates" as individual methods. Coupled with those, there are generalized
methods, (mainly PRIVATE), required for date validation, format checking etc.
(Real bread and butter stuff).

Then we move on - return of validity checks and formatting output to return to
the invoker. Do you want ccyy/mm/dd, mm/dd/ccyy, dd/mm/ccyy, and all the other
possible permutations, or spelled out with names of months and days of week,
(both full and abbreviated). Throw in you also want spacing or hyphens or
obliques. So I live in Canada - better give me English and French. I live in the
US better give me those two plus Spanish and Portuguese. I don't need to expand
- go into the possibilities in Europe. Somewhere down the road our Asian friends
get into the act.
Do I really want a component that does all the above, even providing me with
Hindi and Urdu !

It doesn't require much visualizing to realize that this component is a MONSTER.
In response to queries from Howard Brazlee, you were adamant you don't maintain
components; you write a new one. Really ? How quickly do we arrive at Date
Component Version 1005 ?

The only sensible solution is the one that Roedy alluded to - we have to design
classes and in this case a whole hierarchy just for Dates, super doing the true
polymorphic functions, and sub-classed to handle formatting and spoken
languages .
It is conceivable that such a hierarchy may well embrace something like 200 or
more classes, (albeit some of them may be tables (arrays - for you other folks)
of literals - possibly spanning only a page of source). That still leaves us at
the monster level. But, with the right tools, sub-classes can be pulled to
create selective DLLs - and for the version I want I can be assured that it is
not carrying the baggage associated with Hindi and Urdu, or perhaps, the Slavic
group of languages.

None of the above occurs as a result of starting to code, although prototyping
can begin. The first step is the overall hierarchical design.

Would you seriously attempt to cover the whole gamut above in one of your
component structures ? Possibly you could, but to achieve it you need
sub-classing - and I'm not getting the message that is what you are doing. I
have no intention of going into detail here to spell it out, but it is clearly
obvious to me, and many others into "true" OO, all the above can be achieved via
hierarchies and sub-classing - so where does a Component have the edge ?

Without clear, concise code as to the route you would take - how can I even guess
what your approach would be to the above problem - which is an absolute natural
for conventional use of OO with classes, sub-classes, plus throw in $10 words
like polymorphism and encapsulation.

I have been extremely blunt. I feel very angry at what I perceive to be
deception - possibly not your intent. Plus I am really alarmed that you are
sending the younger folks off ecstatically into space, and are they truly aware
of what the implications are with your 'shortcut' approach to OO ? The anger is
in the same vein as I'm bloody angry with George Dubya and his clan over the
mythical WMDs - seeing as I was one of the two Canucks who thought the U.S.
should go in - but would have preferred if Daddy Bush had done it 10 years ago.

I've stayed out of clc - largely because of your ramblings - and your dismissal
of the conventional concept of OO really pissed me off. I do not intend to go
into a tit for tat, trying to score points with a clever wordsmith. (I would
caution you however, Oscar Wilde was a clever wordsmith - and he finished up in
jail !).

If you make some valid counter-argument then I may respond - otherwise silence.

In all my ramblings above I do NOT make claim to being an expert in OO. Mentally
I'm still back there in the UK with the big red letter "L" attached to my car -
"Learner - beware". Being down in the Antipodes I'm hardly likely to knock you
over anyway. However, if I did it in the UK and you told the judge "I was
walking along the pavement and he knocked me down", the judge would award you a
handsome sum of crisp pound notes. You'd be SOL in N. America - "Your honour, I
was walking along the pavement and he knocked me down". Judge responds, "Case
dismissed. You should have been walking on the sidewalk, not the pavement !".

As the cartoons used to say, "THAT'S ALL FOLKS!"

Jimmy, Calgary AB


Jul 19 '05 #210

"goose" <ru**@webmail.co.za> wrote in message
news:ff**************************@posting.google.c om...
| WB <su*****@bossi.com> wrote in message
news:<vx***********************@news3.calgary.shaw .ca>...
| > Roedy Green wrote:
| > > Don't be silly. Look at any C code designed to run on many platforms.
| > > It is riddled with macros to pull off the feat.
| > >
| > > You can do a few simple things like parse the command line, open a
| > > flat file in a platform independent way, but not even the size of int
| > > is guaranteed unless you play some games with macros.
| >
| > Well, no you can't open a file in a generic way. Opening a file on a PC
| > is different than opening a file on a mainframe. You need a special
| > library to do this:
| >
| > #ifdef I370
| > #include <lcio.h>
| > #endif
| >
| > Which works with the SAS C compiler.
| >
| > Plus, all your variables and function names need to be 8 characters or
| > less, AND mixed case is ignored, so VarA and VARA are the same, thanks
| > to the mainframe linker.
| >
| > At least it was a few years ago. Things may have changed since then....
|
| if it does not support fopen, it is not a std-compliant C hosted
| environment.
|
| goose

That's OK, you can write for the Z80, and ignore the mainframe.

Jul 19 '05 #211
[Followups set to comp.lang.c]

WB wrote:
Who said anything about fopen? You still use fopen, but MUST have the
additional include file, which adds support for the mainframe way of
doing things.
Then Get A Better Compiler. Both C/370 and LE370 support fopen correctly,
without any need for a different include file.
The mainframe C compiler is quite happy with long var names and mixed
case.
/The/ mainframe C compiler? There are more things in heaven and earth,
Horatio, than are dreamt of in your philosophy.
I ended up writing a converter which takes the original C code, and
converts it to something that will run on the mainframe. Yuck.


Got the tee-shirt.

--
Richard Heathfield : bi****@eton.powernet.co.uk
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html
K&R answers, C books, etc: http://users.powernet.co.uk/eton
Jul 19 '05 #212
In article <bh**********@tyfon.itea.ntnu.no>,
Bent C Dalager <bc*@pvv.ntnu.no> wrote:
In article <bh**********@panix1.panix.com>, <do******@panix.com> wrote:
In article <bh**********@tyfon.itea.ntnu.no>,
Bent C Dalager <bc*@pvv.ntnu.no> wrote:
In article <bh**********@panix1.panix.com>, <do******@panix.com> wrote:

If the statements in question are the result of mere ignorance then they
lack the intentionality which is required of a lie... or am I missing
something?

You seem to be missing the fact that it was a flame. Logic and correct
use of vocabulary are non-issues in that context.


Ahhhh, *now* I understand... thanks greatly, you... you poopie-head, you!

(did I do that right?)


I certainly appreciate the effort, but you could do with some work on
the execution :-)


All right... thanks greatly, you... you poopie-head, you! Now, off to the
gallows you go!

(was that a good enough execution?)

DD

Jul 19 '05 #213
On Sat, 16 Aug 2003 10:42:52 GMT
"Harley" <de*****************@worldnet.att.net> wrote:

"goose" <ru**@webmail.co.za> wrote in message
news:ff**************************@posting.google.c om...
| WB <su*****@bossi.com> wrote in message
news:<vx***********************@news3.calgary.shaw .ca>...
| > Roedy Green wrote:
| > > Don't be silly. Look at any C code designed to run on many
| > > platforms. It is riddled with macros to pull off the feat.
| > >
| > > You can do a few simple things like parse the command line, open
| > > a flat file in a platform independent way, but not even the size
| > > of int is guaranteed unless you play some games with macros.
| >
| > Well, no you can't open a file in a generic way. Opening a file on
| > a PC is different than opening a file on a mainframe. You need a
| > special library to do this:
| >
| > #ifdef I370
| > #include <lcio.h>
| > #endif
| >
| > Which works with the SAS C compiler.
| >
| > Plus, all your variables and function names need to be 8
| > characters or less, AND mixed case is ignored, so VarA and VARA
| > are the same, thanks to the mainframe linker.
| >
| > At least it was a few years ago. Things may have changed since
| > then....
|
| if it does not support fopen, it is not a std-compliant C hosted
| environment.
|
| goose

That's OK, you can write for the Z80, and ignore the mainframe.


Do you have a fully conforming Java implementation for this mainframe?

Have you checked that there is not a conforming C implementation for
this mainframe? I just looked and GNU is working on an I370 port of gcc
so if nothing else is available that will be.
--
Mark Gordon
Jul 19 '05 #214
In article <20******************************@flash-gordon.me.uk>, Mark Gordon wrote:
On Fri, 15 Aug 2003 14:27:31 +0000 (UTC)
Marco van de Voort <ma****@stack.nl> wrote:
> I've not done C++,
Only a bit. (pretty much the amount needed for Delphi compatibility.
Basic classes and such)
> but I have done Pascal (had to use non-portable
> features)


Like?


Trapping the "STOP" key on an HP workstation. Only used this on the HP.

Separately compiled files (has that been added to the standard?) Handled
differently on the different Pascal implementations I used.


The Extended Pascal standard has a module concept. Borland Pascal (which branched
off in between the standards) has something similar with its "unit" construct.
A non-standard way to access third party libraries. Handled differently
on the different Pascal implementations I used.
Ok, you mean portable between compilers. Most dialects died out or converged
(usually towards the ANSI standards). The only ones remaining are the Borland
dialects (BP/TP, Delphi and compatibles like FPC, VP) and the ANSI standard.

While the average BP/TP and Delphi code isn't very portable (in the same way
the average VC++ code isn't very portable), the vast majority maps onto similar
C constructs; the rest can be considered platform-dependent extensions (stuff for
DLLs etc.), just like e.g. VC.

I've Delphi code (compiled using FPC) on PPC and m68k systems.
Probably a number of other things that I can't remember now.

I think Delphi diverged even further, although I helped people with
Delphi rather than using it myself.


Yes. It is more C++ than C. It includes the older dialect, but is an OOP language
on top of that.

Jul 19 '05 #215
JC
Peter E. C. Dashwood <da******@enternet.co.nz> wrote in message
news:b3************************@posting.google.com ...

I have little to add.

You say you're angry. So was I. I'm tired of your cheap shots.

I suggest the best policy for both of us is to simply ignore each
others' posts.

Please don't quote or respond to my mail in your posts, and I'll adopt
the same policy with yours.


Hollerith Cards at dawn??


Jul 19 '05 #216
On Sun, 17 Aug 2003 00:20:27 +0000 (UTC)
Marco van de Voort <ma****@stack.nl> wrote:
In article <20******************************@flash-gordon.me.uk>, Mark
Gordon wrote:
On Fri, 15 Aug 2003 14:27:31 +0000 (UTC)
Marco van de Voort <ma****@stack.nl> wrote:
> I've not done C++,

Only a bit. (pretty much the amount needed for Delphi compatibility.
Basic classes and such)

> but I have done Pascal (had to use non-portable
> features)

Like?
Trapping the "STOP" key on an HP workstation. Only used this on the
HP.

Separately compiled files (has that been added to the standard?)
Handled differently on the different Pascal implementations I used.


Extended Pascal standard has a module concept. Borland Pascal (which
branched off in between the standards) has something similar with its
"unit" construct.


That may be, but since *every* Pascal variant I used provided a
different mechanism it is safe to assume _at_most_ one of them was
standard.
A non-standard way to access third party libraries. Handled
differently on the different Pascal implementations I used.


Ok, you mean portable between compilers. Most dialects died out or
converged (usually towards the ANSI standards). The only ones remaining
are the Borland dialects (BP/TP, Delphi and compatibles like FPC, VP)
and the ANSI standard.


I know of a lot of embedded code written in Tektronix Pascal that will
have to be maintained for at least the next 15 years. The compiler may
not be under development, but the language is hardly dead. The same
applies to HP Pascal. 3 years ago the BSO Pascal code I wrote was also
still in use.
While the average BP/TP and Delphi code isn't very portable (in the same
way the average VC++ code isn't very portable), the vast majority maps
onto similar C constructs; the rest can be considered platform-dependent
extensions (stuff for DLLs etc.), just like e.g. VC.

I've Delphi code (compiled using FPC) on PPC and m68k systems.


The problem is that if Delphi does not follow the ANSI conventions for
separate compilation of modules then you can't easily include an ANSI
Pascal module in a Delphi project. You can, on the other hand, include
an ANSI C++ module in a VC++ project.
Probably a number of other things that I can't remember now.

I think the Delphi diverged even further, although I helped people
with Delphi rather than using it myself.


Yes. It is more C++ than C. It includes the older dialect, but is an
OOP language on top of that.


I know what Delphi is, I just don't know how far the ANSI Pascal
standard went, since I *never* used a Pascal compiler claiming ANSI
conformance despite using several Pascal variants.

So my experience of Pascal is one of a heavily extended language with
major incompatibilities between variants. If there are *now* ANSI Pascal
compilers for most targets and a way to easily mix Delphi and ANSI
Pascal then things have changed.
--
Mark Gordon
Jul 19 '05 #217
In article <20******************************@flash-gordon.me.uk>, Mark Gordon wrote:
On Sun, 17 Aug 2003 00:20:27 +0000 (UTC)
Marco van de Voort <ma****@stack.nl> wrote:
>
> Trapping the "STOP" key on an HP workstation. Only used this on the
> HP.
>
> Separately compiled files (has that been added to the standard?)
> Handled differently on the different Pascal implementations I used.
Extended Pascal standard has a module concept. Borland Pascal's (which
branched inbetween the standards) have something similar with "unit"
as such.


That may be, but since *every* Pascal variant I used provided a
different mechanism it is safe to assume _at_most_ one of them was
standard.


That's possible.
> A non-standard way to access third party libraries. Handled
> differently on the different Pascal implementations I used.


Ok, you mean portable between compilers. Most dialects died out or
converged (usually towards the ANSI standards). The only ones remaining
are the Borland dialects (BP/TP, Delphi and compatibles like FPC, VP)
and the ANSI standard.


I know of a lot of embedded code written in Tektronix Pascal that will
have to be maintained for at least the next 15 years. The compiler may
not be under development, but the language is hardly dead. The same
applies to HP Pascal. 3 years ago the BSO Pascal code I wrote was also
still in use.


Sure, but that is the case with all kinds of embedded C's too. Except for
x86, the only other processors I programmed for (a 68k variant and Hitachi
8051 variant) were also not fully ansi C conforming.

The fact is that the bulk of new Pascal development is either in ANSI or
Borland dialects (a loose guesstimate is a 1:100 ratio). There are deviant
dialects in circulation, especially in the embedded world, but which older
language doesn't have that?
While the average BP/TP and Delphi code isn't very portable (in the same
way the average VC++ code isn't very portable), the vast majority maps
onto similar C constructs; the rest can be considered platform-dependent
extensions (stuff for DLLs etc.), just like e.g. VC.

I've Delphi code (compiled using FPC) on PPC and m68k systems.


The problem is that if Delphi does not follow the ANSI conventions for
separate compilation of modules then you can't easily include an ANSI
Pascal module in a Delphi project.


Yes, and that makes the ansi world and the Borland world pretty much separate
communities (read nearly two languages). There are some that try to bridge the
gap like GNU Pascal (a compiler that supports both BP and ANSI, and is working on later
Borland dialects like Delphi)

But in general the picture you sketch is no problem since the Delphi
community, codebase etc. is at least a factor of 100 larger than the
ansi-pascal community.
You can, on the other hand, include
an ANSI C++ module in a VC++ project.
You can't use it in C51 :-)

But the real trouble is a C-centric view, that puts the standard
automatically in the middle. A standard is a tool to obtain some
unification, not a purpose in itself. Sometimes it works (as in the C case),
sometimes it doesn't succeed in unifying the separate dialects (as in
Pascal)

The situation you mention above is simply no problem. If one uses Delphi,
on can access the largest codebase by far (which is Borland dialects).
> Probably a number of other things that I can't remember now.

> I think the Delphi diverged even further, although I helped people
> with Delphi rather than using it myself.


Yes. It is more C++ than C. It includes the older dialect, but is an
OOP language on top of that.


I know what Delphi is, I just don't know how far the ANSI Pascal
standard went, since I *never* used a Pascal compiler claiming ANSI
conformance despite using several Pascal variants.


There are two standards and one draft that was never turned into a standard.

The draft being OOP orientated (Object Pascal) in nature and sponsored by
Apple IIRC, was used (but not strictly followed) by Borland too, to create Delphi.
So my experience of Pascal is one of a heavily extended language with
major incompatibilities between variants.
I used three C/C++ compilers: VC++, Keil C51 and gcc on FreeBSD. All codebases were
pretty uncompilable on the compilers they were not written for, because of extensions,
libraries etc.

IOW I don't see a fundamental difference. True, the number of language
variants and the average number of extensions etc. are a bit lower in the C
case, but that is more a matter of the language's popularity.

With both languages carefully crafted codebases are somewhat portable, but
in both cases the subset is too limiting. (ansi C+posix is already a bit
better, but that is more than a mere language standard, and it still sucks
on Windows)
If there are *now* ANSI Pascal compilers for most targets and a way to
easily mix Delphi and ANSI Pascal then things have changed.


Delphi and Ansi Pascal are not really mixable, except on the most minimal
level. Pretty much like VC++ and Keil C51.
Jul 19 '05 #218
Please STOP crossposting to comp.lang.pascal.ansi-iso. This thread
has nothing to do with that forum.

Thank you.
Jul 19 '05 #219
In article <20******************************@flash-gordon.me.uk>, Mark Gordon wrote:
On Sun, 17 Aug 2003 14:03:37 +0000 (UTC)
Marco van de Voort <ma****@stack.nl> wrote:
> compiler may not be under development, but the language is hardly
> dead. The same applies to HP Pascal. 3 years ago the BSO Pascal code
> I wrote was also still in use.


Sure, but that is the case with all kinds of embedded C's too.


The HP Pascal I mentioned was *not* an embedded Pascal.


Neither is VC++, or BC.
Anyway, as far as I'm aware all C implementations support the same
mechanism for separate compilation of modules so you don't have the same
problem.
Hmm. Prototypes etc.? Afaik K&R had no modules or prototypes at all; the
separate compilation was just externals, and parameters didn't even have
to match. (IOW everything happened at the linker level, not the language level.)
Except
for x86, the only other processors I programmed for ( a 68k variant
and Hitachi 8051 variant) were also not fully ansi C conforming.


The Texas Instruments implementation of C for the TMS320C1x/2X/5X was a
fully conforming implementation and even came with a copy of K&R2 as
part of the documentation set. This was at the same time I was dealing
with all those different Pascals...


K&R is no standard. ANSI is.
The fact is that the bulk of new pascal development is either in ansi
or borland dialects. (a loose guestimate is ratio 1:100). There are
deviate dialects in circulation, specially in the embedded world, but
which older language doesn't have that?
Pascal is IMHO worse because the original definition did not include
support for separate compilation of modules. For this reason (and
probably others) K&R C is more likely to compile on a modern compiler
that supports ANSI C than Pascal written for any of the compilers I
mentioned.
That's not true afaik (but I'm no expert, just from the old BSD days). Most
K&R code mismatches or omits parameters between declaration and call.
This was only fixed with formal prototypes in the ANSI standard.
This is irrespective of whether it was embedded or
non-embedded C or Pascal.
Embedded versions are often simplified, and therefore often based
on older versions. Judging the state of Pascal by those is a bit odd.
>> VC.
>> I've Delphi code (compiled using FPC) on PPC and m68k systems.
>
> The problem is that if Delphi does not follow the ANSI conventions
> for seperate compilation of modules then you can't easily include an
> ANSI Pascal module in a Delphi project.

Yes, and that makes the ansi world and the Borland world pretty much
separate communities (read nearly two languages).


C & C++ are two distinct languages, yet you can link them relatively
easily...


C++ nearly includes C, even though they are formally separate languages.

Link compatibility is a compiler/linker thing anyway, though, and doesn't have
much to do with the language. At least if the language wants to remain portable :-)
You probably mean that C++ and C FROM THE SAME VENDOR reasonably link well.

OTOH that is not a problem. Most Pascal compilers also link to C. They
probably can also link to each other, only on a deeper level (directly
passing file handles instead of file descriptors etc)
BP an ansi, and is working on later borland dialects like Delphi)

But in general the picture you sketch is no problem since the Delphi
community, codebase etc is at least a magnitude 100 larger than the
ansi-pascal community.


So ANSI Pascal is largely irrelevant to the Pascal community ;-)


In practice yes. But the Borland group, while it far outnumbers ANSI, is
x86 (and often even x86/win32) centric. So if you go outside x86, you'll
encounter quite a lot of ansi.
> an ANSI C++ module in a VC++ project.


You can't use it in C51 :-)


I bet you can use a lot of conforming C code not written for the 8051 on
an 8051, since I bet there is a conforming implementation.


Maybe, but that would really strain those 256 bytes of memory.
But the real trouble is a C centric view, that puts the standard
automatically in the middle. A standard is a tool to obtain some
unification, not a purpose in itself. Sometimes it works (as in the C
case), sometimes it doesn't succeed in unifying the separate dialects
(as in Pascal)


I don't know about anyone else here, but I use the standard as a way of
ensuring that my code will run on multiple platforms with minimal
difficulty (I do have to use platform specific extensions in limited
areas) and not as an end in itself.


Yes. It is of very limited help in verifying that. But in practice,
the standard is often not enough to build an average application.
The situation you mention above is simply no problem. If one uses
Delphi, on can access the largest codebase by far (which is Borland
dialects).


In C you can access most of the C code base whether it was originally
targeted for embedded or hosted environments. Obviously on an embedded
environment without a file system you can't use file oriented libraries,
but you could use some MD5 code written for a PC.


Quite a lot of code isn't very conforming (I can remember having to fix
nearly every program when I got an Alpha machine)

But except for that (and the original K&R code), you are somewhat right;
the problem is more that a fully standard C program is often trivial, and
no real app.

Something like that is nice to show to students, but not something for IRL.

But that doesn't mean (and I don't mean to imply) that the standard is
useless, on the contrary, I think the C situation *is* better.
I do think, however, that the magnitude of the differences (especially when
related to standards) is severely overrated.
> I know what Delphi is, I just don't know how far the ANSI Pascal
> standard went, since I *never* used a Pascal compiler claiming ANSI
> conformance despite using several Pascal variants.


There are two standards and one draft that was never turned into a
standard.


So do the majority of modern Pascal implementations support ANSI
standard Pascal?


Either that or Borland. Borland is proprietary, but so dominant that smaller
vendors (TMT,VP) follow it, and it also has following in the open source
community.
I used three C/C++ compilers: VC++, Keil C51 and gcc on FreeBSD. All
codebases were pretty uncompilable on the compilers they were not
written for, because of extensions, libraries etc.


Both VC++ and gcc can compile ANSI standard C (I don't know Keil C51)
so as long as you keep your implementation specific code isolated


Sure, but that was not what I said. I said the compilers can't compile
an average program written for the other.
(sometime I always try to do) then the bulk of your code will compile
and run correctly on both.
And be fairly trivial.
IOW I don't see a fundamental difference. True, the number of language
variants, and the average number of extensions etc are a bit less in
the C case, but that is more the language popularity.


One major difference, almost all C compilers produced for a long time
have been able to compile ANSI standard C, I don't think the same can be
said about Pascal.


No, and probably never will. I don't dispute that. I'm just saying it is
overrated.
With both languages carefully crafted codebases are somewhat portable,
but in both cases the subset is too limiting.


My experience was that *no* Pascal module would compile on any compiler
other than the one it was written for, because it would fail as soon as you
reached the line indicating it was a module rather than a program, or as
soon as it referenced any other module, whichever came first.


Well, that is not my experience. The Borland versions (generations is a
better word) are backward compatible to 1985, and ANSI Pascal is pretty
close to compatibility with even J&W Pascal.
(ansi C+posix is already
a bit better, but that is more than a mere language standard, and it
still sucks on Windows)


Windows has Posix compatibility layers, depending on what you want.


Sure. But that is not ANSI, is it?
also has GTK available if you want a common graphics handling code
between Windows and Unix.


(GTK on windows sucks, but)

GTK is also not exclusive to C. Actually the RAD of the pascal compiler I use
uses GTK on Unix platforms (but native winapi on win32)

Delphi does something similar, but uses QT.
> If there are *now* ANSI Pascal compilers for most targets and a way
> to easily mix Delphi and ANSI Pascal then things have changed.


Delphi and Ansi Pascal are not really mixable, except on the most
minimal level. Pretty much like VC++ and Keil C51.


As I say, I don't know Keil, but it is possible to have code that can
be compiled by both VC++ and gcc. Look at Berkeley DB for one example.


I can also craft Pascal code accepted by (nearly?) every compiler, so what's
the difference?

(well, there actually is. The string handling of that code
will be clumsy. char ident[x] based like C, and that is not how you use
strings usually under Pascal)

Jul 19 '05 #220
In article <20******************************@flash-gordon.me.uk>,
Mark Gordon <sp******@flash-gordon.me.uk> wrote:
On Tue, 12 Aug 2003 07:57:18 -0400
Joe Zitzelberger <jo**************@nospam.com> wrote:
In article <ff**************************@posting.google.com >,
ru**@webmail.co.za (goose) wrote:
and yet creating a std C program would not only get you that, it
would also get you a fairly snappy application *and* leave you open
in the future to be able to support those people who have machines
that are not capable of running java (certain designer
palmtop-types) to *also* interface with the campus machines.

java doesn't *buy* you anything extra in terms of portability.
The only relatively *portable* way I can think of is when writing
applets for web-pages (note: /relatively/). as long as the browser
has a java runtime environment, of course.

Java does have its advantages. Portability isn't one of them.
???Huh???

Which sort of Java isn't portable? I get binary compatibility on all
desktop/server/enterprise machines and many embedded as well. If that
fails (it never has) I get source level compatibility (the compiler is
written in java after all...) across all the platforms.

Now it might not make any sense for me to try and open a dialog box on
a stoplight, and the stoplight manufacturer might well leave those
libraries out, but that hardly makes it non-portable.


What if the JVM has not been ported to the processor you are using? Can
you find a JVM for any of the following processors?

Z80
6502
8051
TMS320Cxx

just to name 4 families of processor off the top of my head all of which
are used in *current* projects.

Even if you ported the JVM it would run like a snail on tranquilisers.


I used the Pascal byte code system, precursor and work-alike to modern
Java, on the 6502 and Z80. And yes, it did run like a snail on
tranquilisers. For that matter, so did highly tuned assembler and
compiled C.

Anyone who wants to license a reference implementation from Sun can do
so for $150,000, IIRC, and write a very thin hardware layer to get Java
on their chip/OS.

There are also clean room clones available...
Also we had to upgrade from SCO 5.0.5 to SCO 5.0.6, something I'm told
was painful, on several customer sites in order to be able to run a
specific Java application. Java 1.4 is not available for earlier version
of SCO and was not available at that time for AIX, which another of our
customers uses.


I've not played with SCO for a while (1997 to be exact), but a minor
release ought not prevent you from running Java 1.4? What happened when
you installed it on 5.0.5?

All of this talks of applications, not applets which were a cute, but
useless toy.


As have the people talking about processors where the JVM is not
available.


A brief search of Google shows at least one JVM product targeted at the
6502, specifically C64, from mts.com. And a JIT compiler/runtime
targeted at Z80/6502 from Stanford. Also a hardware boost for the
6502/65816 family from Fawcett.

That is the first results page. I'm not sure what any of the statuses
are.

Have you ever actually tried porting an application to another
hardware/os using std C? It is not just a recompile, there are plenty
of issues the original programmers must have planned for -- and they
usually don't.


One program I wrote in C for an embedded system was debugged by me on a
Silicon Graphics workstation by running the code natively (not in an
emulator or simulator). All I had to do was replace the two functions
that talked to the hardware with one function to read test data and
another to interface to a graphics library and display the results.
The debugged code then ran perfectly on the embedded system without any
further changes.

Some companies actually hold code reviews to ensure that code is well
written, and obstreperous reviewers like me *do* reject code and insist
on it being rewritten if the job has not been done properly.


I think that is a great approach, but I also think it is a rarity to
have such reviews held. In my experience, "portability" usually means
running on all versions of Windows or, at a stretch, all versions of
Windows and some Linux. Something as trivial as endianness (because the
whole world really runs on Intel, doesn't it?) is almost universally not
considered.
Jul 19 '05 #221
On Sun, 17 Aug 2003 22:53:38 GMT, "James J. Gavan" <jj*****@shaw.ca>
wrote:
If we don't hear from each language, may we assume that particular language
doesn't have all the necessary tools ???? <G>


No, but you could assume not everyone wants to play your game.

--
Al Balmer
Balmer Consulting
re************************@att.net
Jul 19 '05 #222


Alan Balmer wrote:
On Sun, 17 Aug 2003 22:53:38 GMT, "James J. Gavan" <jj*****@shaw.ca>
wrote:
If we don't hear from each language, may we assume that particular language
doesn't have all the necessary tools ???? <G>


No, but you could assume not everyone wants to play your game.


Why assume a game was being played ? It was a legit question, what techniques do
you use within your language to handle such a problem - I'm not 'voting' for one
outcome - a genuine enquiry as to how OO programmers would handle such, from the
features they have in their respective languages.

Please don't read into the message an alternative meaning. (If I had a hidden
agenda, be damned sure I would have asked the question in a very different
fashion).

Jimmy
Jul 19 '05 #223
On Mon, 18 Aug 2003 08:23:27 +0000 (UTC)
Marco van de Voort <ma****@stack.nl> wrote:
In article <20******************************@flash-gordon.me.uk>, Mark
Gordon wrote:
On Sun, 17 Aug 2003 14:03:37 +0000 (UTC)
Marco van de Voort <ma****@stack.nl> wrote:
> compiler may not be under development, but the language is hardly
> dead. The same applies to HP Pascal. 3 years ago the BSO Pascal
> code I wrote was also still in use.

Sure, but that is the case with all kinds of embedded C's too.


The HP Pascal I mentioned was *not* an embedded Pascal.


Neither is VC++, or BC.


Both VC++ and BC (if you mean what I think you mean) *can* compile ANSI
C, so they support my point.
Anyway, as far as I'm aware all C implementations support the same
mechanism for separate compilation of modules so you don't have the
same problem.


Hmm. Prototypes etc? Afaik K&R had no modules or prototypes at all,
the separate compilation was just externals, and parameters didn't
even have to match. (iow everything happened on linker, not language
level)


K&R did not have prototypes (those arrived with C89), but the language
*did* allow for the separate compilation of modules, even having an
extern keyword for specifying that an object was external; that support
was then standardised by C89. It just does not specify how you invoke
*any* of the tools.

The original definition of Pascal did *not* specify any support for separate
modules and provided no mechanism for specifying that an object was
external, so any support for separate compilation of modules was an
extension.
Except
for x86, the only other processors I programmed for ( a 68k variant
and Hitachi 8051 variant) were also not fully ansi C conforming.


The Texas Instruments implementation of C for the TMS320C1x/2X/5X
was a fully conforming implementation and even came with a copy of
K&R2 as part of the documentation set. This was at the same time I
was dealing with all those different Pascals...


K&R is no standard. ANSI is.


The second edition (K&R2) is for ANSI standard C and is one of the most
commonly used and recommended reference books for it.
The fact is that the bulk of new pascal development is either in
ansi or borland dialects. (a loose guesstimate is ratio 1:100).
There are deviate dialects in circulation, specially in the
embedded world, but
which older language doesn't have that?


Pascal is IMHO worse because the original definition did not include
support for separate compilation of modules. For this reason (and
probably others) K&R C is more likely to compile on a modern
compiler that supports ANSI C than Pascal written for any of the
compilers I mentioned.


That's not true afaik (but I'm no expert, just from the old BSD days).
Most K&R code mismatches or omits parameters between declaration and
compilation. This was only fixed with the formal prototypes in some
later standard.


Prototypes were added with the ansi standard, but
extern int foo();
extern int bar;
were valid ways of declaring function foo() and variable bar without
defining them, thus allowing them to be defined in another module. The
original definition of Pascal had no such feature.
This is irrespective of whether it was embedded or
non-embedded C or Pascal.


Embedded versions are often simplified, and therefore often based
on older versions. Judging the state of Pascal by those is a bit odd.


All the embedded C compilers I have used implemented the full
specification for a free-standing implementation. All of the Pascals,
including those that were *not* targeting embedded systems, used
non-standard, non-portable mechanisms for allowing access to symbols
defined in separately compiled modules.
>> VC.
>> I've Delphi code (compiled using FPC) on PPC and m68k systems.
>
> The problem is that if Delphi does not follow the ANSI
> conventions for seperate compilation of modules then you can't
> easily include an ANSI Pascal module in a Delphi project.
Yes, and that makes the ansi world and the Borland world pretty
much separate communities (read nearly two languages).


C & C++ are two distinct languages, yet you can link them relatively
easily...


C++ nearly includes C, even though they are formally separate
languages.


No, C++ will report diagnostics and probably abort compilation when
trying to compile a *lot* of ANSI standard C. For example, if the result
of a malloc call is not cast C++ will reject it where as not casting it
is the recommended approach in C.
Link compatibility is a compiler/linker thing anyway though, and has not
much to do with the language. At least if the language wants to remain
portable :-) You probably mean that C++ and C FROM THE SAME VENDOR
reasonably link well.
The C++ standard explicitly defines some of how the linking of C and C++
is to be handled, such as the 'extern "C"' stuff you see in a lot of
headers. The C standard cooperates to the extent of guaranteeing a way
of identifying at compile time whether a file is being compiled as C or
C++ to allow you to share header files.
OTOH that is not a problem. Most Pascal compilers also link to C. They
probably also can link to each other, only on a deeper level (directly
passing file handles instead of file descriptors etc)
However, there is no way defined to specify that external objects are
external C objects, unlike with C++. There is also no way to include a C
header file from a Pascal source file.
BP an ansi, and is working on later borland dialects like Delphi)

But in general the picture you sketch is no problem since the
Delphi community, codebase etc is at least a magnitude 100 larger
than the ansi-pascal community.


So ANSI Pascal is largely irrelevant to the Pascal community ;-)


In practice yes. But the Borland group, while it far outnumbers ANSI,
is x86 (and often even x86/win32) centric. So if you go outside x86,
you'll encounter quite a lot of ansi.


Whatever processor you will find a lot of ANSI standard C, even for the
x86/DOS/Win world.
> an ANSI C++ module in a VC++ project.

You can't use it in C51 :_)


I bet you can use a lot of conforming C code not written for the
8051 on an 8051, since I bet there is a conforming implementation.


Maybe, but that would really strain those 256 bytes of memory.


I just checked, and version 7 of C51 claims ANSI conformance.
But the real trouble is a C centric view, that puts the standard
automatically in the middle. A standard is a tool to obtain some
unification, not a purpose in itself. Sometimes it works (as in the
C case), sometimes it doesn't succeed in unifying the separate
dialects (as in Pascal)


I don't know about anyone else here, but I use the standard as a way
of ensuring that my code will run on multiple platforms with minimal
difficulty (I do have to use platform specific extensions in limited
areas) and not as an end in itself.


Yes. It is of very limited help in verifying that. But in
practice, the standard is often not enough to build an average
application.


I've written large C applications for real world problems as part of my
job where only a small amount of isolated code was implementation
dependant.

As part of my current job I work on an application with a few hundred
thousand lines of code which are slowly being migrated from K&R C to
ANSI standard C whilst also being maintained and further developments
done. The bulk of the implementation specifics are in a separate library
allowing *all* of the business logic to be written in either K&R C (for
the old stuff) or standard C89 for the new stuff. So that is probably a
few hundred thousand lines of C that will eventually all be written to
C89 and a *much* smaller amount of implementation specific code which
is steadily shrinking as we do a progressive rewrite.
The situation you mention above is simply no problem. If one uses
Delphi, on can access the largest codebase by far (which is Borland
dialects).


In C you can access most of the C code base whether it was
originally targeted for embedded or hosted environments. Obviously
on an embedded environment without a file system you can't use file
oriented libraries, but you could use some MD5 code written for a
PC.


Quite a lot of code isn't very conforming (I can remember having to
fix nearly every program when I got an Alpha machine)

But except for that (and those original K&R code), you are somewhat
right, the problem is more that a fully standard C program is often
trivial, and no real app.

Something like that is nice to show to students, but not something for
IRL.


As I say, I work on large *real* applications using standard C for 90%
(or more) of my work. Therefore it is useful for *real* work on *large*
projects.
But that doesn't mean (and I don't mean to imply) that the standard is
useless, on the contrary, I think the C situation *is* better.
I do think however the magnitude of the differences (specially when
related to standards) is severely overrated.
> I know what Delphi is, I just don't know how far the ANSI Pascal
> standard went, since I *never* used a Pascal compiler claiming
> ANSI conformance despite using several Pascal variants.

There are two standards and one draft that was never turned into a
standard.


So do the majority of modern Pascal implementations support ANSI
standard Pascal?


Either that or Borland. Borland is proprietary, but so dominant that
smaller vendors (TMT,VP) follow it, and it also has following in the
open source community.


So, you still can't share code with embedded systems. I can and *have*
done so for *real* work on *complex* applications.
I used three C/C++ compilers. VC++, Keil C51 and gcc on FreeBSD.
All codebases were pretty uncompilable on the compilers they were
not written for, because of extensions, libraries etc.


Both VC++ and gcc can compile ANSI standard C (I don't know Keil
C51) so as long as you keep your implementation specific code
isolated


Sure, but that was not what I said. I said the compilers can't compile
an average program from the other.


Berkeley DB (the one used to drive the Amazon web site, amongst others)
is built using gcc for Linux and VC++ for Windows. Maybe that is a well
written program rather than an average program.

The application I work on, several hundred thousand lines of code, used
to be built for Windows using VC++, for Linux using gcc and for HPUX,
SCO, AIX and Solaris using the standard compilers from the relevant OSs.
This is using the *same* source files in all cases. It is also software
that my company earns millions from annually.

I changed this to standardising on gcc because I wanted to and gcc is
available for all the targets we want to support.
(sometime I always try to do) then the bulk of your code will
compile and run correctly on both.


And be fairly trivial.


Do you think an applications several hundred thousand lines long is
trivial? Or Berkeley DB which is used to power a lot of major web sites?
Or software for performing real time analysis of video data which I
debugged by running natively on a workstation before running
*unchanged* in an embedded system?
IOW I don't see a fundamental difference. True, the number of
language variants, and the average number of extensions etc are a
bit less in the C case, but that is more the language popularity.


One major difference, almost all C compilers produced for a long
time have been able to compile ANSI standard C, I don't think the
same can be said about Pascal.


No, and probably never will. I don't dispute that. I'm just saying it
is overrated.


Well, standard C is very useful for us C developers for writing *major*
applications.

If, on the other hand, I want to quickly knock up a toy GUI application
I'll reach for VC++, Delphi or similar and sod portability.
With both languages carefully crafted codebases are somewhat
portable, but in both cases the subset is too limiting.


My experience was that *no* Pascal module would compile on any compiler
other than the one it was written on because it would fail as soon
as you reached the line indicating it was a module rather than a
program or as soon as it referenced any other module, whichever came
first.


Well, that is not my experience. The borland versions (generations is
a better word) are backward compatible till 1985, and ansi pascal is
pretty close to compatibility with even J&W pascal.


I know that Borland Pascal is vastly different from any other Pascal
I've used (I forgot to mention having used Turbo Pascal 5.5 and Borland
Pascal).

So you have one manufacturer providing backwards compatibility with its
own products, others trying to hit this moving target and some compilers
that aren't quite ansi compliant. Whereas for C we have almost every
compiler written since the early 90s supporting standard C.
(ansi C+posix is already
a bit better, but that is more than a mere language standard, and
it still sucks on Windows)


Windows has Posix compatibility layers, depending on what you want.


Sure. But that is not ansi, is it?


So you keep your hooks to it isolated in one module. Then you have a few
percent of your code to rewrite and the bulk of it standard. I know, I
*do* this on *large* projects.
also has GTK available if you want a common graphics handling code
between Windows and Unix.


(GTK on windows sucks, but)

GTK is also not exclusive to C. Actually the RAD of the pascal
compiler I use uses GTK on Unix platforms (but native winapi on win32)

Delphi does something similar, but uses QT.


I didn't say that you could not interface to cross-platform graphics
libraries from Pascal. I was pointing out that the libraries not being
part of C was not a major problem.
> If there are *now* ANSI Pascal compilers for most targets and a
> way to easily mix Delphi and ANSI Pascal then things have
> changed.
Delphi and Ansi Pascal are not really mixable, except on the most
minimal level. Pretty much like VC++ and Keil C51.


As I say, I don't know Keil, but it is possible to have code that
can be compiled by both VC++ and gcc. Look at Berkeley DB for one
example.


I can also craft Pascal code accepted by (nearly?) every compiler, so
what's the difference?

(well, there actually is. The string handling of that code
will be clumsy. char ident[x] based like C, and that is not how you
use strings usually under Pascal)


So you have to use a sub-set of Pascal where as I have the whole of C
available to me.

I also have implementation specific extensions for the small percentage
of the code (normally under 10% on the applications I've dealt with)
that requires it.
--
Mark Gordon
Jul 19 '05 #224
On Mon, 18 Aug 2003 10:52:08 -0400
Joe Zitzelberger <jo**************@nospam.com> wrote:
In article <20******************************@flash-gordon.me.uk>,
Mark Gordon <sp******@flash-gordon.me.uk> wrote:
On Tue, 12 Aug 2003 07:57:18 -0400
Joe Zitzelberger <jo**************@nospam.com> wrote:
In article <ff**************************@posting.google.com >,
ru**@webmail.co.za (goose) wrote:

> and yet creating a std C program would not only get you that, it
> would also get you a fairly snappy application *and* leave you
> open in the future to be able to support those people who have
> machines that are not capable of running java (certain designer
> palmtop-types) to *also* interface with the campus machines.
>
> java doesn't *buy* you anything extra in terms of portability.
> The only relatively *portable* way I can think of is when
> writing applets for web-pages (note: /relatively/). as long as
> the browser has a java runtime environment, of course.
>
> Java does have its advantages. Portability isn't one of them.

???Huh???

Which sort of Java isn't portable? I get binary compatibility on
all desktop/server/enterprise machines and many embedded as well.
If that fails (it never has) I get source level compatibilily (the
compiler is written in java after all...) across all the
platforms.

Now it might not make any sense for me to try and open a dialog
box on a stoplight, and the stoplight manufacturer might well
leave those libraries out, but that hardly makes it non-portable.
What if the JVM has not been ported to the processor you are using?
Can you find a JVM for any of the following processors?

Z80
6502
8051
TMS320Cxx

just to name 4 families of processor off the top of my head all of
which are used in *current* projects.

Even if you ported the JVM it would run like a snail on
tranquilisers.


I used the Pascal byte code system, precursor and work-alike to modern
Java, on the 6502 and Z80. And yes, it did run like a snail on
tranquilisers. For that matter, so did highly tuned assembler and
compiled C.

Anyone who wants to license a reference implementation from Sun can do
so for $150,000, IIRC, and write a very thin hardware layer to get
Java on their chip/os.


That is a lot of money to spend compared to the cost of a C
implementation and it probably still won't help you for all the targets
I mentioned.
There are also clean room clones available...
Not for all the targets I mentioned.
Also we had to upgrade from SCO 5.0.5 to SCO 5.0.6, something I'm
told was painful, on several customer sites in order to be able to
run a specific Java application. Java 1.4 is not available for
earlier version of SCO and was not available at that time for AIX,
which another of our customers uses.


I've not played with SCO for a while (1997 to be exact), but a minor
release ought not prevent you from running Java 1.4? What happened
when you installed it on 5.0.5?


I can't remember the specifics, but the JVM does not run.
All of this talks of applications, not applets which were a cute,
but useless toy.


As have the people talking about processors where the JVM is not
available.


A brief search of Google shows at least one JVM product targeted at
the 6502, specifically C64, from mts.com. And a JIT compiler/runtime
targeted at Z80/6502 from Stanford. Also a hardware boost for the
6502/65816 family from Fawcett.


So you found two of the 4 targets I mentioned. I found one that targets
the 8051. However, I can't see one for the TMS320C25 and I did look.
That is the first results page. I'm not sure what any of the statuses
are.
Taking 20K for the JVM when you have a limited memory space could be a
problem. It would not run on the systems where we only had 8K of ROM and
2K of RAM.
Have you ever actually tried porting an application to another
hardware/os using std C? It is not just a recompile, there are
plenty of issues the original programmers must have planned for --
and they usually don't.


One program I wrote in C for an embedded system was debugged by me
on a Silicon Graphics workstation by running the code natively (not
in an emulator or simulator). All I had to do was replace the two
functions that talked to the hardware with one function to read test
data and another to interface to a graphics library and display the
results. The debugged code then ran perfectly on the embedded system
without any further changes.

Some companies actually hold code reviews to ensure that code is
well written, and obstreperous bastards like me *do* reject code and
insist on it being rewritten if the job has not been done properly.


I think that is a great approach, but I also think it is a rarity to
have such reviews held.


Real reviews are mandated on projects for the military and for safety
critical projects.
In my experience, "portability" usually means
running on all versions of Windows, or, at a stretch, all versions of
Windows and some Linux.
As I've posted else where, my experience includes debugging and running
code natively on a Silicon Graphics Workstation then running it
*unchanged* on an embedded system. There was about 20 lines of C and 40
lines of assembler for the interfacing on the embedded system and a few
thousand lines of C to do the real work. The embedded processor was a
TMS320C25 which has a "byte" size of 16 bits.
Something as trivial as endianness (because
the whole world really runs on Intel, doesn't it) is almost
universally not considered.


The application I am currently working on is several hundred thousand
lines of code and it runs on both big and little endian machines and
both Windows and a variety of Unix derivatives.

For the binary files the original author just chose one endianness and
wrote some code to explicitly read the files bytewise in that
endianness.
--
Mark Gordon
Jul 19 '05 #225
In article <20******************************@flash-gordon.me.uk>, Mark Gordon wrote:

[snipping just a few points]
Hmm. Prototypes etc? Afaik K&R had no modules or prototypes at all,
the separate compilation was just externals, and parameters didn't
even have to match. (iow everything happened on linker, not language
level)
K&R did not have prototypes (those arrived with C89), but the language
*did* allow for the separate compilation of modules, even having an
extern keyword for specifying that an object was external; that support
was then standardised by C89. It just does not specify how you invoke
*any* of the tools.


If that is separate compilation of _modules_, then you are right. IMHO it
isn't. Since there is no interaction between the modules whatsoever on
compiler level, only a simple linker trick, and a modifier.

One is trying to get several programs into one binary, not making one
program consisting out of several modules.

But maybe my concept of a module is a bit different than yours. Blame it
on my Modula2 years.

Even with prototypes I've a bit of a problem with it, but maybe I'm not
understanding prototypes right. At least there is some interaction.
The original definition of Pascal did *not* specify any support separate
modules and provided no mechanism for specifying that an object was
external, so any support for separate compilation of modules was an
extension.
It still doesn't I think. "external", "mangling", "calling conventions" etc
are all beyond the scope of the language, as being compiler specific.
> The Texas Instruments implementation of C for the TMS320C1x/2X/5X
> was a fully conforming implementation and even came with a copy of
> K&R2 as part of the documentation set. This was at the same time I
> was dealing with all those different Pascals...


K&R is no standard. ANSI is.


The second edition (K&R2) is for ANSI standard C and is one of the most
commonly used and recommended reference books for it.


IIRC that (K&R2) is quite late isn't it? 1989 or so?

The extended standard dates from only a year later (1990), which is already
years after Borland introduced units. (and TP/BP was actually in its zenith
during those years, 1987-1992)

So the only thing that remains is that you used non standards pascals,
while you used standard C's ?

That sounds like a cheap hack from an advocate, but it is actually the usual
pattern in pascal vs C discussions.

C was simply lucky that it had a foundation in Unix, there is not much more
to say.
> compiler that supports ANSI C than Pascal written for any of the
> compilers I mentioned.


That's not true afaik (but I'm no expert, just from the old BSD days).
Most K&R code mismatches or omits parameters between declaration and
compilation. This was only fixed with the formal prototypes in some
later standard.


Prototypes were added with the ansi standard, but
extern int foo();
extern int bar;
were valid ways of declaring function foo() and variable bar without


Declaring them, as in the other module could check them?
> This is irrespective of whether it was embedded or
> non-embedded C or Pascal.


Embedded versions are often simplified, and therefore often based
on older versions. Judging the state of Pascal by those is a bit odd.


All the embedded C compilers I have used implemented the full
specification for a free-standing implementation. All of the Pascals,
including those that were *not* targeting embedded systems, used
non-standard, non-portable mechanisms for allowing access to symbols
defined in separately compiled modules.


Possible. Yet I had the same experience.

I didn't choose my compilers well (VC++, gcc (with BSD libc code) and C51),
and couldn't get the code to work universally.

Is it actually the standard that is the problem, or your choice in
compilers and code? Maybe just because Pascal got its finest hour a bit
early, and you had legacy code bases from pre standard times to maintain?

And why do you lay so much emphasis on that one item, that is severely
limited (in all implementations) and broken (in original K&R) actually?
> C & C++ are two distinct languages, yet you can link them relatively
> easily...


C++ nearly includes C, even though they are formally separate
languages.


No, C++ will report diagnostics and probably abort compilation when
trying to compile a *lot* of ANSI standard C. For example, if the result
of a malloc call is not cast C++ will reject it where as not casting it
is the recommended approach in C.


That's why I said "nearly".
portable :-) You probably mean that C++ and C FROM THE SAME VENDOR
reasonably link well.


The C++ standard explicitly defines some of how the linking of C and C++
is to be handled, such as the 'extern "C"' stuff you see in a lot of
headers. The C standard cooperates to the extent of guaranteeing a way
of identifying at compile time whether a file is being compiled as C or
C++ to allow you to share header files.


Ah. So it works if I take a totally isolated C compiler? How much name mangling
is defined (and guaranteed) by the standard?

You only signal something with extern C. It's up to the implementation
to do something with it *and* both *implementations* actually have to match.
OTOH that is not a problem. Most Pascal compilers also link to C. They
probably also can link to each other, only on a deeper level (directly
passing file handles instead of file descriptors etc)


However, there is no way defined to specify that external objects are
external C objects, unlike with C++. There is also no way to include a C
header file from a Pascal source file.


(Is there a way to import a Pascal module into C then?)

That's because Pascal doesn't include nearly the entire C language as I
already said.

Are you btw sure it is C++ compatibility with C, or simply C++ and C being
usually the same compiler or variants of the same compiler that makes this
work?

IOW, does it work if I choose a C++ compiler "B" that has different
structure aligning rules than C compiler "A"? And will their runtime be
compatible?

Anyway so one has to convert the C headers into the Pascal syntax, and
either use non standard (but pretty common) extensions like modifiers that
flag it like a certain "C" compiler (again e.g. alignment, but also base
type sizes, no), or putting it in separate modules and compile them with
special parameters. There are some other ways too (like including a "C" module
that does this automatically). Frankly this is not really the problem.

Yes, that wouldn't be much of a problem if C was clean and parsable, so that
one could convert headers automatically. However it isn't. It has a macro
processor in which a game of tetris was implemented. Nuff said.

If you preprocess to kill the macro's, you effectively don't have a header
anymore. I

(Btw, you have IMHO hit a major problem with Unix here.

You can't determine the API without having a full C compiler conforming to
the exact implementation of the rest of the system. No wonder why configure
is such ugly hack. Microsoft tries to work on this by imposing strict
guidelines, and specifying a third in IDL. On *BSD and Linux it is near
impossible to process/convert headers. I looked into Solaris headers today,
and my first impression was that they were on the same level as *BSD)
In practice yes. But the Borland group, while it far outnumbers ANSI,
is x86 (and often even x86/win32) centric. So if you go outside x86,
you'll encounter quite a lot of ansi.


Whatever processor you will find a lot of ANSI standard C, even for the
x86/DOS/Win world.


I was speaking of Ansi Pascal above. And I was a typical Borland Dos user
before migrating to Unix in the early/mid nineties. Believe me, the
bulk was Borland/Microsoft specific code, and it still is.

Even if the language is close enough to ansi C(++), the amount of extensions
and libs used in the avg code is simply flabbergasting.
> I bet you can use a lot of conforming C code not written for the
> 8051 on an 8051, since I bet there is a conforming implementation.


Maybe, but that would really strain those 256 bytes of memory.


I just checked, and version 7 of C51 claims ANSI conformance.


Nice, you have an URL? I can't check what version I have right now.

And is it for the actual 8051, or for compatibles that are a lot "richer"?

Yes. It is of very limited help in verifying that. But in
practice, the standard is often not enough to build an average
application.


I've written large C applications for real world problems as part of my
job where only a small amount of isolated code was implementation
dependant.


Ah sure, but _new_ code is never the problem. Pick the formal standard (or
in Pascal sometimes the de-facto standard Borland), and gone is the problem.

Post in a Pascal group, and anybody will tell you exactly the same large
story for Pascal. And they all have codebases they port without problems.

But somehow, each time if I'm called to work on a codebase, be it C(++),
Java (!) or Pascal, and the project is non trivial, it is a mess. Who knows,
maybe it is Karma, and I was Charles Babbage in a former life, and am being
punished for never finishing the analytical machine.

I'm actually pretty deeply involved in porting a Pascal compiler (Delphi
dialect, so pretty advanced) to as many platforms as possible. That alone
is a codebase of 3MB pretty portable Pascal. (that's the compiler. total
project size 50-100 MB, but there are a lot of headers in there)
fix nearly every program when I got an Alpha machine)

But except for that (and those original K&R code), you are somewhat
right, the problem is more that a fully standard C program is often
trivial, and no real app.

Something like that is nice to show to students, but not something for
IRL.


As I say, I work on large *real* applications using standard C for 90%
(or more) of my work. Therefore it is useful for *real* work on *large*
projects.


Sure. I know they exist. But it is the same for each other language. Java,
Pascal, you name it. One can be lucky (specially if you set it up yourself at
a time you already got some clue), but that is not the average situation.

The average situation is either code so old that it probably originates from
the Analytical Machine, and/or uses every dirty trick in the book.

See the Java discussion. Nice on paper, but in practice it gets you nowhere.
At least not as far as it claims.
> So do the majority of modern Pascal implementations support ANSI
> standard Pascal?


Either that or Borland. Borland is proprietary, but so dominant that
smaller vendors (TMT,VP) follow it, and it also has following in the
open source community.


So, you still can't share code with embedded systems. I can and *have*
done so for *real* work on *complex* applications.


Depends on your view of embedded. That 8051 with its 256 bytes of RAM would
be hard (though I'd want to see an average C app compiled for it too).

But I run Delphi code on my m68030. The 68k implementor is still tweaking
the code generator a bit to work on plain 68000s, but it would work.

I also have a Pascal compiler for my c64 that allows small programs to be
made, and here is the "51" variant:
http://www.geocities.com/SiliconValley/Campus/9592/
(it has an extension for external procs I saw)

However Pascal _is_ a bit more high level. Usually it requires more memory,
both for code and runtime, at least if you maintain your own programming
style. Don't forget Pascal is older than C.

However you can get by that if your compiler is somewhat smart (dead code
optimisation, inlining small funcs), if you don't mind being set back
to the C level. The only real limitation I can quickly think of is local
variables in the middle of a block. (though in theory you could try to
put that part of the code in an inner procedure, and have that inlined. Not
guaranteed, but C doesn't guarantee compilation efficiency either)

Delphi is even worse in that department (compared to C++), and much

Sure, but that was not what I said. I said the compilers can't compile
an average program from the other.


Berkeley DB (the one used to drive the Amazon web site, amongst others)
is built using gcc for Linux and VC++ for Windows. Maybe that is a well
written program rather than an average program.


Is it the original version? I might actually have an old one on my BSD 4.3
tapes, if I had a device to read them. Let's see if gcc3 eats that. I can
adapt any codebase through time.
The application I work on, several hundred thousand lines of code, used
to be built for Windows using VC++, for Linux using gcc and for HPUX,
SCO, AIX and Solaris using the standard compilers from the relevant OSs.
This is using the *same* source files in all cases. It is also software
that my company earns millions from annually.
See comments earlier. I have met with codebases like that too. (actually
in Modula2, but that's close enough)
I changed this to standardising on gcc because I wanted to and gcc is
available for all the targets we want to support.
Why standardise if you could already compile them with all those compilers?
If you don't have to change a single char to have a code base for target
A run on target B, why would you?
> (sometime I always try to do) then the bulk of your code will
> compile and run correctly on both.


And be fairly trivial.


Do you think an application several hundred thousand lines long is
trivial?


No. There are such cases. A compiler is a good example. BDB also, because
it only uses standard files.
Or Berkeley DB which is used to power a lot of major web sites?


Is that actually plain standard BDB, or something that only has its origins
in that? Hard to do that with plain C; you can't even create a critical
section or so. Deadlocks all over the place :-)

Or are you confusing POSIX and Ansi C?

[time's up. Sorry]
Jul 19 '05 #226
In article <20******************************@flash-gordon.me.uk>,
Mark Gordon <sp******@flash-gordon.me.uk> wrote:
just to name 4 families of processor off the top of my head all of
which are used in *current* projects.

Even if you ported the JVM it would run like a snail on
tranquilisers.
I used the Pascal byte code system, a precursor and work-alike to modern
Java, on the 6502 and Z80. And yes, it did run like a snail on
tranquilisers. For that matter, so did highly tuned assembler and
compiled C.

Anyone who wants to license a reference implementation from Sun can do
so for $150,000, IIRC, and write a very thin hardware layer to get
Java on their chip/OS.


That is a lot of money to spend compared to the cost of a C
implementation and it probably still won't help you for all the target
I mentioned.


It is a lot of money. But the last time I checked, 45-odd OS vendors
(SCO being one of them) have spent it to enable a JVM and Standard Java
(J2SE) on their OS. You are looking to use a tiny, underpowered, way
out of date chip and asking for future reuse in 10 years. It just isn't
going to happen -- the 6502, even with the billion-plus that have been
sold is not good for much more than controlling a stop light these days.

The origin of this thread was someone looking for a language to
(re)write an enterprise application in with an eye to being able to use
it in 10 years.

For the same reasons that I don't worry about being able to run
Enterprise Cobol and DB2 on an IBM-7090 machine -- I don't think it is
overly reasonable to worry about running a full featured JVM on a 30
year old version of a 65xx or Z80 chip.

I can say with great certainty that there are no enterprise data centers
running on any of the aforementioned chips. Not even any desktops
anymore.

Taking 20K for the JVM when you have a limited memory space could be a
problem. It would not run on the systems where we only had 8K of ROM and
2K of RAM.
Embedded is a different world from enterprise, but given the cost of a
6502 (about USD$0.25 when I last checked 10 years ago) and the cost of
20k of RAM (even less), I'm not certain I would lose sleep over it
unless I was shipping a huge number of units.

Still, as you pointed out -- the snail on Quaaludes looks quite zippy. I
would think that given the limited memory space and the slow processor
speed that you would not want any library code and would want everything
hand-tuned.

One of the reasons I recommend Java as 'portable' for non-embedded use
is the vast, standard, binary-compatible library that is available in
every installation. C/C++ doesn't have anything like that very rich
library available on every non-embedded computer from lowly palmtop to
huge mainframe. Where such libraries do exist for C/C++ they require at
the very least a recompile of the application (best case) or a very
serious rework of several things followed by plenty of debugging (usual
case) in order to move code across OS/machine combinations.

And finally, there is no C/C++ compiler and library that is completely
compatible over the 45ish platforms that Java is. I'm not even sure
there is a C/C++ library that is as complete as the J2SE, though perhaps
I have missed some.
Real reviews are mandated on projects for the military and for safety
critical projects.

-and-

The application I am currently working on is several hundred thousand
lines of code and it runs on both big and little endian machines and
both Windows and a variety of Unix derivatives.

For the binary files the original author just chose one endianness and
wrote some code to explicitly read the files bytewise in that
endianness.


I'm glad to hear that you do the reviews on the safety and military
projects, but you are in the minority. Take any Micro$oft product
(please) -- does their rate of fatal flaws make you think they do
reviews of any sort?
Jul 19 '05 #227
Peter E.C. Dashwood wrote:

We may have our own favourite Languages and we can poddle away in a corner
somewhere cutting code for the fun of it, but the real world demands that it
get solutions. By 2015 a new generation of development software will see
"programmers" removed from the loop and end users interacting and iterating
with smart software until they get what they want.
Sounds like something big, clumsy, slow and limited to some narrow set
of tasks, that one could just as well do without.
Procedural code is already into Gotterdammerung. It takes too long, requires
too much skill, is too inflexible (the accelerating rate of change in the
Marketplace and in technology is another reason why it is doomed to
extinction) and, overall, costs far too much.


I have a different experience. Whenever I have a complicated data
evaluation task, I write a little Pascal program to parse the data
(which sometimes comes in a strange format out of lab equipment) and do
the necessary calculations. I keep a library of common tasks like
integrating statistical distributions and the like, which can be
recycled.

This is frequently easier than using a spreadsheet for that purpose,
which was supposed to be the easy solution for people who cannot or do
not want to program.

Such programs can be quick and dirty jobs with the charm of a
Unix filter, as I am the only one using them. For that reason I am even
disappointed by the development that Delphi/Kylix has taken over the
last 10 years: bigger, more complicated and worse documented. It may
help professional programmers who need to make nice user interfaces. But
little guys like me who just need a job done were better off with good
old Turbo Pascal.
Jul 19 '05 #228

"JC" <Ka***********@hotmail.com> wrote in message
news:bh**********@titan.btinternet.com...
Peter E. C. Dashwood <da******@enternet.co.nz> wrote in message
news:b3************************@posting.google.com ...

I have little to add.

You say you're angry. So was I. I'm tired of your cheap shots.

I suggest the best policy for both of us is to simply ignore each
others' posts.

Please don't quote or respond to my mail in your posts, and I'll adopt
the same policy with yours.

Hollerith Cards at dawn??

LOL! I guess that would be quite appropriate, however, I refuse to engage in
a battle of wits with someone who is completely unarmed...<G>

Pete.

Jul 19 '05 #229
Roedy Green wrote:
....
the JVM itself is written in C++,


What are you talking about? There is no *the* JVM.

Jirka

Jul 19 '05 #230
<top-posted on purpose>

Hey, guys, please remove comp.lang.c from your distribution list. We
have our own brand of flames, and yours are off-topic.

On Tue, 19 Aug 2003 22:47:35 +1200, "Peter E.C. Dashwood"
<da******@enternet.co.nz> wrote:

"JC" <Ka***********@hotmail.com> wrote in message
news:bh**********@titan.btinternet.com...
Peter E. C. Dashwood <da******@enternet.co.nz> wrote in message
news:b3************************@posting.google.com ...
>
> I have little to add.
>
> You say you're angry. So was I. I'm tired of your cheap shots.
>
> I suggest the best policy for both of us is to simply ignore each
> others' posts.
>
> Please don't quote or respond to my mail in your posts, and I'll adopt
> the same policy with yours.
>


Hollerith Cards at dawn??

LOL! I guess that would be quite appropriate, however, I refuse to engage in
a battle of wits with someone who is completely unarmed...<G>

Pete.



--
Al Balmer
Balmer Consulting
re************************@att.net
Jul 19 '05 #231
On Mon, 18 Aug 2003 19:23:23 GMT, "James J. Gavan" <jj*****@shaw.ca>
wrote:


Alan Balmer wrote:
On Sun, 17 Aug 2003 22:53:38 GMT, "James J. Gavan" <jj*****@shaw.ca>
wrote:
>If we don't hear from each language, may we assume that particular language
>doesn't have all the necessary tools ???? <G>
No, but you could assume not everyone wants to play your game.


Why assume a game was being played ?


It was a legitimate deduction from those portions of your
mini-flamefest which are visible from comp.lang.c. I, for one, would
appreciate it if no more of this thread were visible from c.l.c.
Surely the other four newsgroups you cross-posted to are a wide enough
audience?
It was a legit question, what techniques do
you use within your language to handle such a problem - I'm not 'voting' for one
outcome - a genuine enquiry as to how OO programmers would handle such, from the
features they have in their respective languages.

Please don't read into the message an alternative meaning. (If I had a hidden
agenda, be damned sure I would have asked the question in a very different
fashion).

Jimmy


--
Al Balmer
Balmer Consulting
re************************@att.net
Jul 19 '05 #232
Alan Balmer wrote:
<top-posted on purpose>

Hey, guys, please remove comp.lang.c from your distribution list. We
have our own brand of flames, and yours are off-topic.


If goose is going to post in ours, we're going to post in yours. ;)
--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ~~~~~~~~~~~~~
~ / \ / ~ Live from Montgomery, AL! ~
~ / \/ o ~ ~
~ / /\ - | ~ LX*****@Netscape.net ~
~ _____ / \ | ~ http://www.knology.net/~mopsmom/daniel ~
~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~
~ I do not read e-mail at the above address ~
~ Please see website if you wish to contact me privately ~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ~~~~~~~~~~~~~

Jul 19 '05 #233
Alan,

I have nothing more to say that needs to be cross posted (either on-topic or
off-topic) and I'm sorry if these very short posts bothered you.

Just as a matter of interest, is this a moderated forum? I was under the
impression that the language forums are unmoderated and uncensored.

If I'm wrong about this, I sincerely apologize.

Either way, I'm unlikely to post here again.

Pete.
"Alan Balmer" <al******@att.net> wrote in message
news:15********************************@4ax.com...
<top-posted on purpose>

Hey, guys, please remove comp.lang.c from your distribution list. We
have our own brand of flames, and yours are off-topic.

On Tue, 19 Aug 2003 22:47:35 +1200, "Peter E.C. Dashwood"
<da******@enternet.co.nz> wrote:

"JC" <Ka***********@hotmail.com> wrote in message
news:bh**********@titan.btinternet.com...
Peter E. C. Dashwood <da******@enternet.co.nz> wrote in message
news:b3************************@posting.google.com ...
>
> I have little to add.
>
> You say you're angry. So was I. I'm tired of your cheap shots.
>
> I suggest the best policy for both of us is to simply ignore each
> others' posts.
>
> Please don't quote or respond to my mail in your posts, and I'll adopt
> the same policy with yours.
>

Hollerith Cards at dawn??

LOL! I guess that would be quite appropriate, however, I refuse to engage in
a battle of wits with someone who is completely unarmed...<G>

Pete.



--
Al Balmer
Balmer Consulting
re************************@att.net

Jul 19 '05 #234
> Such programs can be quick and dirty jobs with the charm of a
> Unix filter, as I am the only one using them. For that reason I am even
> disappointed by the development that Delphi/Kylix has taken over the
> last 10 years: bigger, more complicated and worse documented. It may
> help professional programmers who need to make nice user interfaces. But
> little guys like me who just need a job done were better off with good
> old Turbo Pascal.


I hear what you are saying -- it is true of so many
environments.

As for Delphi, though, I've found it simple to create projects
that don't use any additional libraries -- that accounts for
about half of my Delphi projects. Tidy (but not necessarily
tiny) console applications that rarely break 50k. To each his
own, but I don't want to give up my windows IDE, even if I'm
not programming a windowed application.

....just an unsolicited two cents.
Jul 19 '05 #235
Please stop crossposting to comp.lang.pascal.ansi-iso. This topic is not
relevant to that group.

Thank you.
Jul 19 '05 #236
In article <bh*********@enews1.newsguy.com>, Grinder wrote:
Such programs can be quick and dirty jobs with the charm of a
Unix filter, as I am the only one using them. For that reason I am even
disappointed by the development that Delphi/Kylix has taken over the
last 10 years: bigger, more complicated and worse documented. It may
help professional programmers who need to make nice user interfaces. But
little guys like me who just need a job done were better off with good
old Turbo Pascal.


I hear what you are saying -- it is true of so many
environments.

As for Delphi, though, I've found it simple to create projects
that don't use any additional libraries -- that accounts for
about half of my Delphi projects. Tidy (but not necessarily
tiny) console applications that rarely break 50k. To each his
own, but I don't want to give up my windows IDE, even if I'm
not programming a windowed application.


Bloodshed Dev Pascal (site: bloodshed.net). This is a Windows IDE on top of
the Free Pascal compiler, and that IDE is programmed in Delphi, and source
is available.
Jul 19 '05 #237


LX-i wrote:

Malcolm wrote:
Java COBOL and Visual Basic I know little about. VB is unstable, COBOL is
virtually obsolete.


What? COBOL is obsolete? I guess OO and .NET are obsolete too... ;)


Well, if they are working, in several years they won't even compile
the original source!

I purchased M$ Pascal for a college class, but the extra code to handle
text was larger than the code for the class assignment. M$ used non-ANSI
formats. The same for C and C++. I had to buy Borland to use it at home
for classroom assignments. M$ changed the C/C++ so much I couldn't use
it a year later ($300 down the tubes). From the thread, it sounds like
VB is in the same condition.

I can still compile and execute my COBOL programs from 15-30 years ago,
and have ported them to Microfocus and Fujitsu, with very little effort.

Whatever language you pick, if M$ has the standard implementation, it
WILL change, based on marketing impact and not functionality or
industry/ANSI standards.

Gary

Post to the group, this email address is forwarded to uc*@ftc.gov
or userid=gdrumm at ont dot com to contact me direct.
Jul 19 '05 #238
Joona I Palaste <pa*****@cc.helsinki.fi> wrote in message news:<bi*********@oravannahka.helsinki.fi>...
goose <ru**@webmail.co.za> scribbled the following
on comp.lang.c:
Bat Guano <bat.guano@talk21dotcom> wrote in message news:<vj************@corp.supernews.com>...
goose wrote:
> but java is only available for platforms that are big enough to run
> it.

big like my mobile phone?

whats your point ? that java runs on your mobile phone ?

<NEWS FLASH> C probably targets that too </NEWS FLASH>

and it also targets many that java does not run on ?

so what exactly *is* your point ? java runs on a *fraction*
of platforms that C targets.


does your mobile have under a K of ram ?

thought not


Do you know of implementations of C that run in under a K of RAM, then?


no, but if you find one, let me know :-)

otoh, I know of more than just a few freestanding c implementations
that target (but do not run on) many machines with less than a K of
ram.

goose,
Jul 19 '05 #239
ro****@wagner.net (Robert Wagner) wrote in message news:<3f***************@news.central.cox.net>...
Tom McGlynn <Th**************@nasa.gov> wrote:
One site, http://www.techiwarehouse.com/Cobol/, had
some interesting statements about COBOL though. No source is
given so take them with as much salt as you like. The one
that caught my eye was:

15% of all new applications (5 billion lines) through 2005 will be in COBOL.


The site also says:

"Replacement costs for COBOL systems, estimated at $25 per line, are in the
hundreds of billions of dollars."

"There are 90,000 COBOL programmers in North America in 2002. Over the next four
years there will be a 13% decrease in their number due to retirement and death."

Lines per day published elsewhere are usually 12. (FWIW, shops I managed
averaged 50 lines per day.) Taking 12 lines * $25 / 8 hours = $37.50 per hour.
That's reasonable.

Taking 15% of 4.6B lines / 3 years / 250 days per year / 82K programmers = 11.3
lines per day. Check.

I earlier estimated 40,000 mainframe COBOL programmers in the US. The 82,000
figure given here looks better because it's corroborated by the other numbers.
Further, 82,000 COBOL programmers / 600,000 total US programmers = 13.7%, which
agrees with the 15% distribution by language.

Canada's population is 9% of North America (US + Canada). I adjusted the number
of programmers and lines by .91 to simplify comparison with US statistics.

The Web site goes on to say, without corroboration:

"The most highly paid programmers in the next ten years are going to be COBOL
programmers."

Why? The price of most human labor is determined in the same way as other
commodities -- by supply and demand. It is not based on some imagined measure of
worth or difficulty (excepting executives). Let's look at the demand side.

Computerjobs.com, which provides handy statistics, categorizes as Legacy 500 /
12,000 = 4% of openings. Some of the 12,000 'technology' jobs are
non-programmers, optimistically as many as half, so 8%. DICE.com shows 550=COBOL
/ 7,000=(developer or programmer) = 8%. Monster.com returns nearly identical
numbers.

Why is the demand for COBOL 8% rather than 15%? Because COBOL programmers stay
in their jobs longer than average? Because companies using COBOL, generally
large old-line ones, are creating jobs more slowly than average?

Although I agree that demand for COBOL application development will
continue to grow, if what one Indian consulting firm told me a few
years back is true, COBOL programmer rates will probably drop in
North America.

The Indian consulting firm claimed they were training over 20,000
COBOL programmers a year in India. In the few Fortune 500 firms I
have worked at, any new COBOL development has been done by consulting
firms with Indian programmers. Maintenance and production support is
being done by the US staff.
Jul 19 '05 #240
Please stop crossposting to comp.lang.pascal.ansi-iso.

Thank you.
Jul 19 '05 #241
Please stop crossposting to comp.lang.pascal.ansi-iso.

Thank you.
Jul 19 '05 #242
Scott Moore wrote:
Please stop crossposting to comp.lang.pascal.ansi-iso.

Thank you.


You know, the *only* messages I've seen here in comp.lang.c++ from this
thread are the ones from Scott Moore asking someone (I have no idea who,
since he never bothers to quote any context) to stop cross posting to
clpa-i. Now this may be due to my crappy news server, or to my
early-beta news client but even so...

Scott, maybe *you* should stop cross posting to comp.lang.c++?

-Kevin
--
My email address is valid, but changes periodically.
To contact me please use the address from a recent posting.

Jul 19 '05 #243
