Bytes | Software Development & Data Engineering Community

Best C++ compiler for DOS programs

I have to develop several large and complex C++ hardware test programs that
should run under DOS, most likely with a 32-bit DOS extender. The development
workstation OS would be Windows XP. Quite some time ago I worked in DOS
with Borland C++ 4.1, but I no longer have it. Which compiler would you
recommend now? Which ones support serious DOS program development?
The criteria should be the number of available free library modules (graphic
menu system, mouse driver, I/O), ease of development, price (maybe free?),
and current and future support. If the compiled program can run in a DOS
window under XP, at least for some early testing, that would be fine too.

So far I have found the free Watcom, Digital Mars, and DJGPP compilers:

http://www.digitalmars.com/
http://www.openwatcom.org/index.php/Main_Page
http://www.delorie.com/djgpp/

Which one of these, or other free compilers, is best? What about commercial
compilers? Are they worth the money for DOS development? What would you
recommend?

Steve.

Apr 22 '06
P.J. Plauger wrote:
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message news:eJ**********************@comcast.com...
P.J. Plauger wrote:
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message
STL isn't very usable for 16 bit code, it's just too big.
News to me, and to many of our embedded customers. STL itself is
weightless. How big the memory footprint is depends on just those
functions you choose to use.

One of the problems templates have (and STL is thoroughly based on
templates) is that they can go too far with customization, thereby
generating bloat. For example, I use a (non-template) linked list package
that creates a list of 'int' items. I can use it to store lists of
unsigned, shorts, unsigned shorts, char types, near pointers, etc.,
without adding any code. But if it were templatized, a separate
implementation would be generated for each type.

This isn't a problem for 32 bit code generation, where there's lots of
room for the extra code. But it *is* a problem for 16 bit code, where your
code and data have to fit in 640Kb.


Yep, that's the standard bogey man trotted out by people leery of
templates. In real life, most people don't use eleven different map
types, with eleven versions of tree-walking code. In a sub-megabyte
program, it's not likely they'll need even two.


In today's world, a sub-megabyte program is a trivial program, and I
would agree with you. But in the 16 bit DOS days, this was not true at
all. A 250K program could be extremely complex. My compiler, for
example, had to be split into 3 passes, and there were various list
types and lots of tree-walking code in it, and it benefited substantially
(and critically) from being able to reuse existing object code as much
as possible. Reusing source code (what templates do) was relatively not
so important.

And in real life,
very little code in STL benefits from distilling out in parameter
independent form. As always, the rule should be, try it first to see
if it's *good enough*. If so, you're done, and way earlier in the day.
Don't optimize for speed or space until you know you have to.


That is a good rule. But in the 16 bit DOS world, you have to start
optimizing for speed/space often right out of the gate, as the limits
were reached very quickly for non-trivial programs. If your program was
going to use more than 64K of data, you had to design that in from the
start, not retrofit it in later. Programs were also far more sensitive
to such optimizations then than today - I don't believe languages like
Ruby or Python would have enjoyed widespread success on those machines.
And remember that early Java implementation - the UCSD P-system? There
was a setup years before its time.
Apr 24 '06 #31
Rod Pemberton wrote:
If you use DEGFX instead of Allegro, under the DEGFX directories there is
a DJGPP directory. There are four files. Three are small. I think, but
am not sure, that these are the only files that need to be ported. It
appears to me that these are mostly DPMI calls or below-1MB memory accesses
(farpeek's, etc.). It's fairly straightforward but time-consuming to
port these. I also see some packed structs and use of the DJGPP transfer
buffer.
[...]

DJGPP packed structs would need to be rewritten:
[...]

The DJGPP transfer buffer can be set up for PM Watcom: get_dos_mem() is
called to set up __tb, and free_dos_mem() when done:
[...]

Thanks for the info. I'll see what I can do.

Steve
Apr 25 '06 #32

"Steve" <St**@nospam.com> wrote in message news:e2**********@news.eunet.yu...
Rod Pemberton wrote:
If you use DEGFX instead of Allegro, under the DEGFX directories there is
a DJGPP directory. There are four files. Three are small. I think, but
am not sure, that these are the only files that need to be ported. It
appears to me that these are mostly DPMI calls or below-1MB memory accesses
(farpeek's, etc.). It's fairly straightforward but time-consuming to
port these. I also see some packed structs and use of the DJGPP transfer
buffer.
[...]

DJGPP packed structs would need to be rewritten:
[...]

The DJGPP transfer buffer can be set up for PM Watcom: get_dos_mem() is
called to set up __tb, and free_dos_mem() when done:
[...]

Thanks for the info. I'll see what I can do.


I ported an early version of Chris Giese's LBA routines from DJGPP to OW.
It should show you the differences between calling the DJGPP __dpmi
functions and using the DPMI RMI structure for OW. It also has the _tb
code, etc...

http://www.openwatcom.org/index.php/...LBA_under_DPMI
Rod Pemberton
Apr 25 '06 #33
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message news:WN**********************@comcast.com...
P.J. Plauger wrote:
> Another problem is it has no accommodation for near/far.
Also news to me. We've preserved the near/far notation that H-P put
in the earliest allocators. Not used much, AFAIK, but it's there.
Having it in there doesn't mean it works very well. Effective large
model programs need careful management of which segments each function
goes into, when things can be near, when things can be referred to by
__ss pointers, etc. Templates aren't conducive to this, and neither is
just throwing in a near allocator.

Right. But sometimes it works *well enough*.


I remember well the 80's. Lots of people ported unix utilities to 16 bit
DOS. Those utilities were designed for 32 bit flat code, and whether they
worked "well enough" on DOS was certainly a matter of opinion. They
usually got stomped by utilities and applications that were custom crafted
for the quirks of 16 bit computing.


As the guy who did the first rewrite of Unix, I can attest that it ran
just fine on 16-bit computers. We also ported our utilities to other
platforms, including DOS. To this day, I still use quite a few of those
utilities in house to build the packages we ship. So IMO they work
"well enough". YMMV.
> Although DMC++ does
> implement exception handling for 16 bit DOS, even that is more of a
> technological feat than a practical one - due to size constraints, I'd
> recommend using the older error code technique instead.
We provide a simplified mechanism for those who choose to compile
with exceptions disabled, but once again it's a matter of taste.
And particular needs.
I'm curious what 16 bit C++ compiler you're using that supports
exception handling.


I defer to our OEMs.


I ask the question because I don't know of any 16 bit C++ compiler that
supports either modern templates or exception handling, besides Digital
Mars C++.


See http://www.iar.com, by way of example. They use the EDG front end, our
EC++ and Abridged libraries, and a host of their own 8-, 16-, and 32-bit
back ends. The Abridged Library supports templates, which don't require
back-end support (other than huge names in the linker). I don't know which
IAR back ends support exceptions.

IAR is one of about a dozen of our OEM customers who supply C/C++
compilers for the embedded marketplace.
> Exception handling, STL, etc., are much more practical in a 32 bit
> system.
More precisely, *large programs* are much more practical in 32-bit
systems. Neither exception handling, nor STL, nor etc. are
intrinsically
too big to be of use in some programs for 16-bit processors.
A large part of the effort in developing 16 bit programs was always
spent trying to squeeze the size down. Exception handling adds a big
chunk of size, which will just make it that much harder, and so will
actually reduce the complexity of a program you can build for 16 bits.

Not necessarily. You can trade time vs. space for exception handling, and
I've seen both extremes.


The two main schemes for doing exception handling are:

1) Microsoft style, where runtime code is inserted to keep track of where
one is in a table of destructors that would need to be unwound

2) Linux style, where the PC is compared against a static table of
addresses to determine where in the table one is

Both involve the addition of a considerable chunk of code (1) or data (1
and 2). Under (2), that chunk consists of data that isn't actually needed
unless an exception is thrown. This is an efficient implementation under a
system that has demand paged virtual memory, where executables' pages are
only loaded from disk if the address is actually referenced.

This is not the case for 16 bit DOS, which *always* loads the entire
executable into memory. DOS doesn't have demand paged virtual memory. 32
bit DOS extenders do add demand paged virtual memory, but only for 32 bit
code, not 16 bit.

Hence, the exception handling bloat is always taking away space from that
precious 640Kb of memory. I suppose it is possible for the compiler/linker
to write the exception handling tables out to a separate file, but I've
never heard of an implementation that did that.


Right. All I'm challenging is whether your "considerable chunk" of "bloat"
is so excessive as to make C++ completely unusable in the sub-megabyte
domain.
STL adds another chunk of size, if only because it doesn't allow tuning
of near/far. I'm not as convinced of the lightweightness of STL as you
are, and iostreams in particular seems to add a huge amount of code even
for simple things.

Ah, I see part of the communication gap here. By STL *you* mean "the
Standard C library", while *I* mean "that set of containers and
algorithms
based heavily on the Hewlett-Packard Standard Template Library".


I mean STL as in "C++ Standard Template Library."


Then why do you refer to "iostreams in particular", which is not a part
of STL?
We avoid the iostreams bloat by offering EC++ (as well as the full
Standard C++ library), which looks more like the original cfront
iostreams than the full-bore templated and internationalized thing that
got standardized. Our Abridged Library consists of EC++ with STL bolted on.


Digital Mars C++ for 16 bits does offer both of the two older
implementations of iostreams (iostreams went through a couple major
redesigns before being standardized). These work tolerably well on 16 bit
platforms, but they are not Standard C++ iostreams by any stretch of the
imagination.


Whereas istream/ostream/fstream etc. in EC++ is often indistinguishable
from the Standard C++ version. It is, in fact, the subset of iostreams
that most people use most of the time.
Using C stdio for 16 bit programs is best because many
years were spent optimizing it to get the size down (some vendors even
implemented printf entirely in assembler!), and such effort was never
expended on iostreams.

Well, it was by us. I agree that stdio can be smaller, particularly if
you use a bespoke printf that omits floating-point when you don't need
it. But once again, EC++ has proved repeatedly to be *small enough*.


From http://www.dinkumware.com/embed9710.html:
-----------------------------------
What's Not in Embedded C++
Embedded C++ is a library specification and a minimum language
specification. The minimum language specification is a proper subset of
C++, omitting:

multiple inheritance and virtual base classes
runtime type identification
templates
exceptions
namespaces
new-style casts
------------------------------------

EC++ being practical for 16 bit targets does not imply that templates and
exception handling are. EC++ is kinda what C++ was back in 1991 or so,
when it worked well on 16 bit targets.


You've described EC++, as specified in 1997. It restricted the language
to give existing (pre-standard) C++ compilers a fighting chance. But
the existence of off-the-shelf complete front ends like EDG have made
that aspect of EC++ way less important. Our most popular embedded
product is the Abridged Library, which relaxes *all* of the above
language restrictions. It's the Standard C++ library that eats space
and time, so the simplified EC++ library iostreams, string, etc. offer
the most significant savings.
Do you know anyone using STL (Standard Template Library) for 16 bit X86
programming? I would be surprised if there were any. I looked around on
the Dinkumware site, but didn't find anything specifically mentioning 16
bit support or any particular 16 bit C++ compilers, but perhaps I missed
it.


See above.

P.J. Plauger
Dinkumware, Ltd.
http://www.dinkumware.com
Apr 25 '06 #34
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message news:k6**********************@comcast.com...
P.J. Plauger wrote:
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message news:eJ**********************@comcast.com...
P.J. Plauger wrote:
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message
> STL isn't very usable for 16 bit code, it's just too big.
News to me, and to many of our embedded customers. STL itself is
weightless. How big the memory footprint is depends on just those
functions you choose to use.
One of the problems templates have (and STL is thoroughly based on
templates) is that they can go too far with customization, thereby
generating bloat. For example, I use a (non-template) linked list
package that creates a list of 'int' items. I can use it to store lists
of unsigned, shorts, unsigned shorts, char types, near pointers, etc.,
without adding any code. But if it were templatized, a separate
implementation would be generated for each type.

This isn't a problem for 32 bit code generation, where there's lots of
room for the extra code. But it *is* a problem for 16 bit code, where
your code and data have to fit in 640Kb.
Yep, that's the standard bogey man trotted out by people leery of
templates. In real life, most people don't use eleven different map
types, with eleven versions of tree-walking code. In a sub-megabyte
program, it's not likely they'll need even two.


In today's world, a sub-megabyte program is a trivial program, and I would
agree with you. But in the 16 bit DOS days, this was not true at all. A
250K program could be extremely complex.


Huh? Why does a 250KB program suddenly get less complex? I agree that
code now freely sprawls because memory is so extensive and so cheap,
but it doesn't follow that a small program now *has* to be simpler than
20 years ago.
My compiler, for example,
had to be split into 3 passes, and there was lots of various list types
and tree-walking code in it, and it benefited substantially (and
critically) from being able to reuse existing object code as much as
possible. Reusing source code (what templates do) was relatively not so
important.
Huh again? If it's important, you do it. If it's not, and it costs you
productivity, you don't. Even today you can make one unified list type
do the work of two or three *if that is important to your code size*.
You get bloat only if you indulge in bloat (and you can afford it).
And in real life,
very little code in STL benefits from distilling out in parameter
independent form. As always, the rule should be, try it first to see
if it's *good enough*. If so, you're done, and way earlier in the day.
Don't optimize for speed or space until you know you have to.


That is a good rule. But in the 16 bit DOS world, you have to start
optimizing for speed/space often right out of the gate, as the limits were
reached very quickly for non-trivial programs.


But you "optimize" by picking a program design that fits the box, not
by fretting over potential code bloat that may or may not matter.
If your program was
going to use more than 64K of data, you had to design that in from the
start, not retrofit it in later. Programs were also far more sensitive to
such optimizations then than today - I don't believe languages like Ruby
or Python would have enjoyed widespread success on those machines. And
remember that early Java implementation - the UCSD P-system? There was a
setup years before its time.


We obviously have a different aesthetic, since I consider the P-system
an idea whose time had come and gone before it really hit the ground.
(Remember Softech Microsystems?) But that's wandering afield. The point
of this response is, there's nothing intrinsic in exceptions, templates,
or C++ in general that prohibits their use in sub-megabyte systems.
Back in the 1980s people were still fretting over the 5-15 per cent
overhead you get when writing in C instead of assembler. C won, mostly
(IMO) because of the much greater productivity and in part because of
the steady increase in memory size and the steady decrease in memory
cost.

Now some people in the embedded world are fretting because of the
additional 10-20 per cent overhead when writing in C++ instead of C.
Memory is dirt cheap, so it's primarily architectural limitations (like
address size) that cause problems. If that overhead pushes you from a
16-bit to a 32-bit architecture, it's worth worrying about. Otherwise,
time to market trumps any piddling extra cost in storage, yes even
when you're making 10 million of 'em. Choice of programming language
is rarely black and white.

P.J. Plauger
Dinkumware, Ltd.
http://www.dinkumware.com
Apr 25 '06 #35

P.J. Plauger wrote:
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message news:k6**********************@comcast.com...
If your program was
going to use more than 64K of data, you had to design that in from the
start, not retrofit it in later. Programs were also far more sensitive to
such optimizations then than today - I don't believe languages like Ruby
or Python would have enjoyed widespread success on those machines. And
remember that early Java implementation - the UCSD P-system? There was a
setup years before its time.


We obviously have a different aesthetic, since I consider the P-system
an idea whose time had come and gone before it really hit the ground.
(Remember Softech Microsystems?) But that's wandering afield. The point


Hasn't hit the ground? That's an interesting viewpoint, in this day
and age of Java and .NET.

[snip]
Otherwise,
time to market trumps any piddling extra cost in storage, yes even
when you're making 10 million of 'em.


Maybe in niche markets. But in a competitive market, you'll need more
than just good enough. I think what illustrates the difference in
programming 16 bit vs. 32 bit, was when Lotus trounced all competitors
by writing their spreadsheet in 100% assembler, and wrote directly to
the video system, so as to extract every bit of power available from
the machine. In fact, one of their competitors was a P-system based
spreadsheet, Context MBA.

Apr 25 '06 #36
"Michael O'Keeffe" <mi*******@gmail.com> wrote in message news:11**********************@j33g2000cwa.googlegroups.com...
P.J. Plauger wrote:
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message news:k6**********************@comcast.com...
> If your program was
> going to use more than 64K of data, you had to design that in from the
> start, not retrofit it in later. Programs were also far more sensitive
> to
> such optimizations then than today - I don't believe languages like
> Ruby
> or Python would have enjoyed widespread success on those machines. And
> remember that early Java implementation - the UCSD P-system? There was
> a
> setup years before its time.


We obviously have a different aesthetic, since I consider the P-system
an idea whose time had come and gone before it really hit the ground.
(Remember Softech Microsystems?) But that's wandering afield. The point


Hasn't hit the ground? That's an interesting viewpoint, in this day
and age of Java and .NET.


The p-system was a failure for three big reasons (IMO):

1) It didn't have adequate performance on the processors of its time.

2) The interpreter, on a 16-bit system, left even less space for a
program.

3) It didn't deliver the one big thing you should get in trade for
the above -- adequate portability -- because the p-code didn't hide
the endianness of the target platform.

Obviously, Java and .NET have avoided these problems and have each
established an important niche. The UCSD p-system made a splash that
lasted just a few years, by comparison. I stand by what I said.
Otherwise,
time to market trumps any piddling extra cost in storage, yes even
when you're making 10 million of 'em.


Maybe in niche markets. But in a competitive market, you'll need more
than just good enough.


Sorry, but in today's competitive marketplace any given "release" of
an embedded product might well sell for just a year or two. Plenty of
time and opportunity to fix the bugs for the next improvement, provided
you have a market for it. If you're three months late to that market,
however...

How well do you think the first iPod would compete if it were released
today?
I think what illustrates the difference in
programming 16 bit vs. 32 bit, was when Lotus trounced all competitors
by writing their spreadsheet in 100% assembler, and wrote directly to
the video system, so as to extract every bit of power available from
the machine. In fact, one of their competitors was a P-system based
spreadsheet, Context MBA.


Agreed. Lotus had to be written in assembly in those days to be "good
enough". That doesn't alter my basic point.

P.J. Plauger
Dinkumware, Ltd.
http://www.dinkumware.com
Apr 25 '06 #37
Have you heard of Linux? Solaris? Um, MIPS? Do you mean the only
Microsoft product that can give you accurate timing and access to low-level
hardware? You can get access to low-level hardware (registers,
buses, etc.) with Windows languages like C and C++. I'm actually
confused; perhaps I misunderstood the lessons I learned over the last
ten years... Could you explain?

(yes, off topic, but I'm really confused).

-Benry
Steve wrote:
Thank you all for your answers.

It seems I should take a look at the Watcom compiler first. They have many
target platforms and still seem very active. I assume that no recent
Microsoft or Borland C++ compilers support DOS program development. Now I
need to find some good graphical (not textual) windowing menu system library
that would, hopefully, work with Watcom. If you have a suggestion about
one, please, I would like to hear it.

Still, it was frustrating. I don't understand the almost complete abandonment
of DOS program development in the commercial compiler world. DOS is still, and
will be for a long time, the only platform for all programs that must have
exclusive access to hardware, or that must run alone for other reasons.
Real-time applications and accurate timing are only possible in DOS. It is
ideal for hardware testing, as a controller platform, or for PC hardware
malfunction detection, analysis, and repair. If any important PC component is
malfunctioning, you had better not try to boot another operating system,
because hard disk data integrity may be compromised. Again, you must boot DOS
and run some hardware analysis program. But it is becoming harder and harder
to write programs for DOS!

Anyway, are you aware of any specialized newsgroups or blogs devoted
to writing various PC hardware test programs?

Steve.


Apr 25 '06 #38
P.J. Plauger wrote:
As the guy who did the first rewrite of Unix, I can attest that it ran
just fine on 16-bit computers. We also ported our utilities to other
platforms, including DOS. To this day, I still use quite a few of those
utilities in house to build the packages we ship. So IMO they work
"well enough". YMMV.
I'm pretty sure that although you may still be using those programs, you
aren't using them on 16 bit DOS <g>.
I ask the question because I don't know of any 16 bit C++ compiler that
supports either modern templates or exception handling, besides Digital
Mars C++.


See http://www.iar.com, by way of example. They use the EDG front end, our
EC++ and Abridged libraries, and a host of their own 8-, 16-, and 32-bit
back ends. The Abridged Library supports templates, which don't require
back-end support (other than huge names in the linker). I don't know which
IAR back ends support exceptions.

IAR is one of about a dozen of our OEM customers who supply C/C++
compilers for the embedded marketplace.


IAR doesn't seem to support 16 bit X86 - at least they don't list it on
their web site. Their page entitled "Extended Embedded C++" makes it
pretty clear they do not support exception handling, multiple
inheritance, or RTTI. They do support templates as well as
being "memory attribute aware", which is not elaborated.

Hence, the exception handling bloat is always taking away space from that
precious 640Kb of memory. I suppose it is possible for the compiler/linker
to write the exception handling tables out to a separate file, but I've
never heard of an implementation that did that.

Right. All I'm challenging is whether your "considerable chunk" of "bloat"
is so excessive as to make C++ completely unusable in the sub-megabyte
domain.


I didn't say "completely unusable", though I will say it is impractical.
As evidence, no compiler (other than Digital Mars C++) seems to have
implemented it for 16 bit code. IAR is using the EDG front end, which
supports EH, but have apparently *removed* support for it for their 16
bit targets.

I mean STL as in "C++ Standard Template Library."

Then why do you refer to "iostreams in particular", which is not a part
of STL?


I've always considered it part of STL, after all, it is part of STLPort
(which is the STL that Digital Mars ships). If there is an official
definition of STL which excludes iostreams, so be it.

EC++ being practical for 16 bit targets does not imply that templates and
exception handling are. EC++ is kinda what C++ was back in 1991 or so,
when it worked well on 16 bit targets.

You've described EC++, as specified in 1997. It restricted the language
to give existing (pre-standard) C++ compilers a fighting chance. But
the existence of off-the-shelf complete front ends like EDG have made
that aspect of EC++ way less important. Our most popular embedded
product is the Abridged Library, which relaxes *all* of the above
language restrictions. It's the Standard C++ library that eats space
and time, so the simplified EC++ library iostreams, string, etc. offer
the most significant savings.


Ok, but IAR doesn't support exception handling, RTTI, or multiple
inheritance for 16 bit targets (they do support templates). Do you know
anyone (besides Digital Mars C++) that does?
Do you know anyone using STL (Standard Template Library) for 16 bit X86
programming? I would be surprised if there were any. I looked around on
the Dinkumware site, but didn't find anything specifically mentioning 16
bit support or any particular 16 bit C++ compilers, but perhaps I missed
it.


See above.


I checked the web site www.iar.com. They do not list X86 as a supported
target for their C/C++ compilers.

But maybe I am all wrong. If there is a demand for 16 bit X86 compilers
that support exception handling, RTTI, multiple inheritance, etc., I'd
certainly be pleased to work with Dinkumware to fill it.

-Walter Bright
www.digitalmars.com C, C++, D programming language compilers
Apr 25 '06 #39
P.J. Plauger wrote:
"Walter Bright" <wa****@digitalmars-nospamm.com> wrote in message news:k6**********************@comcast.com...
In today's world, a sub-megabyte program is a trivial program, and I would
agree with you. But in the 16 bit DOS days, this was not true at all. A
250K program could be extremely complex.

Huh? Why does a 250KB program suddenly get less complex? I agree that
code now freely sprawls because memory is so extensive and so cheap,
but it doesn't follow that a small program now *has* to be simpler than
20 years ago.


It usually is because of the standard bloat brought in by the C++
runtime library. Once you start supporting locales, wide characters,
exceptions, etc., or linking to some other library, big chunks of code
get pulled in, and so even fairly simple programs are pretty fat
compared with programs of similar size in the DOS daze.
We obviously have a different aesthetic, since I consider the P-system
an idea whose time had come and gone before it really hit the ground.
It was before its time because its performance was so poor on the old
processors. What made the idea workable in the 90's was 100x processor
speed improvements. What sealed the deal for Java was the emergence of
the JIT (Just In Time) compiler for Java (first invented by Symantec).
(Remember Softech Microsystems?) But that's wandering afield. The point
of this response is, there's nothing intrinsic in exceptions, templates,
or C++ in general that prohibits their use in sub-megabyte systems.
Back in the 1980s people were still fretting over the 5-15 per cent
overhead you get when writing in C instead of assembler.
Actually, the cost of writing in C vs asm was about 40% for 16 bit code.
At least for an expert asm programmer. And being the (so far) only
implementer of exceptions, multiple inheritance, and RTTI on 16 bit DOS
I *know* it works. You're just not going to be able to write a program
approaching the complexity and capability of one not using such features.

C won, mostly
(IMO) because of the much greater productivity and in part because of
the steady increase in memory size and the steady decrease in memory
cost.
And improving processor speed. The most successful 16 bit DOS apps,
however, still tended to be written in assembler. Remember how PKWARE
buried ARC? PKWARE's archiver was just a hand-optimized assembler version
of ARC. It was common to use a mix of asm and C. 32 bit processors have
pretty much killed off the need for writing in asm anymore.

BTW, another big reason that C won was because the C compilers of the
day were much, much better than the compilers for other languages. That,
for example, buried Pascal. By the time Pascal compilers got better, it
was too late.

Now some people in the embedded world are fretting because of the
additional 10-20 per cent overhead when writing in C++ instead of C.
Memory is dirt cheap, so it's primarily architectural limitations (like
address size) that cause problems. If that overhead pushes you from a
16-bit to a 32-bit architecture, it's worth worrying about. Otherwise,
time to market trumps any piddling extra cost in storage, yes even
when you're making 10 million of 'em. Choice of programming language
is rarely black and white.


I'm not referring to the cost of storage, I'm referring to the 640K
hardwired limitation. Any overhead that adds code size takes away from
the size of the data set the program can handle. This even applies to
simple utilities, like diff.

-Walter Bright
www.digitalmars.com C, C++, D programming language compilers
Apr 25 '06 #40

This thread has been closed and replies have been disabled. Please start a new discussion.
