Bytes | Software Development & Data Engineering Community

A Portable C Compiler

http://slashdot.org/

"The leaner, lighter, faster, and most importantly, BSD Licensed,
Compiler PCC has been imported into OpenBSD's CVS and NetBSD's pkgsrc.
The compiler is based on the original Portable C Compiler by S. C.
Johnson, written in the late 70's. Even though much of the compiler has
been rewritten, some of the basics still remain. It is currently not
bug-free, but it compiles on x86 platform, and work is being done on it
to take on GCC's job."

The PCC was the first C compiler I used and studied, back then, when
Unix and C started appearing here in France. We had a source license,
and browsing there I found the PCC code.

The discussion is here.

http://undeadly.org/cgi?action=artic...mode=expanded/

It is interesting to see the level of frustration of the BSD people
with GCC. They just want a compiler that is simple, small, and...
supports all architectures that OpenBSD supports.

Will they succeed?

Of course it is easy to have a compiler that supports 3 back ends, say.
But supporting 10?

With a mixture of weird CPUs etc?

In any case PCC should be up to the task. I remember it running on the
Honeywell-Bull computers of that time (beginning of the '80s), so
it should run on many others... Getting it to run on those was a real
challenge.
Sep 17 '07 #1
On Sep 17, 2:53 pm, jacob navia <ja...@jacob.remcomp.fr> wrote:
http://slashdot.org/

"The leaner, lighter, faster, and most importantly, BSD Licensed,
Compiler PCC has been imported into OpenBSD's CVS and NetBSD's pkgsrc.
The compiler is based on the original Portable C Compiler by S. C.
Johnson, written in the late 70's. Even though much of the compiler has
been rewritten, some of the basics still remain. It is currently not
bug-free, but it compiles on x86 platform, and work is being done on it
to take on GCC's job."

The PCC was the first C compiler I used and studied, back then, when
Unix and C started appearing here in France. We had a source license,
and browsing there I found the PCC code.

The discussion is here.

http://undeadly.org/cgi?action=artic...mode=expanded/

It is interesting to see the level of frustration of the BSD people
with GCC. They just want a compiler that is simple, small, and...
supports all architectures that Open BSD supports.

Will they succeed?

Of course it is easy to have a compiler that supports 3 back ends, say.
But supporting 10?

With a mixture of weird CPUs etc?

In any case PCC should be up to the task. I remember it running on the
Honeywell-Bull computers of that time (beginning of the '80s), so
it should run on many others... Getting it to run on those was a real
challenge.
Starting with PCC and trying to compete with GCC is like starting with
a dinghy and planning to race a 65' yacht.

I guess that :
http://www.tendra.org/about/

has a much better chance to succeed.

Other attempts:
http://www.thefreecountry.com/compilers/cpp.shtml

Sep 17 '07 #2
user923005 <dc*****@connx.com> wrote:
On Sep 17, 2:53 pm, jacob navia <ja...@jacob.remcomp.fr> wrote:
>http://slashdot.org/

"The leaner, lighter, faster, and most importantly, BSD Licensed,
Compiler PCC has been imported into OpenBSD's CVS and NetBSD's pkgsrc.
The compiler is based on the original Portable C Compiler by S. C.
Johnson, written in the late 70's. Even though much of the compiler has
been rewritten, some of the basics still remain. It is currently not
bug-free, but it compiles on x86 platform, and work is being done on it
to take on GCC's job."

The PCC was the first C compiler I used and studied, back then, when
Unix and C started appearing here in France. We had a source license,
and browsing there I found the PCC code.

The discussion is here.

http://undeadly.org/cgi?action=artic...mode=expanded/

It is interesting to see the level of frustration of the BSD people
with GCC. They just want a compiler that is simple, small, and...
supports all architectures that Open BSD supports.

Will they succeed?

Of course it is easy to have a compiler that supports 3 back ends, say.
But supporting 10?

With a mixture of weird CPUs etc?

In any case PCC should be up to the task. I remember it running on the
Honeywell-Bull computers of that time (beginning of the '80s), so
it should run on many others... Getting it to run on those was a real
challenge.

Starting with PCC and trying to compete with GCC is like starting with
a dinghy and planning to race a 65' yacht.
That depends on in what manner you are trying to compete. It is true that it
seems unlikely that PCC will be able to generate as good code as GCC anytime
in the near future. On the other hand it should not be very difficult to
compete with GCC with regards to compile time and memory usage needed by the
compiler (areas in which GCC is not very good).

For people trying to do development on older machines these features can be worth
much more than having the generated code run 0.5% faster.

Other people will have other priorities.
>
I guess that :
http://www.tendra.org/about/

has a much better chance to succeed.

Other attempts:
http://www.thefreecountry.com/compilers/cpp.shtml
--
<Insert your favourite quote here.>
Erik Trulsson
er******@student.uu.se
Sep 18 '07 #3
Erik Trulsson wrote:
That depends on in what manner you are trying to compete. It is true that
it seems unlikely that PCC will be able to generate as good code as GCC
anytime in the near future. On the other hand it should not be very
difficult to compete with GCC with regards to compile time and memory
usage needed by the compiler (areas in which GCC is not very good.)
Properties such as compile time and memory usage are only relevant to the
compilation process, which is a very tiny part of the whole software
production process. As far as compilers go and what is expected from the
compiler, those features may be nice to have but they are very far from
being important. In fact, they are totally irrelevant.

No one in their right mind prefers a lighter compiler that produces weak or
buggy code to one which is not so light but produces strong, tight and even
secure code.

For people trying to do development on older machines these features can
be worth much more than having the generated code run 0.5% faster.
In this day and age anyone can purchase a very capable system with
multi-core processors for less than 300 euros. It is also possible to buy
used systems for almost nothing. Frankly, I don't believe that build times
are an issue anymore or have been for some time.
Other people will have other priorities.
I don't believe that any developer will ever be willing to trade quality
code for a snappier build process. Naturally it is a nice feature but there
is absolutely no way it would ever be seriously considered for any
tradeoff.
Rui Maciel
Sep 18 '07 #4
Rui Maciel wrote:
Erik Trulsson wrote:
>That depends on in what manner you are trying to compete. It is true that
it seems unlikely that PCC will be able to generate as good code as GCC
anytime in the near future. On the other hand it should not be very
difficult to compete with GCC with regards to compile time and memory
usage needed by the compiler (areas in which GCC is not very good.)

Properties such as compile time and memory usage are only relevant to the
compilation process, which is a very tiny part of the whole software
production process.

I have to disagree here.

Each time you make a change in C you have to rebuild. For many projects,
a change can affect a lot of files. Global changes that need a full
recompilation are not done VERY often, but they are done...

This means that a compiler that slows down each build by just 30-60
seconds is costing each developer between 15 and 30 minutes per day...

Multiply that by the size of a team and you see that people spend a lot
of time just waiting for gcc to finish. Of course, this is not visible
in small projects.
As far as compilers go and what is expected from the
compiler, those features may be nice to have but they are very far from
being important. In fact, they are totally irrelevant.
Surely not. A fast compiler allows YOU to develop faster. And that is
important. Gcc is not very fast, mind you.
No one in their right mind prefers a lighter compiler that produces weak or
buggy code to one which is not so light but produces strong, tight and even
secure code.
You are speaking here as if you had never hit a gcc bug...

And yes, a compiler can be slow AND buggy; just look at gcc 3.1.xx for
the amd64 platform and you will see what a buggy compiler can be. The
same with the 4.0.xx and 4.1 series...

A simpler compiler is surely easier to debug, you see?
>
>For people trying to do development on older machines these features can
be worth much more than having the generated code run 0.5% faster.

In this day and age anyone can purchase a very capable system with
multi-core processors for less than 300 euros. It is also possible to buy
used systems for almost nothing. Frankly, I don't believe that build times
are an issue anymore or have been for some time.
For the company I work for, a full rebuild takes 10 minutes on a super
hyper fast dual-core amd64 using MSVC. Using gcc it takes more like 45
minutes...
>Other people will have other priorities.

I don't believe that any developer will ever be willing to trade quality
code for a snappier build process.
Quality of code? Gcc's code quality can be great when there are no bugs
in the optimizer... When there are, as is sadly very often the case,
we have to use the debug build... And that code is quite bad.

We then get the worst of both worlds: slow AND buggy.
>Naturally it is a nice feature but there
is absolutely no way it would ever be seriously considered for any
tradeoff.
Since gcc has a monopoly under Linux, there is nothing
anyone can do about that.

DISCLAIMER:
I am biased against it. I use another compiler.
Sep 18 '07 #5
Rui Maciel <ru********@gmail.com> wrote:
Erik Trulsson wrote:
That depends on in what manner you are trying to compete. It is true that
it seems unlikely that PCC will be able to generate as good code as GCC
anytime in the near future. On the other hand it should not be very
difficult to compete with GCC with regards to compile time and memory
usage needed by the compiler (areas in which GCC is not very good.)
Properties such as compile time and memory usage are only relevant to the
compilation process, which is a very tiny part of the whole software
production process. As far as compilers go and what is expected from the
compiler, those features may be nice to have but they are very far from
being important. In fact, they are totally irrelevant.
No one in their right mind prefers a lighter compiler that produces weak or
buggy code to one which is not so light but produces strong, tight and even
secure code.
Indeed. And in fact, the OpenBSD developers have for years complained that
GCC does not produce strong, tight and secure code. In other words, they
claim that GCC slowly compiles fast, buggy code.

Sep 18 '07 #6
William Ahern wrote, On 18/09/07 16:06:
Rui Maciel <ru********@gmail.com> wrote:
>Erik Trulsson wrote:
>>That depends on in what manner you are trying to compete. It is true that
it seems unlikely that PCC will be able to generate as good code as GCC
anytime in the near future. On the other hand it should not be very
difficult to compete with GCC with regards to compile time and memory
usage needed by the compiler (areas in which GCC is not very good.)
>Properties such as compile time and memory usage are only relevant to the
compilation process, which is a very tiny part of the whole software
production process. As far as compilers go and what is expected from the
compiler, those features may be nice to have but they are very far from
being important. In fact, they are totally irrelevant.
>No one in their right mind prefers a lighter compiler that produces weak or
buggy code to one which is not so light but produces strong, tight and even
secure code.

Indeed. And in fact, the OpenBSD developers have for years complained that
GCC does not produce strong, tight and secure code. In other words, they
claim that GCC slowly compiles fast, buggy code.
Not secure is not the same thing as buggy. If you want secure code you
want the code to do something safe on buffer overflows, for example, but
as far as the C standard is concerned, whatever the code does on a buffer
overflow is not a bug in the compiler. By strong and tight they could
also mean things which have nothing to do with whether gcc incorrectly
translates code.
--
Flash Gordon
Sep 18 '07 #7
jacob navia wrote, On 18/09/07 14:34:
Rui Maciel wrote:
>Erik Trulsson wrote:
>>That depends on in what manner you are trying to compete. It is true
that
it seems unlikely that PCC will be able to generate as good code as GCC
anytime in the near future. On the other hand it should not be very
difficult to compete with GCC with regards to compile time and memory
usage needed by the compiler (areas in which GCC is not very good.)

Properties such as compile time and memory usage are only relevant to the
compilation process, which is a very tiny part of the whole software
production process.

I have to disagree here.

Each time you make a change in C you have to rebuild. For many projects,
a change can affect a lot of files. Global changes that need a full
recompilation are not done VERY often, but they are done...

This means that a compiler that slows down each build by just 30-60
seconds is costing each developer between 15 and 30 minutes per day...

Multiply that by the size of a team and you see that people spend a lot
of time just waiting for gcc to finish. Of course, this is not visible
in small projects.
That is not long. Wait until you work on a project where a build takes 8
hours!
>As far as compilers go and what is expected from the
compiler, those features may be nice to have but they are very far from
being important. In fact, they are totally irrelevant.

Surely not. A fast compiler allows YOU to develop faster. And that is
important. Gcc is not very fast, mind you.
Yes, I agree a fast compiler is useful, and gcc is not the fastest around.
>No one in their right mind prefers a lighter compiler that produces
weak or
buggy code to one which is not so light but produces strong, tight and
even
secure code.

You are speaking here as if you had never hit a gcc bug...

And yes, a compiler can be slow AND buggy; just look at gcc 3.1.xx for
the amd64 platform and you will see what a buggy compiler can be. The
same with the 4.0.xx and 4.1 series...

A simpler compiler is surely easier to debug, you see?
Personally I've hit very few bugs in gcc. They do exist but I don't hit
them often enough to worry about.
>>For people trying to do development on older machines these features can
be worth much more than having the generated code run 0.5% faster.

In this day and age anyone can purchase a very capable system with
multi-core processors for less than 300 euros. It is also possible to buy
used systems for almost nothing. Frankly, I don't believe that build
times
are an issue anymore or have been for some time.

For the company I work for, a full rebuild takes 10 minutes on a super
hyper fast dual-core amd64 using MSVC. Using gcc it takes more like 45
minutes...
A dual-core amd64 is *not* super hyper fast. Anyway, make sure you have
make configured to do multiple compilations at once. On Linux it is
often recommended that you set make to compile two files per core at a
time, so on a dual-core machine you should be compiling 4 files in parallel.
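With GNU make that rule of thumb comes down to the `-j` (job count)
flag; a quick sketch, assuming the dual-core machine from the example:

```shell
CORES=2                  # a dual-core machine, as above
JOBS=$((CORES * 2))      # rule of thumb: two compile jobs per core
echo "make -j$JOBS"      # i.e. run: make -j4
```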
>>Other people will have other priorities.

I don't believe that any developer will ever be willing to trade quality
code for a snappier build process.

Quality of code? Gcc's code quality can be great when there are no bugs
in the optimizer... When there are, as is sadly very often the case,
we have to use the debug build... And that code is quite bad.
At -O2 I've *very* rarely hit problems.
We then get the worst of both worlds: slow AND buggy.
>>Naturally it is a nice feature but there
is absolutely no way it would ever be seriously considered for any
tradeoff.

Since gcc has a monopoly under Linux, there is nothing
anyone can do about that.
No, gcc does not have a monopoly under Linux. There is tcc (although it
is still flagged as experimental on Ubuntu), TenDRA, and of course
Intel's icc.
DISCLAIMER:
I am biased against it. I use another compiler.
I use gcc a *lot* under Linux, and historically I have used it a fair
bit under SCO and AIX, with some use under Cygwin and MinGW as well, and
I have not hit the level of bugs you claim for it.
--
Flash Gordon
Sep 18 '07 #8
Rui Maciel <ru********@gmail.com> wrote:
<snip>
Other people will have other priorities.

I don't believe that any developer will ever be willing to trade quality
code for a snappier build process. Naturally it is a nice feature but there
is absolutely no way it would ever be seriously considered for any
tradeoff.
And yet, it is being seriously considered by OpenBSD and NetBSD. The OpenBSD
folks have specifically stated that they would prefer a faster build to
faster code.

Your overly broad and loaded "quality" argument serves only to muddy the
waters. GCC's output isn't shinier than any other output.
Sep 18 '07 #10


