Bytes | Software Development & Data Engineering Community

C both slow and memory-hungry for embedded systems?


When I want to store a number, I use "unsigned". I go with unsigned
because it's the natural type for the system, and so should be the
fastest.

However, there are 8-bit microcontrollers out there that do 8-bit
arithmetic faster than 16-bit arithmetic, and so on these systems char
is faster than int.

Standardising C in such a way that int is at least 16 bits: has this
made C both slow and memory-hungry for embedded systems programming?

Martin

Oct 16 '07 #1
14 1898
On 16 Oct, 11:58, Martin Wells <war...@eircom.net> wrote:
When I want to store a number, I use "unsigned". I go with unsigned
because it's the natural type for the system, and so should be the
fastest.

However, there are 8-bit microcontrollers out there that do 8-bit
arithmetic faster than 16-bit arithmetic, and so on these systems char
is faster than int.

Standardising C in such a way that int is at least 16 bits: has this
made C both slow and memory-hungry for embedded systems programming?
Not in my experience.
There are an increasing number of embedded systems that use processors
whose optimal datatype is 16 bits or wider.
And when memory or performance is really at a premium, software
engineers are likely to use whatever works best. This typically means
making assumptions about the target platform.
Martin
Bart v Ingen Schenau

Oct 16 '07 #2
On Oct 16, 10:58 am, Martin Wells <war...@eircom.net> wrote:
When I want to store a number, I use "unsigned". I go with unsigned
because it's the natural type for the system, and so should be the
fastest.

However, there are 8-bit microcontrollers out there that do 8-bit
arithmetic faster than 16-bit arithmetic, and so on these systems char
is faster than int.

Standardising C in such a way that int is at least 16 bits: has this
made C both slow and memory-hungry for embedded systems programming?
What's stopping you from using (unsigned) char on such systems?

Oct 16 '07 #3
Spiros Bousbouras wrote:
On Oct 16, 10:58 am, Martin Wells <war...@eircom.net> wrote:
>When I want to store a number, I use "unsigned". I go with unsigned
because it's the natural type for the system, and so should be the
fastest.

However, there are 8-bit microcontrollers out there that do 8-bit
arithmetic faster than 16-bit arithmetic, and so on these systems char
is faster than int.

Standardising C in such a way that int is at least 16 bits: has this
made C both slow and memory-hungry for embedded systems programming?

What's stopping you from using (unsigned) char on such systems?
Unsigned char is indeed a common data type for embedded programming on
small processors.

--
Thad
Oct 16 '07 #4
On Tue, 16 Oct 2007 02:58:53 -0700, Martin Wells <wa****@eircom.net>
wrote in comp.lang.c:
When I want to store a number, I use "unsigned". I go with unsigned
because it's the natural type for the system, and so should be the
fastest.
I seriously doubt that. I don't know of any current, or even 20 year
old embedded architecture where unsigned int is any more natural, or
any faster, than signed int.
However, there are 8-bit microcontrollers out there that do 8-bit
arithmetic faster than 16-bit arithmetic, and so on these systems char
is faster than int.

Standardising C in such a way that int is at least 16 bits: has this
made C both slow and memory-hungry for embedded systems programming?

Martin
In the first place, comp.arch.embedded would be a better place to
discuss this.

In the second place, and this is one of the reasons comp.lang.c is
not a good place to discuss this, a whole lot of C compilers
for embedded architectures are not really conforming C
implementations. This is true for many of the 16-bit processors, not
just the 8-bit ones.

Many such implementations, especially for 8-bitters, offer the option
to perform arithmetic and logical operations on signed and unsigned
8-bit values without extending them to int, just for example.

--
Jack Klein
Home: http://JK-Technology.Com
FAQs for
comp.lang.c http://c-faq.com/
comp.lang.c++ http://www.parashift.com/c++-faq-lite/
alt.comp.lang.learn.c-c++
http://www.club.cc.cmu.edu/~ajo/docs/FAQ-acllc.html
Oct 16 '07 #5
Spiros:
Standardising C in such a way that int is at least 16 bits: has this
made C both slow and memory-hungry for embedded systems programming?

What's stopping you from using (unsigned) char on such systems?

I write fully-portably in C89, paying no attention to the particulars
of the platform. If I were to start using char instead of int, I'd
introduce inefficiency on systems whose optimal int type is >= 16
bits.

I think a fair few embedded programmers are starting to use things
like int_fast8_t, which is defined in C99.

To be a fully-portable programmer both for PCs and for embedded
systems, should we start using these <stdint.h> types?

Martin

Oct 18 '07 #6
Martin Wells said:

<snip>
I think a fair few embedded programmers are starting to use things
like int_fast8_t, which is defined in C99.

To be a fully-portable programmer both for PCs and for embedded
systems, should we start using these <stdint.h> types?
Given the limited availability of conforming C99 implementations, a *fully*
portable program cannot assume the existence of <stdint.h> or the C99
types defined therein. So, at the very least, you should be prepared to
supply your own definitions of those types if you can (portably) determine
that they are not provided by the implementation.

Personally, I don't bother - I find the types in C90 to be perfectly
adequate to my needs - but it's something to consider if your view isn't
quite as... um... radical as mine.

--
Richard Heathfield <http://www.cpax.org.uk>
Email: -http://www. +rjh@
Google users: <http://www.cpax.org.uk/prg/writings/googly.php>
"Usenet is a strange place" - dmr 29 July 1999
Oct 18 '07 #7
On Oct 18, 10:21 am, Martin Wells <war...@eircom.net> wrote:
Spiros:
Standardising C in such a way that int is at least 16 bits: has this
made C both slow and memory-hungry for embedded systems programming?
What's stopping you from using (unsigned) char on such systems?

I write fully-portably in C89, paying no attention to the particulars
of the platform. If I were to start using char instead of int, I'd
introduce inefficiency on systems whose optimal int type is >= 16
bits.
Not necessarily. It is entirely possible that a compiler will
internally represent a char using whichever integer type is the
fastest on the platform.
I think a fair few embedded programmers are starting to use things
like int_fast8_t, which is defined in C99.

To be a fully-portable programmer both for PCs and for embedded
systems, should we start using these <stdint.h> types?
If you want to be fully portable *and* the fastest possible *and* pay
no attention to the particulars of the platform, then I guess you
would have to use int_fast8_t. If on the other hand you are willing
to pay just a bit of attention to the particulars of the platform,
then you could do something like

typedef char my_int_fast8_t;

and replace char in the line above with whatever type is the fastest
on each platform.
Oct 18 '07 #8
Martin Wells wrote:
Spiros:
>>Standardising C in such a way that int is at least 16 bits: has this
made C both slow and memory-hungry for embedded systems programming?
What's stopping you from using (unsigned) char on such systems?


I write fully-portably in C89, paying no attention to the particulars
of the platform. If I were to start using char instead of int, I'd
introduce inefficiency on systems whose optimal int type is >= 16
bits.

I think a fair few embedded programmers are starting to use things
like int_fast8_t, which is defined in C99.
Using int_fast8_t isn't sufficient; you also have to put the compiler
into a mode which is nonconforming either because it disables automatic
conversion to 'int' in the many contexts where that conversion is
required, or because 'int' is an 8-bit type.

The other problem, of course, is the number of C standard library
routines which take 'int' arguments and return 'int' values. However,
there's an easy workaround for that: create alternative functions that
take 8-bit arguments, where appropriate.
Oct 18 '07 #9
Martin Wells wrote:
>
.... snip ...
>
I think a fair few embedded programmers are starting to use things
like int_fast8_t, which is defined in C99.

To be a fully-portable programmer both for PCs and for embedded
systems, should we start using these <stdint.h> types?
No. They (and we) should avoid them. They are not portable,
because they are not universally available (as are char, int, long)
and are also a C99 construct. Note that even a C99 system will not
necessarily make those types available, because they are hardware
dependent.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Oct 19 '07 #10
