
When to use automatic variables and when to use malloc

I was reading the code of FFmpeg and it seems that they use malloc
far too much. The problems and dangers of malloc are widely known.
Malloc also has some overhead (I don't know the overhead of
variable-length automatic arrays, but I suspect it is smaller than
malloc's), although I'm not too worried about that.

I was thinking that, with C99's variable-length arrays, malloc
shouldn't be needed most of the time. But I'm not sure it is
appropriate to declare everything as an automatic variable, especially
the huge ones.
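
For concreteness, here is a minimal sketch of the two approaches for a
scratch buffer whose size is only known at run time (the function names
are made up for illustration):

#include <stdlib.h>

/* C99 VLA: storage is automatic, nothing to free, but there is no
   portable way to detect failure if n is too large for the stack. */
void process_vla(size_t n)
{
    unsigned char buf[n];
    /* ... use buf ... */
    (void)buf;
}

/* malloc: failure is detectable and the size may be arbitrarily large
   (subject to available memory), but you must remember to free(). */
void process_malloc(size_t n)
{
    unsigned char *buf = malloc(n);
    if (buf == NULL)
        return;  /* or report the error */
    /* ... use buf ... */
    free(buf);
}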

1-) When is it appropriate to use automatic variables?*

2-) If the problem is size, then what is the threshold?**

*I already know that we should be mindful of the lifetime of the
variable: for example, don't use an automatic variable as a buffer
inside a function and then return a pointer to that buffer. By the time
the function returns, the buffer is gone.

** Of course, I know this varies. But, given a reasonable system (say,
my Athlon XP 2600+ with 512 MB of RAM) and a reasonable OS (such as
GNU/Linux, and let's also include MS Windows because it is common),
can you give me an order of magnitude?
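
(Not standard C, but on POSIX systems you can at least query the limit
that bounds automatic storage; a minimal sketch, assuming a POSIX
environment with getrlimit():)

#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) == 0) {
        if (rl.rlim_cur == RLIM_INFINITY)
            puts("stack limit: unlimited");
        else
            printf("stack limit: %llu bytes\n",
                   (unsigned long long)rl.rlim_cur);
    }
    return 0;
}

On a typical desktop GNU/Linux the soft limit is often around 8 MB, so
staying well below that (kilobytes rather than megabytes) is a common
rule of thumb for automatic variables.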

PS: I googled first, and searched within this newsgroup, and I only
found one previous discussion. It missed the point. People kept
arguing that automatic variables are fixed-size, which is no longer
true with C99. People also kept arguing that malloc tells you when
there is not enough memory (by returning NULL) and lets you do
something about it. I don't care much about that, since in the vast
majority of cases I just exit the program when malloc fails. I have
even written a wrapper, so I don't have to check the pointer:

#include <errno.h>   /* ENOMEM */
#include <stdio.h>
#include <stdlib.h>

/* static inline: a plain C99 "inline" definition provides no external
   definition, which can cause link errors if a call is not inlined. */
static inline void *smalloc(size_t size)
{
    void * const ptr = malloc(size);
    if (ptr == NULL) {
        /* __func__ names the enclosing function, so this always
           prints "smalloc", not the caller's name. */
        fputs("Function ", stderr);
        fputs(__func__, stderr);
        fputs(" called malloc and malloc returned NULL.\n", stderr);
        perror(NULL);
        /* exit or abort? Note also that only 0, EXIT_SUCCESS and
           EXIT_FAILURE are portable exit statuses; ENOMEM is not. */
        exit(ENOMEM);
    }
    return ptr;
}

By the way, in the above case should I have used abort() or is exit()
ok? From the respective manual pages, I can't tell.
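
(For what it's worth, the observable difference is easy to demonstrate:
exit() runs atexit() handlers and flushes stdio streams, while abort()
raises SIGABRT, runs no handlers, and typically leaves a core dump. A
minimal sketch:)

#include <stdio.h>
#include <stdlib.h>

static void report(void)
{
    puts("atexit handler ran");
}

int main(void)
{
    atexit(report);
    exit(EXIT_FAILURE);   /* handler runs, streams are flushed */
    /* abort();              handler would NOT run; SIGABRT is raised,
                             typically producing a core dump */
}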

Feb 20 '07
58 Replies


we******@gmail.com writes:
[...]
> Your way works for discourse with people who are your fans or
> something of that nature, while the objective way is more suited for
> this special class of people called "rational" who I don't have to
> have ever met before (or even agree with, on very much) to be able to
> converse with. Of course I do give up the opportunity to talk to the
> "irrationalists", but I guess that's just the price I have to pay.

You give up the opportunity to talk to a lot of people, many of whom
are quite rational. It's the price you pay for being rude.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <*> <http://users.sdsc.edu/~kst>
We must do something. This is something. Therefore, we must do this.
Feb 21 '07 #51

On Wed, 2007-02-21 at 08:36 -0800, santosh wrote:
> IMHO, introducing C as the second programming language is good. Since
> C doesn't automate many things that some of the other languages do, it
> provides an excellent opportunity to learn careful programming.

I think that C should be a /first/ language, and others taught on top of
that with the explanation that they do feature <x> behind the scenes or
automatically. Learning high-level languages without any idea of what's
happening is, IMHO, a recipe for security mistakes. How can a student
protect against buffer overflows if he's never heard of "array
boundaries"?

Learning C after the fact wouldn't have the same effect, I think.

--
Andrew Poelstra <http://www.wpsoftware.net>
For email, use 'apoelstra' at the above site.
"You're only smart on the outside." -anon.

Feb 22 '07 #52

On Feb 21, 12:09 pm, Keith Thompson <k...@mib.org> wrote:
> websn...@gmail.com writes:
>
> [...]
>> Your way works for discourse with people who are your fans or
>> something of that nature, while the objective way is more suited for
>> this special class of people called "rational" who I don't have to
>> have ever met before (or even agree with, on very much) to be able to
>> converse with. Of course I do give up the opportunity to talk to the
>> "irrationalists", but I guess that's just the price I have to pay.
>
> You give up the opportunity to talk to a lot of people, many of whom
> are quite rational. It's the price you pay for being rude.

The real tragedy here is that he is interesting and intelligent, but
even more curt than Dan Pop. I still read what he has to say, but I
guess that he will get himself killfiled by a lot of readers. If a
whinging twit with no brains is brusque and crusty, then I killfile
them or ignore them or whatever. But there are clearly exceptions who
are still worth reading.

On the other hand, we could all learn a lesson from posters like
Tanmoy Bhattacharya or Chris Torek who are never, ever rude (even when
someone is screaming for a clue-by-four upside the head).

IMO-YMMV.

Feb 22 '07 #53

Andrew Poelstra wrote:
> On Wed, 2007-02-21 at 08:36 -0800, santosh wrote:
>> IMHO, introducing C as the second programming language is good.
>> Since C doesn't automate many things that some of the other
>> languages do, it provides an excellent opportunity to learn
>> careful programming.
>
> I think that C should be a /first/ language, and others taught on
> top of that with the explanation that they do feature <x> behind
> the scenes or automatically. Learning high-level languages without
> any idea of what's happening is, IMHO, a recipe for security
> mistakes. How can a student protect against buffer overflows if
> he's never heard of "array boundaries"?
>
> Learning C after the fact wouldn't have the same effect, I think.

No, Pascal should be the first language. That will teach
organization without the confusion of C's arcane symbology. The
later addition of C to the repertoire is quite easy. The student
may need to rework his i/o semantics, or write C functions to
implement Pascal standard procedures.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>
Feb 22 '07 #54

Keith Thompson wrote:
> we******@gmail.com writes:
> [...]
>> Your way works for discourse with people who are your fans or
>> something of that nature, while the objective way is more suited for
>> this special class of people called "rational" who I don't have to
>> have ever met before (or even agree with, on very much) to be able to
>> converse with. Of course I do give up the opportunity to talk to the
>> "irrationalists", but I guess that's just the price I have to pay.
>
> You give up the opportunity to talk to a lot of people, many of whom
> are quite rational. It's the price you pay for being rude.

s/rude/boorish/

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>
Feb 22 '07 #55

CBFalconer wrote:
> Keith Thompson wrote:
>> we******@gmail.com writes:
>>
>> You give up the opportunity to talk to a lot of people, many of whom
>> are quite rational. It's the price you pay for being rude.
>
> s/rude/boorish/

And that's the important distinction. Most of us have occasion to be
rude (excepting Chris Torek, perhaps). Paul's consistent ill-mannered
approach to discourse landed him in my killfile long ago.

Whether I miss out on some pearl of wisdom or not doesn't keep me up at
night. Besides, if he actually says something interesting, someone else
will likely reply and quote it.

Brian
Feb 22 '07 #56

On Wed, 21 Feb 2007 20:10:59 -0600, CBFalconer wrote
(in article <45***************@yahoo.com>):
> Andrew Poelstra wrote:
>> On Wed, 2007-02-21 at 08:36 -0800, santosh wrote:
>>> IMHO, introducing C as the second programming language is good.
>>> Since C doesn't automate many things that some of the other
>>> languages do, it provides an excellent opportunity to learn
>>> careful programming.
>>
>> I think that C should be a /first/ language, and others taught on
>> top of that with the explanation that they do feature <x> behind
>> the scenes or automatically. Learning high-level languages without
>> any idea of what's happening is, IMHO, a recipe for security
>> mistakes. How can a student protect against buffer overflows if
>> he's never heard of "array boundaries"?
>>
>> Learning C after the fact wouldn't have the same effect, I think.
>
> No, Pascal should be the first language.

I don't disagree, but in reality, the important thing is the quality of
the instructor and the curriculum, regardless of the language. From
the "homework" assignments seen here regularly, I'd say that quality
versions of either are in very short supply.

> That will teach organization without the confusion of C's arcane
> symbology. The later addition of C to the repertoire is quite easy.
> The student may need to rework his i/o semantics, or write C
> functions to implement Pascal standard procedures.

It's also quite important to learn low-level system architecture,
something that was once part of degree plans but is now extinct.
--
Randy Howard (2reply remove FOOBAR)
"The power of accurate observation is called cynicism by those
who have not got it." - George Bernard Shaw

Feb 26 '07 #57

> Unfortunately, there is no elegant way to catch VLA allocation errors
> and there is no way to reliably expect or detect them either.
> You could do the old 'stack address subtraction' trick, but you can't
> really {portably} know for sure how much stack is available.

On my system, using huge automatic variables gets the program killed
with a "Segmentation fault" message. However, when I install a handler
for SIGSEGV, the handler is ignored and the program just crashes. But
if I modify the program to read through a NULL pointer, the handler is
called.

What is happening? Can't segmentation faults caused by stack overflow
be caught?

Mar 1 '07 #58

Jorge Peixoto wrote On 03/01/07 08:51:
>> Unfortunately, there is no elegant way to catch VLA allocation errors
>> and there is no way to reliably expect or detect them either.
>> You could do the old 'stack address subtraction' trick, but you can't
>> really {portably} know for sure how much stack is available.
>
> On my system, using huge automatic variables gets the program killed
> with a "Segmentation fault" message. However, when I install a handler
> for SIGSEGV, the handler is ignored and the program just crashes. But
> if I modify the program to read through a NULL pointer, the handler is
> called.
>
> What is happening? Can't segmentation faults caused by stack overflow
> be caught?

The C language itself makes no promises about what can
and can't be done in such situations. Your system might make
additional promises. Check the system's documentation.

<off-topic>

One thing you might find in the system's documentation is
that signal handlers might not be called if there's no stack
space remaining to call them with ... You might also find
that you can prepare for this problem by using an "alternate
signal stack" or something of the kind.

</off-topic>

... but I emphasize: all such might-be's are properties of
the system you happen to be running C on, not of C.
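
(To make the off-topic hint concrete: on POSIX systems the alternate
signal stack is installed with sigaltstack() and the handler is
registered with SA_ONSTACK. A minimal sketch, assuming a POSIX
environment; whether a stack-overflow SIGSEGV is actually catchable
this way remains system-specific:)

#define _XOPEN_SOURCE 600   /* expose sigaltstack in strict-C99 mode */
#include <signal.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

static void handler(int sig)
{
    /* Only async-signal-safe calls are allowed here; write() is. */
    const char msg[] = "caught SIGSEGV (possibly stack overflow)\n";
    (void)sig;
    write(STDERR_FILENO, msg, sizeof msg - 1);
    _exit(EXIT_FAILURE);
}

int main(void)
{
    stack_t ss;
    struct sigaction sa;

    ss.ss_sp = malloc(SIGSTKSZ);
    ss.ss_size = SIGSTKSZ;
    ss.ss_flags = 0;
    if (ss.ss_sp == NULL || sigaltstack(&ss, NULL) == -1) {
        perror("sigaltstack");
        return EXIT_FAILURE;
    }

    sa.sa_handler = handler;
    sa.sa_flags = SA_ONSTACK;   /* deliver SIGSEGV on the alternate stack */
    sigemptyset(&sa.sa_mask);
    if (sigaction(SIGSEGV, &sa, NULL) == -1) {
        perror("sigaction");
        return EXIT_FAILURE;
    }

    /* ... code that might overflow the stack ... */
    return 0;
}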

--
Er*********@sun.com
Mar 1 '07 #59
