
Revisiting using return value as reference

I remember a thread a while back about using the return value of a
function as a reference, where I had thought the reference would become
invalidated because it was a temporary, but it was stated that it would
not. This has come up in an IRC channel, but I cannot find the
original thread, nor can I get any code to work.

Foo& Bar( int Val )
{
    return Foo( Val );
}

Will not work; cannot convert Foo to Foo& (a temporary cannot bind to a
non-const reference).

Foo Bar( int Val )
{
    return Foo( Val );
}

int main()
{
    Foo& Inst = Bar( 10 );
}

Does not work; same thing, cannot convert Foo to Foo&.

Foo& Bar( int Val )
{
    Foo Temp( Val );
    return Temp; // returns a reference to a local; dangles after return
}

int main()
{
    Foo& Inst = Bar( 10 );
}

Does not work; checking the value of Val_ later shows garbage data,
indicating the reference has become invalidated (as expected). Can
anyone remember the case where a temporary value returned from a
function can be used as a reference without being invalidated
immediately?
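
For what it's worth, the case usually cited here is binding the
returned temporary to a reference-to-const, which extends the
temporary's lifetime to match the reference's. A minimal sketch,
assuming Foo simply stores Val in a member Val_:

struct Foo
{
    int Val_;
    explicit Foo( int Val ) : Val_( Val ) {}
};

Foo Bar( int Val )
{
    return Foo( Val );
}

int main()
{
    const Foo& Inst = Bar( 10 ); // the temporary's lifetime is extended
                                 // to the lifetime of Inst
    return Inst.Val_;            // still valid here: yields 10
}

Note that this works only for a reference-to-const bound to a value
returned by value; it does not rescue a function that returns a
reference to a local.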

--
Jim Langston
ta*******@rocketmail.com
Dec 25 '07
johanatan wrote:
>
The name pointer was taken directly from C, and the new construct
was the first time a new name had to be given to that concept in
C++ (at least we don't have 3 types of references, pointers,
aliases, do we?). It makes most sense to go with 'purest CS' and
the added C connotations are just a bonus since C++ programmers
should already be well familiar with C and assembly.

That's my 'constructed' historical view. If that's somehow
inaccurate, please do shed light.
Yes, you are inventing history here.

The names 'class' and 'reference' are taken from the Simula language,
which was around well before C was invented. Like many Scandinavians,
Bjarne used that language for his undergraduate CS courses.
http://www.research.att.com/~bs/bs_faq.html#why
Bo Persson
Jan 12 '08 #51
On Jan 12, 8:59 am, "Bo Persson" <b...@gmb.dk> wrote:
johanatan wrote:
The name pointer was taken directly from C, and the new construct
was the first time a new name had to be given to that concept in
C++ (at least we don't have 3 types of references, pointers,
aliases, do we?). It makes most sense to go with 'purest CS' and
the added C connotations are just a bonus since C++ programmers
should already be well familiar with C and assembly.
That's my 'constructed' historical view. If that's somehow
inaccurate, please do shed light.

Yes, you are inventing history here.

The names 'class' and 'reference' are taken from the Simula language,
which was around well before C was invented. Like many Scandinavians,
Bjarne used that language for his undergraduate CS courses.

http://www.research.att.com/~bs/bs_faq.html#why

Bo Persson
I'd suggest you read some philosophy, set theory, mathematics and
such. The names 'class' and 'reference' most surely come from those
'purely' intellectual pursuits (aside from any engineering paradigm
including but not limited to any specific computer language).

--Jonathan
Jan 12 '08 #52
johanatan wrote:
On Jan 12, 8:59 am, "Bo Persson" <b...@gmb.dk> wrote:
>johanatan wrote:
The name pointer was taken directly from C, and the new construct
was the first time a new name had to be given to that concept in
C++ (at least we don't have 3 types of references, pointers,
aliases, do we?). It makes most sense to go with 'purest CS' and
the added C connotations are just a bonus since C++ programmers
should already be well familiar with C and assembly.
That's my 'constructed' historical view. If that's somehow
inaccurate, please do shed light.

Yes, you are inventing history here.

The names 'class' and 'reference' are taken from the Simula language,
which was around well before C was invented. Like many Scandinavians,
Bjarne used that language for his undergraduate CS courses.

http://www.research.att.com/~bs/bs_faq.html#why

Bo Persson

I'd suggest you read some philosophy, set theory, mathematics and
such. The names 'class' and 'reference' most surely come from those
'purely' intellectual pursuits (aside from any engineering paradigm
including but not limited to any specific computer language).
You made a very specific conjecture on how those terms entered C++. That
conjecture was disputed. Now you modify your statement (essentially just by
making it vague) so that you only claim the names "came from" philosophy,
set theory, mathematics and such. Well, maybe (although I would think that
those terms already existed before philosophers, mathematicians and such
picked them up); but that does not account for how they ended up being used
in the standard for C++.

Besides, could you remind me what the intellectual merit of this debate is
supposed to be? It appears that you are not interested in the actual
history (chain of events, considerations, and changes) of the C++ language.
Instead you are developing a "'constructed' historical view". I have to
wonder what the point of such a view is. Apparently, it is a type of
historical account that does not need to be changed even if actual evidence
is presented that it does not tell what actually happened. Now, that need
not make such a fake history useless. It could be some way of explaining
the meanings and inter-relations of certain concepts by supplying a
narrative (I guess Hegel could be read as, in part, doing something like
that). However, in this particular case, I fail to see the point.
Best

Kai-Uwe Bux
Jan 12 '08 #53
On Jan 12, 8:59 am, "Bo Persson" <b...@gmb.dk> wrote:
johanatan wrote:
The name pointer was taken directly from C, and the new construct
was the first time a new name had to be given to that concept in
C++ (at least we don't have 3 types of references, pointers,
aliases, do we?). It makes most sense to go with 'purest CS' and
the added C connotations are just a bonus since C++ programmers
should already be well familiar with C and assembly.
That's my 'constructed' historical view. If that's somehow
inaccurate, please do shed light.

Yes, you are inventing history here.

The names 'class' and 'reference' are taken from the Simula language,
which was around well before C was invented. Like many Scandinavians,
Bjarne used that language for his undergraduate CS courses.

http://www.research.att.com/~bs/bs_faq.html#why
I'm not denying influences of other O-O languages (we could probably
put Algol, Simula, and Smalltalk among others in this list). The
connection I was drawing between C++ references and C's concept of
'call-by-reference' (let alone asm's) was merely provided as support
for the mental image I have of references as pointers. At the very
least, I think I've provided as much rationale for thinking of
references as pointers as anyone has for 'aliases'. (Please see the
wiki page on 'references' if you do not agree with anything I'm saying
about them and also note that the word 'alias' was never mentioned
even once on that page although the obvious understanding of
references as pointers 'under the hood' was put forth on multiple
occasions). The alias supporters out there have quite an uphill
battle in front of them it seems (I'm surely not the only one who
'thinks' this way. And who's the thought police anyway?)

Furthermore, if we're going to get really technical about this, a
pointer is a type of reference. The word 'reference' is the most
general word (of a list of many) in common usage to describe the
concept. So, reference swallows pointer, and alias, and moniker, and
handle, and so forth.

And, the concept of 'class' ultimately came from set theory so if C++
didn't directly take the idea from 'pure CS' (or mathematics), then it
did so indirectly as Algol, Smalltalk, Simula and others (whatever the
first O-O languages were) took it from the math literature.

--Jonathan
Jan 12 '08 #54
On Jan 12, 12:51 pm, jkherci...@gmx.net wrote:
You made a very specific conjecture on how those terms entered C++. That
conjecture was disputed.
Well, the conjecture was two-pronged from the outset. I said that it
likely was chosen because that's the pure term used in mathematics,
philosophy, and 'pure CS' literature. By 'pure CS' I mean, absent of
any specific technological influence (or as much absent as can be
expected). And, it was just a bonus that it happened to coincide with
the notion of 'call-by-reference' in asm and C.
Now you modify your statement (essentially just by
making it vague) so that you only claim the names "came from" philosophy,
set theory, mathematics and such. Well, maybe (although, I would think that
those terms already existed before philosophers, mathematicians and such
picked them up); but that does not account for how they ended up being used
in the standard for C++.
Yes, I agree. There's definitely value to understanding Algol, Simula
and others' influence, but that's a bit too 'micro' for my needs.
It's good to know, but somewhat beside the point.
Besides, could you remind me what the intellectual merit of this debate is
supposed to be? It appears that you are not interested in the actual
history (chain of events, considerations, and changes) of the C++ language.
Instead you are developing a "'constructed' historical view". I have to
wonder what the point of such a view is. Apparently, it is a type of
historical account that does not need to be changed even if actual evidence
is presented that it does not tell what actually happened. Now, that need
not make such a fake history useless. It could be some way of explaining
the meanings and inter-relations of certain concepts by supplying a
narrative (I guess Hegel could be read as, in part, doing something like
that). However, in this particular case, I fail to see the point.
The point was simply provided as justification for the mental model I
have of references as pointers 'under the hood'. The specific direct
influence(s) for the concept or term 'reference' in C++ just isn't
that important to me. And I still fail to see how our two
'histories' are incompatible-- it seems that the Algol and Simula
points are just adding more detail to an already sufficient (at least
to me) narrative.

--Jonathan
Jan 12 '08 #55
On 2008-01-12 21:55, johanatan wrote:
On Jan 12, 8:59 am, "Bo Persson" <b...@gmb.dk> wrote:
>johanatan wrote:
The name pointer was taken directly from C, and the new construct
was the first time a new name had to be given to that concept in
C++ (at least we don't have 3 types of references, pointers,
aliases, do we?). It makes most sense to go with 'purest CS' and
the added C connotations are just a bonus since C++ programmers
should already be well familiar with C and assembly.
That's my 'constructed' historical view. If that's somehow
inaccurate, please do shed light.

Yes, you are inventing history here.

The names 'class' and 'reference' are taken from the Simula language,
which was around well before C was invented. Like many scandinavians,
Bjarne used that language for his under graduate CS courses.

http://www.research.att.com/~bs/bs_faq.html#why

I'm not denying influences of other O-O languages (we could probably
put Algol, Simula, and Smalltalk among others in this list). The
connection I was drawing between C++ references and C's concept of
'call-by-reference' (let alone asm's) was merely provided as support
for the mental image I have of references as pointers. At the very
least, I think I've provided as much rationale for thinking of
references as pointers as anyone has for 'aliases'. (Please see the
wiki page on 'references' if you do not agree with anything I'm saying
And we all know that Wikipedia is a fantastically reliable reference.
Since there happen to be at the very least 3 different pages about
references on Wikipedia, I assume that you meant the page about C++
references.
about them and also note that the word 'alias' was never mentioned
even once on that page although the obvious understanding of
references as pointers 'under the hood' was put forth on multiple
occasions).
A Wikipedia article on the subject does not use the word alias, so what?
If that somehow strengthens your argument then the fact that Stroustrup
uses the word alias in TC++PL should be quite devastating. And by the
way, the word alias was used when I checked (and no, I did not edit the page).
The alias supporters out there have quite an uphill
battle in front of them it seems (I'm surely not the only one who
'thinks' this way. And who's the thought police anyway?)
What battle? What makes you think that there is any battle going on?
Furthermore, if we're going to get really technical about this, a
pointer is a type of reference. The word 'reference' is the most
general word (of a list of many) in common usage to describe the
concept. So, reference swallows pointer, and alias, and moniker, and
handle, and so forth.
That depends on what you mean when you use the word reference. If you
use it as it is used in normal talk, then yes. However, if by
reference you mean a C++ reference, then obviously no.
And, the concept of 'class' ultimately came from set theory so if C++
didn't directly take the idea from 'pure CS' (or mathematics), then it
did so indirectly as Algol, Smalltalk, Simula and others (whatever the
first O-O languages were) took it from the math literature.
Oh please, and where do you think the mathematicians got the word from?
The word class had been used to describe classes long before set theory
was invented. The class concept in C++ was named that way because that
was what it was called in earlier languages; they used the word because
it described a similar concept in math, which used the name because it
described a similar concept in normal speech. Perhaps I missed some
steps, but only the first is really of interest.

--
Erik Wikström
Jan 12 '08 #56
johanatan <jo*******@gmail.com> wrote:
I think the only way to really say what influenced the C++ reference
would be to ask the designer himself.
No need, he published his reason for introducing the concept
(http://www.research.att.com/~bs/dne.html).

The reference was introduced to facilitate operator overloading. [full
stop]
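
A minimal sketch of why (a toy Complex class, not taken from the
paper): an overloaded operator must take a class or enum operand, so
pointer parameters were never an option, and reference parameters keep
the natural a + b call syntax without copying the operands.

struct Complex
{
    double re, im;
    Complex( double r, double i ) : re( r ), im( i ) {}
};

// Pass by reference-to-const: no copies, and the call site stays a + b.
Complex operator+( const Complex& lhs, const Complex& rhs )
{
    return Complex( lhs.re + rhs.re, lhs.im + rhs.im );
}

int main()
{
    Complex a( 1, 2 ), b( 3, 4 );
    Complex c = a + b; // calls operator+( a, b )
}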
Jan 13 '08 #57
johanatan <jo*******@gmail.com> wrote:
The point was simply provided as justification for the mental model I
have of references as pointers 'under the hood'.
Here's justification for you...

The obvious implementation of a reference is as a (constant) pointer
that is dereferenced each time it is used. It doesn't do much harm
thinking about references that way, as long as one remembers that a
reference isn't an object that can be manipulated the way a pointer
is...

In some cases, the compiler can optimize away a reference so that
there is no object representing the reference at run-time.
--Stroustrup TC++PL

It seems to me that references are necessary for operator overloading
and copy construction, but otherwise constant pointers could conceivably
be used instead. Stroustrup notes that "The main use of references is
for specifying arguments and return values for functions..." and I tend
to use them in those contexts only, as an optimization over passing by
value.
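
To make that correspondence concrete, a small sketch (reading the
"obvious implementation" literally; the names are illustrative):

int main()
{
    int x = 1;

    int& r = x;        // a reference to x
    int* const p = &x; // the "obvious implementation": a constant pointer

    r = 2;             // no explicit dereference needed
    *p = 3;            // the pointer must be dereferenced at each use

    // Unlike p, r is not an object in its own right: it cannot be
    // reseated, and &r yields the address of x, not of the reference.
    return *p;         // 3
}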
Jan 13 '08 #58
On Jan 12, 2:18 pm, Erik Wikström <Erik-wikst...@telia.com> wrote:
I'm not denying influences of other O-O languages (we could probably
put Algol, Simula, and Smalltalk among others in this list). The
connection I was drawing between C++ references and C's concept of
'call-by-reference' (let alone asm's) was merely provided as support
for the mental image I have of references as pointers. At the very
least, I think I've provided as much rationale for thinking of
references as pointers as anyone has for 'aliases'. (Please see the
wiki page on 'references' if you do not agree with anything I'm saying

And we all know that Wikipedia is a fantastically reliable reference.
Since there happen to be at the very least 3 different pages about
references on Wikipedia, I assume that you meant the page about C++
references.
No, I meant this one, the one I pasted previously to support my
position (and if you read that entire post, maybe 3 or 4 back, you
might just understand my position).

http://en.wikipedia.org/wiki/Referen...ter_science%29
about them and also note that the word 'alias' was never mentioned
even once on that page although the obvious understanding of
references as pointers 'under the hood' was put forth on multiple
occasions).

A Wikipedia article on the subject does not use the word alias, so what?
If that somehow strengthens your argument then the fact that Stroustrup
uses the word alias in TC++PL should be quite devastating. And by the
way, the word alias was used when I checked (and no, I did not edit the page).
I think the point is that the 'pure CS' literature has a definition of
a general concept called a 'reference'. Please read the wiki page.
And, if wiki isn't authoritative enough, then go back to the CS
textbooks.
The alias supporters out there have quite an uphill
battle in front of them it seems (I'm surely not the only one who
'thinks' this way. *And who's the thought police anyway?)

What battle? What makes you think that there is any battle going on?
Well, I was corrected very near the beginning of this thread for
thinking of references as pointers. And the Stanford link I sent had
quite a bias towards a 'black-box' view of references (and even went so
far as to say that students 'never need to worry about the details').
As I mentioned, I would have expected this much from a Java school,
but from a C++ school?
Furthermore, if we're going to get really technical about this, a
pointer is a type of reference. *The word 'reference' is the most
general word (of a list of many) in common usage to describe the
concept. *So, reference swallows pointer, and alias, and moniker, and
handle, and so forth.

That depends on what you mean when you use the word reference. If you
use it as it is used in normal talk, then yes. However, if by
reference you mean a C++ reference, then obviously no.
You really need to read the wiki page I referenced. (And please
realize it is simply a summary of the concept of 'reference' in CS).
I bet it even has 'references' at the bottom of the page if you think
it was produced by people with an agenda.
And, the concept of 'class' ultimately came from set theory so if C++
didn't directly take the idea from 'pure CS' (or mathematics), then it
did so indirectly as Algol, Smalltalk, Simula and others (whatever the
first O-O languages were) took it from the math literature.

Oh please, and where do you think the mathematicians got the word from?
Philosophy. That's why I mentioned Plato and Aristotle too.
The word class had been used to describe classes long before set theory
was invented. The class concept in C++ was named that way because that
was what it was called in earlier languages; they used the word because
it described a similar concept in math, which used the name because it
described a similar concept in normal speech. Perhaps I missed some
steps, but only the first is really of interest.
Yea, my point exactly. Even if C++ got the name from Algol or Simula,
those most likely (and maybe even indirectly) got the name from set
theory, which in turn probably got it from philosophy.

I suppose which of those steps (or influences) are of interest depends
entirely on how abstract you want to think.

--Jonathan
Jan 13 '08 #59
On Jan 12, 8:31 pm, "Daniel T." <danie...@earthlink.net> wrote:
johanatan <johana...@gmail.com> wrote:
I think the only way to really say what influenced the C++ reference
would be to ask the designer himself.

No need, he published his reason for introducing the concept
(http://www.research.att.com/~bs/dne.html.)

The reference was introduced to facilitate operator overloading. [full
stop]
I mean the influence of the name 'reference'. Is it as I suspect
because that is the purest term in CS literature (and also because
other languages had a concept of 'pass-by-reference')?

Operator overloading only explains the reason for introducing the
construct (or the motivation for inventing it)--but not for naming it.

--Jonathan
Jan 13 '08 #60
On Jan 12, 9:05 pm, "Daniel T." <danie...@earthlink.net> wrote:
johanatan <johana...@gmail.com> wrote:
The point was simply provided as justification for the mental model I
have of references as pointers 'under the hood'.

Here's justification for you...

   The obvious implementation of a reference is as a (constant) pointer
   that is dereferenced each time it is used. It doesn't do much harm
   thinking about references that way, as long as one remembers that a
   reference isn't an object that can be manipulated the way a pointer
   is...

   In some cases, the compiler can optimize away a reference so that
   there is no object representing the reference at run-time.
   --Stroustrup TC++PL

It seems to me that references are necessary for operator overloading
and copy construction, but otherwise constant pointers could conceivably
be used instead.
They surely could, but part of the point with references (at least in
practice) is the 'syntactic sugar'.
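
A toy comparison makes the sugar visible: the two functions below do
the same job, but the reference version hides the address-taking and
the dereferencing.

// Caller writes swap_ref( x, y ).
void swap_ref( int& a, int& b )
{
    int tmp = a;
    a = b;
    b = tmp;
}

// Same effect, but the caller must write swap_ptr( &x, &y )
// and the body must dereference at each use.
void swap_ptr( int* a, int* b )
{
    int tmp = *a;
    *a = *b;
    *b = tmp;
}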
Stroustrup notes that "The main use of references is
for specifying arguments and return values for functions..." and I tend
to use them in those contexts only, as an optimization over passing by
value.
Yea, me too (but when I need to overload an operator, then I do so).
If Stroustrup says there's not much harm in thinking of them this way,
then how can anyone else say there is? I actually think there's no
harm because, when in doubt, I look at the disassembly; and in the
abstract, I've given plenty of reasoning why this mental image is
perfectly acceptable.

But, let's suppose for a minute that the mental image were somehow
incorrect (the determination of which would obviously be somewhat
subjective) but in practice, it produced no discernible difference from
that which would be produced by the ideal mental image (assuming that
there is such a thing--see below for why there is not). In that case
would it really matter that the two mental models differed?

Besides that, an abstraction by definition ignores some of the
details. So there really is no such thing as a perfect abstraction.
Any two given abstractions are going to be less than ideal-- the only
way they differ is in what specific ways they are less than ideal (or
in what specific information they each lose).

--Jonathan
Jan 13 '08 #61
On Jan 12, 5:31 am, johanatan <johana...@gmail.com> wrote:
On Jan 1, 4:38 am, James Kanze <james.ka...@gmail.com> wrote:
On Jan 1, 3:38 am, johanatan <johana...@gmail.com> wrote:
On Dec 31, 3:25 am, Erik Wikström <Erik-wikst...@telia.com> wrote:
According to the Java Language Specification 3rd ed. §15.1:
When an expression in a program is evaluated (executed), the result
denotes one of three things:
* A variable (§4.12) (in C, this would be called an lvalue)
* A value (§4.2, §4.3)
* Nothing (the expression is said to be void)
A value is of course something of value-type (int, double etc.) and
while they are not like lvalues they are not rvalues either.
So, does that mean that an expression cannot evaluate to a
reference to an object?
Just the opposite. An expression in Java never evaluates to an
object type, only to a variable or a value. Variables and
values can have either a basic type or a reference type; they
are never objects.
Well, given that the 'variables' in Java (which are not values, or
intrinsics) are 'references' (usually to objects), I'd say I'm a
little confused by your remark. Would you mind clarifying, please?
Just what don't you understand? An expression in Java never
evaluates to an object. An expression is always either a
reference or a basic type.

--
James Kanze (GABI Software) email:ja*********@gmail.com
Conseils en informatique orientée objet/
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
Jan 13 '08 #62
On 2008-01-13 06:41, johanatan wrote:
On Jan 12, 2:18 pm, Erik Wikström <Erik-wikst...@telia.com> wrote:
The alias supporters out there have quite an uphill
battle in front of them it seems (I'm surely not the only one who
'thinks' this way. And who's the thought police anyway?)

What battle? What makes you think that there is any battle going on?

Well, I was corrected very near the beginning of this thread for
thinking of references as pointers. And the Stanford link I sent had
quite a bias towards a 'black-box' view of references (and even went so
far as to say that students 'never need to worry about the details').
As I mentioned, I would have expected this much from a Java school,
but from a C++ school?
Because it gives an incomplete and somewhat incorrect picture. If you
think of them as pointers you might get the idea that you can perform
pointer arithmetic operations on them, or that they can be null. The
fact that references can be implemented in the same way as a pointer is
an implementation detail, and users really should not have to bother
about that.

Thinking of references as aliases is not perfect either, but it has
some advantages: it is not incorrect (merely incomplete), and it does
not bother with implementation details. For someone new to C++, thinking
about references as aliases is better than thinking of them as crippled
pointers, until the time comes that they start to understand what
references really are.
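
A contrived snippet showing the operations that separate the two:

void example()
{
    int x = 0, y = 1;

    int* p = &x;
    p = p + 1; // pointer arithmetic compiles
    p = 0;     // a pointer can be null
    p = &y;    // and a pointer can be reseated to another object

    int& r = x;
    r = r + 1; // arithmetic applies to x's value, never to r itself
    r = y;     // assigns y's value to x; r is NOT rebound to y
    // There is no null reference, and there is no way to reseat r.
}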
Furthermore, if we're going to get really technical about this, a
pointer is a type of reference. The word 'reference' is the most
general word (of a list of many) in common usage to describe the
concept. So, reference swallows pointer, and alias, and moniker, and
handle, and so forth.

That depends on what you mean when you use the word reference. If you
use it as it is used in normal talk, then yes. However, if by
reference you mean a C++ reference, then obviously no.
You really need to read the wiki page I referenced. (And please
realize it is simply a summary of the concept of 'reference' in CS).
I bet it even has 'references' at the bottom of the page if you think
it was produced by people with an agenda.
No, I do not. My point was that you claimed that a pointer is a
reference; I said that depends on what you mean by reference. A C++
pointer is not a C++ reference; a C++ pointer is a kind of CS reference,
and a C++ reference is a kind of CS reference. And a CS reference is a
kind of everyday-speech reference. Since we are in a C++ group, when you
use the word reference unqualified we will usually assume that you mean
a C++ reference.
And, the concept of 'class' ultimately came from set theory so if C++
didn't directly take the idea from 'pure CS' (or mathematics), then it
did so indirectly as Algol, Smalltalk, Simula and others (whatever the
first O-O languages were) took it from the math literature.

Oh please, and where do you think the mathematicians got the word from?

Philosophy. That's why I mentioned Plato and Aristotle too.
Who in turn got it from?
>The word class had been used to describe classes long before set theory
was invented. The class concept in C++ was named that way because that
was what it was called in earlier languages; they used the word because
it described a similar concept in math, which used the name because it
described a similar concept in normal speech. Perhaps I missed some
steps, but only the first is really of interest.

Yea, my point exactly. Even if C++ got the name from Algol or Simula,
those most likely (and maybe even indirectly) got the name from set
theory, which in turn probably got it from philosophy.
Who in turn got it from?
I suppose which of those steps (or influences) are of interest depends
entirely on how abstract you want to think.
Right, but I believe that the discussion was about where Stroustrup got
the name, and then only the first step is of interest.

--
Erik Wikström
Jan 13 '08 #63
Erik Wikström <Er***********@telia.com> wrote:
Thinking of references as aliases is not perfect either, but it has
some advantages: it is not incorrect (merely incomplete), and it does
not bother with implementation details. For someone new to C++, thinking
about references as aliases is better than thinking of them as crippled
pointers, until the time comes that they start to understand what
references really are.
Funny though: at some point in their education, someone new to C++ will
realize that references _are_ "crippled pointers" (that are
automatically dereferenced), and then comes the inevitable question of
why they are in the language. Then come these almost inevitable
discussions... :-)

Maybe if it were explained at the outset that references are "crippled
pointers", useful precisely because they are "less powerful" (and
therefore more focused on a task), people wouldn't get confused when
they come to the realization themselves.
Jan 13 '08 #64
On 2008-01-13 16:21, Daniel T. wrote:
Erik Wikström <Er***********@telia.com> wrote:
>Thinking of references as aliases is not perfect either, but it has
some advantages: it is not incorrect (merely incomplete), and it does
not bother with implementation details. For someone new to C++, thinking
about references as aliases is better than thinking of them as crippled
pointers, until the time comes that they start to understand what
references really are.

Funny though: at some point in their education, someone new to C++ will
realize that references _are_ "crippled pointers" (that are
automatically dereferenced), and then comes the inevitable question of
why they are in the language. Then come these almost inevitable
discussions... :-)

Maybe if it were explained at the outset that references are "crippled
pointers", useful precisely because they are "less powerful" (and
therefore more focused on a task), people wouldn't get confused when
they come to the realization themselves.
That would require that you teach them what pointers are early on. If
you teach them references first they might instead come to consider
pointers as references on steroids.

--
Erik Wikström
Jan 13 '08 #65
Erik Wikström <Er***********@telia.com> wrote:
Daniel T. wrote:
Erik Wikström <Er***********@telia.com> wrote:
Thinking of references as aliases is not perfect either, but it
has some advantages: it is not incorrect (merely incomplete),
and it does not bother with implementation details. For someone
new to C++, thinking about references as aliases is better than
thinking of them as crippled pointers, until the time comes
that they start to understand what references really are.
Funny though: at some point in their education, someone new to
C++ will realize that references _are_ "crippled pointers" (that
are automatically dereferenced), and then comes the inevitable
question of why they are in the language. Then come these almost
inevitable discussions... :-)

Maybe if it were explained at the outset that references are
"crippled pointers", useful precisely because they are "less
powerful" (and therefore more focused on a task), people wouldn't
get confused when they come to the realization themselves.

That would require that you teach them what pointers are early on.
If you teach them references first they might instead come to
consider pointers as references on steroids.
Good point. I'm looking at this from the POV of a C programmer coming
to C++, and from the POV of standard instructional texts that do in fact
teach pointers first.
Jan 13 '08 #66
On Jan 13, 2:39 am, James Kanze <james.ka...@gmail.com> wrote:
On Jan 12, 5:31 am, johanatan <johana...@gmail.com> wrote:
On Jan 1, 4:38 am, James Kanze <james.ka...@gmail.com> wrote:
On Jan 1, 3:38 am, johanatan <johana...@gmail.com> wrote:
On Dec 31, 3:25 am, Erik Wikström <Erik-wikst...@telia.com> wrote:
According to the Java Language Specification 3rd ed. §15.1:
When an expression in a program is evaluated (executed), the result
denotes one of three things:
* A variable (§4.12) (in C, this would be called an lvalue)
* A value (§4.2, §4.3)
* Nothing (the expression is said to be void)
A value is of course something of value-type (int, double etc.) and
while they are not like lvalues they are not rvalues either.
So, does that mean that an expression cannot evaluate to a
reference to an object?
Just the opposite. An expression in Java never evaluates to an
object type, only to a variable or a value. Variables and
values can have either a basic type or a reference type; they
are never objects.
Well, given that the 'variables' in Java (which are not values, or
intrinsics) are 'references' (usually to objects), I'd say I'm a
little confused by your remark. Would you mind clarifying, please?

Just what don't you understand? An expression in Java never
evaluates to an object. An expression is always either a
reference or a basic type.
Well, please re-read my question. I will quote here again for
clarity:
A value is of course something of value-type (int, double etc.) and
while they are not like lvalues they are not rvalues either.
So, does that mean that an expression cannot evaluate to a
reference to an object?
The question was 'can an expression evaluate to a reference to an
object'? The answer seems to me to be yes, and it also seems to be
yes according to you. So, you can do something like:

class Derived
{
    // stand-in for the omitted combination of Derived members
    Derived somemethod( Derived other ) { return other; }
}

class MainClass
{
    static Derived f( Derived one, Derived two )
    {
        // do some sort of combination involving omitted Derived members
        Derived retVal = one.somemethod( two );
        return retVal;
    }

    public static void main( String[] args )
    {
        // initialize one, two, and three
        Derived one = new Derived(), two = new Derived(), three = new Derived();
        Derived four = f( f( one, two ), three );
    }
}

So, the result of f is a variable (a reference variable) and is thus
an 'l-value' (according to the Java definition of such). As far as I
know this is no problem in Java, but in C++ the result of f(one, two)
would be an r-value and thus could not be passed to the outer
f(rval, three) if f took its parameters by non-const reference.

Is that not correct?

--Jonathan
Jan 15 '08 #67
In article <6d9e500e-3ad0-42c4-89c6-
e1**********@f10g2000hsf.googlegroups.com>, jo*******@gmail.com says...

[ ... ]
Another thought has occurred to me about the '&'-- why do you think the
same symbol was used for the 'address-of' operator and the 'reference'
marker? It seems quite a strange coincidence when there are still
other perfectly usable symbols unused in C++ (take $ for instance).
The dollar sign is part of ASCII, but has never been part of the C or
C++ basic character set. That symbol is also absent from quite a few other
character sets, such as ISO 646. C++ has gone to a fair amount of effort
to include features (trigraphs, digraphs, alternate symbols) to allow
writing its source code with restricted character sets, so using this
symbol would be _quite_ a strange addition to the language.

There seems to be quite a bit of resistance to adding any new/different
symbol or keyword to the language, even when/if avoiding it means
adding a new meaning to one that already means too much.

--
Later,
Jerry.

The universe is a figment of its own imagination.
Jan 15 '08 #68
On Jan 14, 8:38 pm, Jerry Coffin <jcof...@taeus.com> wrote:
In article <6d9e500e-3ad0-42c4-89c6-
e147d1318...@f10g2000hsf.googlegroups.com>, johana...@gmail.com says...

[ ... ]
Another thought has occurred to me about the '&'-- why do you think the
same symbol was used for the 'address-of' operator and the 'reference'
marker? It seems quite a strange coincidence when there are still
other perfectly usable symbols unused in C++ (take $ for instance).

The dollar sign is part of ASCII, but has never been part of the C or
C++ basic character set. That symbol is also absent from quite a few other
character sets, such as ISO 646. C++ has gone to a fair amount of effort
to include features (trigraphs, digraphs, alternate symbols) to allow
writing its source code with restricted character sets, so using this
symbol would be _quite_ a strange addition to the language.

There seems to be quite a bit of resistance to adding any new/different
symbol or keyword to the language, even when/if avoiding it means
adding a new meaning to one that already means too much.

Well, there are already plenty of other symbols in use too (some of
which are much less 'overloaded'): ?, %, ^, and ~ come to mind. Those
could just as easily have been overloaded as & was, which is already
serving triple-duty (or quad-duty, if you include &&) with 'address-of',
reference, bitwise-AND, and logical-AND.

Maybe there's a connection there between 'address-of' and reference.
I know there's always been one in my mind, but it is true that this
may not have been the intent of the designer (much as the reference
came to be used as 'syntactic sugar' more than for its original purpose
of operator overloading). But I find it a pretty convincing coincidence.

--Jonathan
Jan 16 '08 #69
