"aling" <li*********@163.com> wrote in message
news:11**********************@g14g2000cwa.googlegroups.com
Given following code snippets:
int a[10];
int* p=(int*)((&a)+1);
when I debugged the code in an IDE (VC7.1), I found that:
a = 0x0012fc50 (int [10])
p = 0x0012fc78 (int *)
(&a)+1 = 0x0012fc51 (char *)
(&a)+2 = 0x0012fc52 (char *)
(&a)+3 = 0x0012fc53 (char *)
Why isn't p equal to (&a)+1? Why did "(&a)+1" become a "char*" type?
I take it that you typed (&a)+1 into the debugger's Watch window. The
debugger's ability to evaluate expressions is imperfect (it doesn't always
understand the data types you enter). Here it has apparently treated &a as a
char *, so each +1 advances the address by a single byte instead of by
sizeof(int[10]) == 40 bytes, which is why you see 0x0012fc51 rather than
0x0012fc78. What you are seeing is purely a debugger limitation; in the
program itself, p is equal to (&a)+1. For what it is worth, a pre-release
version of VC++ 8 shows (&a)+1 correctly and equal to p.
--
John Carson