Well, I couldn't explain it, but I noticed in the past that my 64-bit
capable app (compiled with the ALL flag in VS.NET 2003) running on a 64-bit
operating system performed much better.
This was at the time on Windows Server 2003 x64, on an Acer 1524 WLMI (AMD
x64) with 2 GB of memory.
So I guess the 64-bit framework did generate better-optimized code. By the
way, the project was a business logic server which implemented a .NET
Remoting interface that queried a lot of data and in the end sent it to
connected clients through a TCP channel.
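For context, here is a minimal sketch of what such a Remoting server setup
looks like; the type name DataService and port 8085 are my assumptions here,
not the original project's code:

using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp;

// Hypothetical remotable service standing in for the business logic server.
public class DataService : MarshalByRefObject
{
    // Stand-in for the "query a lot of data" step.
    public string[] QueryData()
    {
        return new string[] { "row 1", "row 2" };
    }
}

public class ServerHost
{
    public static void Main()
    {
        // Listen for clients on an assumed TCP port.
        ChannelServices.RegisterChannel(new TcpChannel(8085));

        // Expose the service as a server-activated singleton.
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(DataService), "DataService",
            WellKnownObjectMode.Singleton);

        Console.WriteLine("Server running; press Enter to stop.");
        Console.ReadLine();
    }
}

A client would then connect with something like
Activator.GetObject(typeof(DataService), "tcp://host:8085/DataService").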
Regards,
Michel
"Family Tree Mike" <Fa************@discussions.microsoft.comschreef in
bericht news:6E**********************************@microsof t.com...
I'm pretty sure that if the specs for the systems are the same, and one
system is simply 64-bit versus 32-bit, you won't see a difference. Where you
could see a difference is in code that requires the large address space and
uses the available memory rather than segmenting the problem into small
chunks of memory. That isn't a processor difference, but an algorithm
difference.
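[A small sketch I'm adding to illustrate that point, not part of the quoted
post: checking IntPtr.Size at runtime shows whether the process actually got
the 64-bit address space.]

using System;

public class BitnessCheck
{
    public static void Main()
    {
        // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process.
        Console.WriteLine(IntPtr.Size == 8
            ? "Running as a 64-bit process"
            : "Running as a 32-bit process");
    }
}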
"PJ6" wrote:
> This is probably an old question, but Google isn't giving me anything
> useful. Can someone point me to any articles written on the performance
> differences for .NET applications compiled for and running on 32-bit vs
> 64-bit systems?
>
> Thanks,
> Paul