In article <11*********************@g14g2000cwa.googlegroups.com>,
<ra***********@gmail.com> wrote:
> I am not sure whether the question I am asking makes sense, but I
> have heard that storing data in big endian helps get a higher
> transfer rate -- that is, we can achieve better speed by storing
> the data in big endian while transferring. If this is correct,
> how is it possible, and what is the reason behind it?
Unless perhaps there is some data compression equipment inline
that uses poor algorithms, there should be pretty much no
measurable transmission speed difference for big endian or
little endian data.
However, big endian is the standard "network byte order", so
if you are trying to transfer numeric fields and you are
on a little-endian machine, there is less preparation work
to be done if you are starting from data that is internally
stored as big endian. This would, though, not make any noticeable
difference to data transmission speeds unless your processor
is having trouble keeping up with the network bandwidth
(which would normally only happen with either a very fast and
short link, or else a very slow processor.)
In most cases, the processor speed is sufficient to keep the
network busy either way. Even with very fast links, the
limitation is usually not the processor speed but rather the
bandwidth of the bus to the network card.
There is one other situation where the transmission endianness can
make a difference: if you are transmitting in the opposite
endianness from what the receiver needs, and the receiving
application is having a hard time keeping up with decoding the
transmissions, then putting the data into the receiver's native
format could -potentially- reduce the load on the receiver by
enough to make a difference in whether the receiving machine's
buffers fill up, and hence whether it drops packets. If you are in
that position, though, your setup is already very fragile, and
random events unrelated to your processing could lead to massive
performance problems for you.
--
This signature intentionally left... Oh, darn!