Malcolm wrote:
"BRG" <br*@nowhere.orgwrote
>The original poster claimed that the redefinition of types to obtain
types of guaranteed length or precision was based on the false
assumption that this makes their code more portable.
I pointed out _only_ that their assumption would not _always_ be false.
I am confused by your post because you respond 'No' to this point but
then go on to give details that are inconsistent with this response.
You asserted a negative so "no" signifies concurrence.
OK, thanks; I did not read it that way.

> Redefining every basic type can make code more portable, if you really
> know what you are doing and the code is a self-contained unit.

Or, as with many cryptographic algorithms, when the specification of an
algorithm calls for a type of a specific width. This often means that a
value which must be declared as an int on one class of machine has to
be declared as a long on another. On some systems the choice of type
can even change with the compiler flags in use.
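
For example (a sketch only, and the typedef name word32 is just
illustrative): before <stdint.h> was available, a 32-bit unsigned type
was commonly selected at compile time using the <limits.h> macros,
along these lines:

  #include <limits.h>

  #if UINT_MAX == 0xffffffffUL
  typedef unsigned int word32;    /* int is 32 bits on this platform  */
  #elif ULONG_MAX == 0xffffffffUL
  typedef unsigned long word32;   /* int is narrower, fall back to long */
  #else
  #error "no 32-bit unsigned integer type available"
  #endif

The same source then picks int or long as appropriate on each platform,
which is exactly the kind of type switching described above.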

> Generally it is a bad idea that makes code hard to reuse, maybe even
> hard to port.

I agree that, in portability terms, this is neither always good nor
always bad. Whether it helps or harms portability cannot be stated as
an absolute; it depends on the application context. Which is why I
contested the original claim.

Brian Gladman