Greetings!
I am trying to write a color class that is a bit more intelligent than
C#'s System.Color structure, which means I'm using a bunch of
bitmasks. My class has the following method:
public static HTColor FromArgb(int argb)
{
    int colorValue = 0;
    int transparency = (argb & 0xff000000) >> 24;
    int red = (argb & 0xff0000) >> 16;
    int green = (argb & 0x00ff00) >> 8;
    int blue = argb & 0x0000ff;
    HTColor result;
    result.m_netColor = Color.FromArgb(red, green, blue);
    return result;
}
The transparency line throws an error saying that I cannot
implicitly convert an object of type 'long' to type 'int'.
But the C# standard for literals includes the following:
"The type of an integer literal is determined as follows:
- If the literal has no suffix, it has the first of these types in
which its value can be represented: int, uint, long, ulong.
- If the literal is suffixed by U or u, it has the first of these types
in which its value can be represented: uint, ulong.
- If the literal is suffixed by L or l, it has the first of these types
in which its value can be represented: long, ulong.
- If the literal is suffixed by UL, Ul, uL, ul, LU, Lu, lU, or lu, it
is of type ulong."
(quoted from
http://www.jaggersoft.com/csharp_sta...er-type-suffix)
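For reference, these rules can be checked directly with GetType(); the
types in the comments are what I'd expect from the quoted rules (a quick
sketch, not part of my class):

```csharp
using System;

class LiteralTypes
{
    static void Main()
    {
        Console.WriteLine(0x7f000000.GetType());    // System.Int32  -- fits in int
        Console.WriteLine(0xff000000.GetType());    // System.UInt32 -- too big for int, fits in uint
        Console.WriteLine(0x1ff00000000.GetType()); // System.Int64  -- too big for uint, fits in long
        Console.WriteLine(400U.GetType());          // System.UInt32 -- U suffix: uint or ulong
        Console.WriteLine(400L.GetType());          // System.Int64  -- L suffix: long or ulong
    }
}
```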
Since int and uint are 32 bits wide and 0xff000000 fits in 32
bits, it seems to me that this literal should be typed as an
int.
I get the same error if my literal is 0x8f000000, and I don't get the
error if my literal is 0x7f000000.
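For what it's worth, two rewrites of the transparency line do compile for
me (sketches only, not tested beyond that): shifting before masking keeps
every literal within int range, and an unsigned shift avoids dragging sign
bits into the result.

```csharp
// Shift first, then mask; the literal 0xff is an int, so everything stays int.
int transparency = (argb >> 24) & 0xff;

// Or do the shift on a uint so the top bits come in as zeros.
int transparency2 = (int)((uint)argb >> 24);
```

I would still like to know why the original masking form trips the
compiler, though.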
Thank you very much.
Rob Richardson
RAD-CON, Inc.
P.S. Yes, I know "transparency" is never used. I'm not sure if I even
want it, but I do want to understand the literal type issue it
illustrates.