Drew <so*****@hotmail.com> wrote:
> Oh, it looks like I can just cast the char to an int.
> Is that the best way to do this?
Yes, that's fine. Bear in mind that it will give you the Unicode
value, which is the same as the ASCII value for all ASCII
characters. If you want to check whether the character was actually
an ASCII character to start with, just check whether the value is
< 128.
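
For example, something like this (just a quick sketch, untested, and
I'm assuming C# here since the language isn't stated above; the class
and variable names are arbitrary):

using System;

class CharValueDemo
{
    static void Main()
    {
        char c = 'A';

        // The cast gives the character's Unicode value.
        int value = (int) c;
        Console.WriteLine(value);                  // 65

        // Everything from U+0000 to U+007F maps straight onto ASCII,
        // hence the < 128 test.
        Console.WriteLine(value < 128 ? "ASCII" : "non-ASCII");
    }
}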
--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too