
Converting a single ASCII character to an int

I am a total newbie, trying to slog through the Visual C# Express
application. I need to be able to convert a single ASCII character (can be
anything from 0 to 255) to an int for use in other places. So far, I cannot
find anything that works. My application gets a string of characters from an
external device via the serial port. I can use the substring method to get
just one character from that input string, and I need to be able to convert
that character's ASCII value to an int. Is there a straightforward way to do
this, similar to atoi in C?
Aug 30 '06 #1
6 Replies


"davetelling" <da*********@discussions.microsoft.com> wrote in message
news:9C**********************************@microsoft.com...
<original question snipped>
string theString = "ABC";
int asciiA = theString[0];   // 'A' == 65
int asciiB = theString[1];   // 'B' == 66
int asciiC = theString[2];   // 'C' == 67
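Indexing a string gives you a System.Char, and a char widens to int implicitly, which is why no cast is needed above. As a minimal sketch of the same idea applied to a string received over the serial port (the variable names are just for illustration):

string received = "A5";                   // stands in for the string read from the port
int firstValue = received[0];             // 'A' == 65, no cast required
System.Console.WriteLine(firstValue);     // prints 65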
Aug 30 '06 #2

"davetelling" <da*********@discussions.microsoft.com> wrote in message
news:9C**********************************@microsoft.com...
<original question snipped>
davetelling,

If you have a System.String reference, you can use the String class' indexer
to get individual elements.
From here, you may look at using the System.Convert class, specifically its
ToUInt32 static method.

An example of usage:
----------------------

string input = "Hello World!";
uint code;

foreach (char c in input)
{
    code = Convert.ToUInt32(c);
    System.Console.WriteLine("{0}'s integer value is {1}.", c, code);
}
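
Run against "Hello World!", the first few lines of output would look like this:

H's integer value is 72.
e's integer value is 101.
l's integer value is 108.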

-MH
Aug 30 '06 #3

davetelling,

In addition to the given answers, be aware that ASCII is a 7-bit character
set (0-127).

Therefore you will seldom get well-defined results when converting characters
from the unofficial "extended ASCII" character sets, which were created in all
kinds of flavours by Microsoft/IBM in the early days of the PC, when it still
used an 8-bit processor.

Just as an addition.

Cor
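
To put Cor's remark in concrete terms, here is a minimal sketch (the two code pages chosen are just examples): the same byte decodes to different characters depending on which code page you pick, so the 128-255 range has no single fixed meaning.

using System;
using System.Text;

class CodePageDemo
{
    static void Main()
    {
        byte[] data = { 0xF8 };   // one byte from the 128-255 range

        // ISO-8859-1 (Latin-1) maps 0xF8 to 'ø' ...
        Console.WriteLine(Encoding.GetEncoding(28591).GetString(data));

        // ... while the old IBM PC code page 437 maps it to '°'.
        Console.WriteLine(Encoding.GetEncoding(437).GetString(data));
    }
}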

"davetelling" <da*********@discussions.microsoft.comschreef in bericht
news:9C**********************************@microsof t.com...
>I am a total newbie, trying to slog through the Visual C# Express
application. I need to be able to convert a single ASCII character (can be
anything from 0 to 255) to an int for use in other places. So far, I
cannot
find anything that works. My application gets a string of characters from
an
external device via the serial port. I can use the substring method to get
just one character from that input string, and I need to be able to
convert
that character's ASCII value to an int. Is there a straightforward way to
do
this, similar to atoi in C?

Aug 31 '06 #4

It is not necessary to use Convert. As John pointed out, you can do this very
easily.
An int is a wider data type than char, so a char can be stored in an int with an
implicit conversion.

char c = 'ø';   // 'ø' has character code 248
int d = c;      // implicit char-to-int conversion

Note also that ASCII covers only characters 0-127; characters 128-255 vary
with the code page in use, and on my computer d == 248.
--
Happy Coding!
Morten Wennevik [C# MVP]
Aug 31 '06 #5

I appreciate the various replies. In my application, the data coming in are
not really ASCII - they are just bytes that represent data values coming from
a device with a range of 0-255, so the simple indexing approach
(int asciiA = theString[x];) seems to work fine. I will try the other methods
to see if they can be used in other areas, however.
Thanks again!

"Cor Ligthert [MVP]" wrote:
<quoted message snipped>


Aug 31 '06 #6

davetelling <da*********@discussions.microsoft.com> wrote:
> I appreciate the various replies. In my application, the data coming in are
> not really ASCII - they are just bytes that represent data values coming from
> a device with a range of 0-255
In that case you should read them as binary data (bytes) instead of
text data (chars). Don't convert them into text data at all. If you
arbitrarily convert binary data to text data, sooner or later you're
pretty much bound to run into issues.
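
A minimal sketch of that approach, assuming the data arrives through System.IO.Ports.SerialPort (the port name, baud rate, and buffer size below are placeholders):

using System;
using System.IO.Ports;

class ReadRawBytes
{
    static void Main()
    {
        // Placeholder settings; adjust to match the actual device.
        using (SerialPort port = new SerialPort("COM1", 9600))
        {
            port.Open();

            byte[] buffer = new byte[64];
            int count = port.Read(buffer, 0, buffer.Length);   // raw bytes, no text decoding

            for (int i = 0; i < count; i++)
            {
                int value = buffer[i];   // each byte is already in the 0-255 range
                Console.WriteLine(value);
            }
        }
    }
}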

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet Blog: http://www.msmvps.com/jon.skeet
If replying to the group, please do not mail me too
Sep 4 '06 #7
