They read Unicode chars and produce the corresponding byte sequence
in whatever encoding you choose. For instance, if you use ASCII or
UTF-8 and pass "hi", you get a byte[] { 0x68, 0x69 }. With
Encoding.Unicode (UTF-16 little-endian), you get 0x68, 0x00, 0x69,
0x00: two bytes per character.
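A quick sketch of both cases (output bytes shown with BitConverter for readability):

```csharp
using System;
using System.Text;

class EncodeDemo
{
    static void Main()
    {
        // ASCII and UTF-8 both encode "hi" as one byte per character.
        byte[] utf8 = Encoding.UTF8.GetBytes("hi");
        Console.WriteLine(BitConverter.ToString(utf8));   // 68-69

        // Encoding.Unicode is UTF-16 little-endian: two bytes per char,
        // low byte first.
        byte[] utf16 = Encoding.Unicode.GetBytes("hi");
        Console.WriteLine(BitConverter.ToString(utf16));  // 68-00-69-00
    }
}
```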
The same applies in reverse when you get chars or a string from
bytes: the bytes are read and decoded back into the Unicode
characters they represent.
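Going the other direction looks like this; GetChars and GetString do the same decoding, they just hand the result back in different forms:

```csharp
using System;
using System.Text;

class DecodeDemo
{
    static void Main()
    {
        // The UTF-16 LE bytes for "hi" from the earlier example.
        byte[] utf16Bytes = { 0x68, 0x00, 0x69, 0x00 };

        // GetChars decodes the bytes into a char array...
        char[] chars = Encoding.Unicode.GetChars(utf16Bytes);
        Console.WriteLine(new string(chars));   // hi

        // ...and GetString decodes them straight into a string.
        string s = Encoding.Unicode.GetString(utf16Bytes);
        Console.WriteLine(s);                   // hi
    }
}
```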
Also, there is no Encoding.GetChars(string) overload, since it would
just return the chars of a string, and those are always in .NET's
internal representation (UTF-16 Unicode) already.
-mike
MVP
"Viorel" <vm*********@mo ldova.cc> wrote in message
news:eP******** *****@TK2MSFTNG P10.phx.gbl...
To me it is a little mysterious how the encoding and decoding
functions work. What happens underneath when they are called?
Encoding1.GetBytes(string1); in particular ASCII.GetBytes(string1)
Encoding1.GetChars(string1);
Encoding1.GetChars(arrayofbytes1);
string1 = Encoding1.GetString(arrayofbytes1);
If I understand correctly, a char is based on 2 bytes (16 bits),
and all strings in C# (.NET) are sequences of chars.
P.S. Please explain in terms of working with bytes (I come from the
C world).
I will appreciate it.