Bytes | Software Development & Data Engineering Community
receiving a C struct via sockets

I am sending a C struct from a C app via sockets to a C# server. I
occasionally get an error when reading a C char array using the
BinaryReader class. The error is:

"The output char buffer is too small to contain the decoded characters"

The code that reads a C string looks like this (br is a BinaryReader):

private string ReadString(int maxLen)
{
    char[] str = br.ReadChars(maxLen);
    int i = 0;
    while (str[i] != 0)
        i++;
    return new string(str, 0, i);
}

I'm pretty sure what is happening here is that I have garbage at the end
of the C char array after the NULL (0) terminator, and C# is trying to
decode these garbage values.

My question is: what is the best way for me to read these 40 bytes in and
convert them to a C# string only up to where the NULL (0) byte is in the C
char array?
Thanks!
Sep 9 '08 #1
Would setting the Encoding of the BinaryReader to AsciiEncoding fix this?
Sep 9 '08 #2
On Mon, 08 Sep 2008 21:51:48 -0700, Mike Margerum <ju**@mail.com> wrote:
Would setting the Encoding of the BinaryReader to AsciiEncoding fix this?
I doubt it. But it's so simple for you to check yourself, I'm not sure
why you posted that part of the question here.

That said, if the encoding for the data is ASCII, then you should
definitely be setting the encoding appropriate to that. The default is
UTF-8 if I recall correctly, and that will deal with ASCII data too just
because of the way the encodings overlap. But IMHO it's a better practice
to set the encoding appropriately. If nothing else, it will make
corrupted data easier to detect, because the encoder will know to not
expect anything _but_ valid ASCII.

Now, as far as the actual situation goes, if your string is variable
length and delimited by a null terminator, then you will have to look for
the null terminator in the _bytes_ that were read before converting, and
then pass only the bytes known to be part of the string to an encoder for
conversion to C# characters (UTF-16).

Of course, if the string is constant length, then just read that length.
:)
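A minimal sketch of that byte-first approach (the names and the 5-byte demo field are mine, not from the thread; the real field would be 40 bytes):

```csharp
using System;
using System.IO;
using System.Text;

public static class CStringDemo
{
    // Read a fixed-size C char field as raw bytes, then decode only
    // the bytes before the NULL terminator (assumes ASCII data).
    public static string ReadFixedCString(BinaryReader br, int fieldLen)
    {
        byte[] buf = br.ReadBytes(fieldLen);           // raw bytes, no char decoding yet
        int end = Array.IndexOf(buf, (byte)0);         // locate the NULL terminator
        if (end < 0) end = buf.Length;                 // no terminator: take the whole field
        return Encoding.ASCII.GetString(buf, 0, end);  // decode only the string part
    }

    public static void Main()
    {
        // "Hi\0" followed by two garbage bytes, in a 5-byte fixed field
        var data = new byte[] { 0x48, 0x69, 0x00, 0xFE, 0xFF };
        using var br = new BinaryReader(new MemoryStream(data));
        Console.WriteLine(ReadFixedCString(br, 5)); // prints "Hi"
    }
}
```

Because the garbage bytes after the NULL are never handed to the decoder, they can't trigger the "output char buffer" error the way ReadChars can.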

Pete
Sep 9 '08 #3
Mike Margerum wrote:
I am sending a C struct from a C app via sockets to a C# server. I
occasionally get an error when reading a C char array using the
BinaryReader Class. the error is:

"The output char buffer is too small to contain the decoded characters"

The code that reads a C string looks like this (br is a binaryreader)
private string ReadString(int maxLen)
{
char[] str = br.ReadChars(maxLen);
int i = 0;
while (str[i] != 0)
i++;
return new string(str, 0, i);
}

I'm pretty sure what is happening here is I have garbage in the end of
the C char array after the 0 NULL terminator and C# is trying to encode
these garbage values.
I would write an extension method, or write a descendant of
BinaryReader, that directly reads bytes from BaseStream until the 0 byte,
and do the conversion from that; either a simple conversion for ASCII,
or by buffering in a MemoryStream, List<byte> or similar and thence
through an Encoding.

I'd also make sure the BaseStream is a BufferedStream to avoid
performance costs of calling e.g. a NetworkStream looking for single
bytes.
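A rough sketch of that extension-method idea (assuming ASCII data; the name ReadCString is made up):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

public static class BinaryReaderExtensions
{
    // Read bytes from the underlying stream until a 0 byte or end of
    // stream, then decode them as ASCII. Sketch only; wrap the
    // BaseStream in a BufferedStream so these single-byte reads don't
    // each hit the NetworkStream.
    public static string ReadCString(this BinaryReader br)
    {
        var bytes = new List<byte>();
        int b;
        while ((b = br.BaseStream.ReadByte()) > 0) // stops at NULL (0) or EOF (-1)
            bytes.Add((byte)b);
        return Encoding.ASCII.GetString(bytes.ToArray());
    }
}
```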

-- Barry

--
http://barrkel.blogspot.com/
Sep 9 '08 #4
Thanks Barry.

If I read the data into a byte array (using BinaryReader.ReadBytes)
instead of a char array would I still have encoding issues?

If not, I'll go the route you described below.

Thanks

Barry Kelly wrote:
Mike Margerum wrote:
>I am sending a C struct from a C app via sockets to a C# server. I
occasionally get an error when reading a C char array using the
BinaryReader Class. the error is:

"The output char buffer is too small to contain the decoded characters"

The code that reads a C string looks like this (br is a binaryreader)
private string ReadString(int maxLen)
{
char[] str = br.ReadChars(maxLen);
int i = 0;
while (str[i] != 0)
i++;
return new string(str, 0, i);
}

I'm pretty sure what is happening here is I have garbage in the end of
the C char array after the 0 NULL terminator and C# is trying to encode
these garbage values.

I would write an extension method, or write a descendant of
BinaryReader, that directly read bytes from BaseStream until the 0 byte,
and do the conversion from that; either a simple conversion for ASCII,
or by buffering in a MemoryStream, List<byte> or similar and thence
through an Encoding.

I'd also make sure the BaseStream is a BufferedStream to avoid
performance costs of calling e.g. a NetworkStream looking for single
bytes.

-- Barry
Sep 9 '08 #5
Peter Duniho wrote:
On Mon, 08 Sep 2008 21:51:48 -0700, Mike Margerum <ju**@mail.com> wrote:
>Would setting the Encoding of the BinaryReader to AsciiEncoding fix this?

I doubt it. But it's so simple for you to check yourself, I'm not sure
why you posted that part of the question here.
It's not happening in my test environment and I'm not going to
experiment on the production server.
Now, as far as the actual situation goes, if your string is variable
length and delimited by a null terminator, then you will have to look
So do you think I can use BinaryReader.ReadBytes to read the entire
array into a byte buffer and then look for the NULL?
Thanks
Sep 9 '08 #6
Mike Margerum wrote:
Thanks barry.

If I read the data into a byte array (using BinaryReader.ReadBytes)
As long as the C-side is sending out a fixed buffer, that's fine, I'm
sure. I was assuming the worst case: an unknown number of bytes
terminated by null.
instead of a char array would I still have encoding issues?

If not, I'll go the route you described below.

Thanks

Barry Kelly wrote:
Mike Margerum wrote:
I am sending a C struct from a C app via sockets to a C# server. I
occasionally get an error when reading a C char array using the
BinaryReader Class. the error is:

"The output char buffer is too small to contain the decoded characters"

The code that reads a C string looks like this (br is a binaryreader)
private string ReadString(int maxLen)
{
char[] str = br.ReadChars(maxLen);
int i = 0;
while (str[i] != 0)
i++;
return new string(str, 0, i);
}

I'm pretty sure what is happening here is I have garbage in the end of
the C char array after the 0 NULL terminator and C# is trying to encode
these garbage values.
I would write an extension method, or write a descendant of
BinaryReader, that directly read bytes from BaseStream until the 0 byte,
and do the conversion from that; either a simple conversion for ASCII,
or by buffering in a MemoryStream, List<byte> or similar and thence
through an Encoding.

I'd also make sure the BaseStream is a BufferedStream to avoid
performance costs of calling e.g. a NetworkStream looking for single
bytes.

-- Barry
-- Barry

--
http://barrkel.blogspot.com/
Sep 9 '08 #7
On Tue, 09 Sep 2008 07:02:29 -0700, Mike Margerum <ju**@mail.com> wrote:
Peter Duniho wrote:
>On Mon, 08 Sep 2008 21:51:48 -0700, Mike Margerum <ju**@mail.com> wrote:
>>Would setting the Encoding of the BinaryReader to AsciiEncoding fix
this?
I doubt it. But it's so simple for you to check yourself, I'm not
sure why you posted that part of the question here.

It's not happening in my test environment and i'm not going to
experiment on the production server.
All due respect, unless you come up with a repro case on your test
environment, you _are_ going to test any change that you implement on
the production server.

If you care about avoiding testing on the production server (and IMHO
that's a wise thing to care about), you need to force the issue on the
test server. It's true, in some cases (and perhaps this one) that can be
non-trivial. But it's something you just have to do. Otherwise you are
testing on the production server.
>Now, as far as the actual situation goes, if your string is variable
length and delimited by a null terminator, then you will have to look

So do you think I can use BinaryReader.ReadBytes to read the entire
array into a byte buffer and then look for the NULL?
Personally, I would change the code so that you are not using
BinaryReader, and just reading data from the socket straight into a
byte[]. Do all the processing on the data _after_ you get it from the
socket.
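For the fixed-size case, that might look like the sketch below (my names; assumes the socket is wrapped in a Stream such as a NetworkStream, and loops because Read may return fewer bytes than requested):

```csharp
using System;
using System.IO;

public static class SocketHelpers
{
    // Read exactly count bytes from a stream (e.g. a NetworkStream)
    // into a fresh buffer, or throw if the connection closes early.
    public static byte[] ReadExactly(Stream s, int count)
    {
        var buf = new byte[count];
        int got = 0;
        while (got < count)
        {
            int n = s.Read(buf, got, count - got); // may be a partial read
            if (n == 0)
                throw new EndOfStreamException("connection closed mid-field");
            got += n;
        }
        return buf;
    }
}
```

Once the full 40-byte field is in the buffer, finding the NULL and decoding are ordinary in-memory operations with no stream state to worry about.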

You can use BinaryReader.ReadBytes() if you want, but it won't return
until it's either read the number of bytes you asked for, or it reaches
the end of the stream. Either way, you are likely to at some point have
to deal with bytes you've already read from the stream, returned by
BinaryReader.ReadBytes() but needed for processing as something other than
the string you're working on. It's my opinion that all of that is only
made more complicated through the use of BinaryReader.

BinaryReader can be very useful when you know in advance the exact size of
everything you're reading. But I think in this situation, it's probably
counter-productive.

Pete
Sep 9 '08 #8
All due respect, unless you come up with a repro case on your test
environment, you _are_ going to test any change that you implement on
the production server.
With all due respect, I've been doing C/C++ development for 18 years and
really don't need advice on testing/deployment methodologies. About
12 of those years involve sockets/middleware development. You have no
idea how I have things set up, so kindly stay on topic here.

I can turn this code off on my middleware and revert back to FTP. There
are other platforms this middleware is serving well.

Personally, I would change the code so that you are not using
BinaryReader, and just reading data from the socket straight into a
byte[]. Do all the processing on the data _after_ you get it from the
socket.
I'll look into this. Thanks. Part of the problem is I'm obviously new
to C# and don't have my head wrapped around the enormous library.
You can use BinaryReader.ReadBytes() if you want, but it won't return
until it's either read the number of bytes you asked for, or it reaches
the end of the stream. Either way, you are likely to at some point have
I *am* reading a fixed-size array. The data will always be 40 bytes;
the string itself is variable in size because of the null terminator.
BinaryReader can be very useful when you know in advance the exact size
of everything you're reading. But I think in this situation, it's
probably counter-productive.
I do know the exact size and this is what is puzzling me. It reads fine
100% of the time for me but I am getting the error from folks in the
field.

Sep 9 '08 #9
As long as the C-side is sending out a fixed buffer, that's fine, I'm
sure. I was assuming the worst case: an unknown number of bytes
terminated by null.
Yes, it is fixed size, Barry. I think ReadBytes is going to work in
this instance. I'll keep you posted.
Thanks so much for the help.
Sep 9 '08 #10
Reading the C char array in as a byte array using BinaryReader.ReadBytes
seems to have solved the problem.

Thanks very much for the help
Sep 10 '08 #11