Bytes IT Community

Converting byte[] to string

Hi, I'm a beginner so don't shoot ;)
I'm reading a wave file into a byte[] and trying to convert the result to
a String, but the converted string is altered, so when I generate a new
wave file from that string, the content differs from the original. Below
is the code snippet I'm using:

// Here I read the wave file and I convert the result to a string
byte[] b = new byte[35000];
FileStream fs = File.OpenRead("test.wav");
int size = fs.Read(b, 0, b.Length);
string dataSample = Encoding.ASCII.GetString(b, 0, size);
// Here I generate a second wave file from the result, but the generated
// file contains noise and disturbances
FileStream f = File.Create("test2.wav");
f.Write(Encoding.ASCII.GetBytes(dataSample), 0, size);

I also tried some other encodings, such as UTF7, UTF8, Default, and
ISO-8859-1, but the result is the same.
I'm using a function from an assembly that needs a String parameter as
input, which should hold a wave buffer.
Can somebody help me? What am I doing wrong? Is there another way to read a
binary file into a String object?
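For reference, the corruption comes from the ASCII round-trip itself: Encoding.ASCII only covers byte values 0-127, and on decode every byte above that is replaced with '?' (0x3F), so the original bytes are unrecoverable. A minimal sketch demonstrating the loss:

```csharp
using System;
using System.Text;

class AsciiLossDemo
{
    static void Main()
    {
        // 0x80 and 0xFF are outside the 0-127 ASCII range.
        byte[] original = { 0x41, 0x80, 0xFF };

        // Decoding maps every non-ASCII byte to '?' (0x3F).
        string s = Encoding.ASCII.GetString(original);

        // Encoding back cannot restore 0x80 or 0xFF.
        byte[] roundTripped = Encoding.ASCII.GetBytes(s);

        Console.WriteLine(BitConverter.ToString(original));     // 41-80-FF
        Console.WriteLine(BitConverter.ToString(roundTripped)); // 41-3F-3F
    }
}
```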
Nov 16 '05 #1
8 Replies


Marius Cabas <ma**********@hotmail.com> wrote:
Hi, I'm a beginner so don't shoot ;)
I'm reading a wave file into a byte[] and I'm trying to convert the result
to String


That's a very bad idea. Strings are for text data. Wave files are
binary data. Just keep it in byte array format.
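A minimal sketch of copying the file with no string conversion at all (File.ReadAllBytes/WriteAllBytes are the .NET 2.0 convenience methods; on 1.1 you'd loop over FileStream.Read instead):

```csharp
using System.IO;

class BinaryCopy
{
    static void Main()
    {
        // Read the whole file as raw bytes; no text encoding is involved,
        // so every byte value 0-255 survives untouched.
        byte[] data = File.ReadAllBytes("test.wav");

        // Write the same bytes back out; the copy is bit-for-bit identical.
        File.WriteAllBytes("test2.wav", data);
    }
}
```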

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 16 '05 #2

Marius,

You can use Convert.ToBase64String & Convert.FromBase64String to correctly
encode and decode binary data in a string.
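As a quick sketch of that round trip:

```csharp
using System;

class Base64RoundTrip
{
    static void Main()
    {
        byte[] data = { 0x52, 0x49, 0x46, 0x46, 0x80, 0xFF };

        // Base64 maps every 3 bytes to 4 characters from a 64-character
        // ASCII alphabet, so no byte value can be lost in a string.
        string encoded = Convert.ToBase64String(data);
        byte[] decoded = Convert.FromBase64String(encoded);

        Console.WriteLine(BitConverter.ToString(decoded)); // 52-49-46-46-80-FF
    }
}
```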

However, I suspect this is not really what you want to do. It looks like the
problem may be in the semantics of the method in the assembly you are trying
to call. Does this really expect a string? And if so, perhaps there is some
documentation somewhere which specifies what format this string should be. As
Jon said in the previous post, strings are not used to store binary data.

If there is still a problem, perhaps you can post some details of the method
& assembly you are trying to call.

Cheers,
Chris.

"Marius Cabas" wrote:
Hi, I'm a beginner so don't shoot ;)
I'm reading a wave file into a byte[] and I'm trying to convert the result
to String but the converted string is altered [...]

Nov 16 '05 #3

Hi,

Why are you converting a wave file that is a BINARY file to text ?

cheers,

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation

"Marius Cabas" <ma**********@hotmail.com> wrote in message
news:uR**************@TK2MSFTNGP14.phx.gbl...
Hi, I'm a beginner so don't shoot ;)
I'm reading a wave file into a byte[] and I'm trying to convert the result
to String but the converted string is altered [...]

Nov 16 '05 #4

Because I have to pass the binary stream to an SSL socket function that
takes a String object as a parameter.

"Ignacio Machin ( .NET/ C# MVP )" <ignacio.machin AT dot.state.fl.us> wrote
in message news:#j**************@TK2MSFTNGP14.phx.gbl...
Hi,

Why are you converting a wave file that is a BINARY file to text ?
[...]


Nov 16 '05 #5

Yeah, I know this, but I have to do it because I'm using an assembly written
by a third party. This assembly contains an SSL socket class that takes a
String object as a parameter. This String object holds the data to send over
TCP/IP via SSL. I have no choice :(

"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
That's a very bad idea. Strings are for text data. Wave files are
binary data. Just keep it in byte array format.

Nov 16 '05 #6

Marius Cabas <ma**********@hotmail.com> wrote:
Yeah, I know this, but I have to do it because I'm using an assembly written
by a third party. This assembly contains an SSL socket class that takes a
String object as a parameter. This String object holds the data to send over
TCP/IP via SSL. I have no choice :(


Hmm... I would contact the third party and check this. SSL is designed
for streams really - there's no justifiable reason why you *should*
have to specify everything in terms of strings. It's just asking for
trouble.

Are you able to specify the encoding the SSL code will use?

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 16 '05 #7


"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message

Are you able to specify the encoding the SSL code will use?


No, I have no control. I can only connect to a remote host using a port
number, and I can set the certificates. Afterwards, I can read and write data
from/to the socket.
Nov 16 '05 #8

Marius Cabas <ma**********@hotmail.com> wrote:
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
Are you able to specify the encoding the SSL code will use?


No, I have no control. I can only connect to a remote host using a port
number and I can set the certificates. Afterwards, I can read and write data
from/to the socket.


And you can only read/write data from/to the socket in string form?
What a terrible interface.

Basically, you won't be able to transfer binary data correctly unless
you can use something like Base64 encoding at both ends. If you don't
have control over the other end, you're stuffed.
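If you did control both ends, the pattern would look something like this (SendString here is a hypothetical stand-in for the third-party class's string-based send method, whose real name I don't know):

```csharp
using System;

class SslStringWorkaround
{
    // Hypothetical stand-in for the third-party SSL class's
    // string-based send method.
    static void SendString(string payload)
    {
        Console.WriteLine("sending {0} chars", payload.Length);
    }

    static void Main()
    {
        // In the real code this would be the wave file's contents.
        byte[] wave = { 0x52, 0x49, 0x46, 0x46, 0x80, 0xFF };

        // Base64 uses only ASCII characters, so a string-based transport
        // carries it without corruption.
        SendString(Convert.ToBase64String(wave));

        // The receiving end must call Convert.FromBase64String to
        // recover the original bytes.
    }
}
```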

Is there any way you can ditch this library and use a different one? It
sounds awful...

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 16 '05 #9

This discussion thread is closed

Replies have been disabled for this discussion.