
reading strings from binary files - performance issue

I am working on a code library which needs to read in the data from
large binary files. The files hold int, double and string data. This
is the code for reading in the strings:

protected internal override string ReadString()
{
    stringLength = fileStream.ReadByte();
    moInput.Read(byteArrayBuffer, 0, stringLength);
    return asciiEncoding.GetString(byteArrayBuffer, 0, stringLength);
}

At the moment the code that reads in the binary file data is
unacceptably slow (it takes several minutes), and most of the time
(according to my profiler) is taken up by the thousands of calls to
this ReadString function. Within this function, it is the third line -
the call to asciiEncoding.GetString() - that is taking up all the time.
Is there a way I can rewrite this line, or the whole function, to
speed the process up?

I have total control over the format of the binary files, so I can
change the way the string length and/or the string data itself is
encoded in the binary file. Since this is for a code library, I would
like the data to be saved and read out in a way that will work
correctly on machines with different language and format settings, etc.
- it would also be an advantage to support Unicode, though I found the
Unicode Encoding object's GetString method to be even slower.
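
For illustration, a length-prefixed UTF-8 layout of the kind that
BinaryWriter and BinaryReader already support would look roughly like
this (a sketch only - the record shape here is made up, not the
library's current format). Write(string) stores a 7-bit encoded byte
count followed by the UTF-8 bytes, and ReadString() reverses it:

using System.IO;

class RecordSketch
{
    // Writes one record: a 4-byte int, an 8-byte double and a
    // length-prefixed UTF-8 string, independent of culture settings.
    static void WriteRecord(BinaryWriter writer, int id, double value, string name)
    {
        writer.Write(id);
        writer.Write(value);
        writer.Write(name);   // 7-bit encoded length prefix + UTF-8 bytes
    }

    // Reads one record back in the same order it was written.
    static void ReadRecord(BinaryReader reader, out int id, out double value, out string name)
    {
        id = reader.ReadInt32();
        value = reader.ReadDouble();
        name = reader.ReadString();
    }
}

Because the length prefix counts bytes rather than characters,
multi-byte UTF-8 text round-trips correctly with this kind of layout.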
Thanks for any help in advance,

Richard

Nov 16 '05 #1


Sorry, the code should be:
protected internal override string ReadString()
{
    stringLength = fileStream.ReadByte();
    fileStream.Read(byteArrayBuffer, 0, stringLength);
    return asciiEncoding.GetString(byteArrayBuffer, 0, stringLength);
}


That'll teach me to rename my variables to try and explain their
purpose better!

int stringLength;
System.IO.Stream fileStream;
System.Text.Encoding asciiEncoding;
byte[] byteArrayBuffer;
are all class-level private variables.

Cheers,

Richard

Nov 16 '05 #2

<rn********@hotmail.com> wrote:
> I am working on a code library which needs to read in the data from
> large binary files. The files hold int, double and string data.
[ReadString code snipped]
> At the moment the code that reads in the binary file data is
> unacceptably slow (it takes several minutes), and most of the time
> (according to my profiler) is taken up by the thousands of calls to
> this ReadString function. Within this function, it is the third line -
> the call to asciiEncoding.GetString() - that is taking up all the time.
> Is there a way I can rewrite this line, or the whole function, to
> speed the process up?


Just how large are these files? ASCIIEncoding.GetString should be very
fast indeed...

Could you post a short but complete program which demonstrates the
problem?

See http://www.pobox.com/~skeet/csharp/complete.html for details of
what I mean by that.

Note that you should really use the return result of Stream.Read to
check how much data has *actually* been read.
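
For example, a helper along these lines (the name ReadExactly is made
up for the sketch) keeps calling Read until it has the requested number
of bytes or the stream ends:

using System.IO;

static class StreamUtil
{
    // Sketch: Stream.Read may legitimately return fewer bytes than asked
    // for, so loop until the buffer holds the requested count or the
    // stream ends early.
    internal static void ReadExactly(Stream stream, byte[] buffer, int offset, int count)
    {
        int total = 0;
        while (total < count)
        {
            int read = stream.Read(buffer, offset + total, count - total);
            if (read == 0)
                throw new EndOfStreamException(
                    "Stream ended after " + total + " of " + count + " bytes.");
            total += read;
        }
    }
}

With something like that in place, the ReadString above would call
StreamUtil.ReadExactly(fileStream, byteArrayBuffer, 0, stringLength)
instead of ignoring the return value.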

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 16 '05 #3

Thanks for your reply.

> Just how large are these files? ASCIIEncoding.GetString should be very
> fast indeed...

The largest of the files I am working with is around 31Mb - the average
is more like 8Mb.

> Could you post a short but complete program which demonstrates the
> problem?
>
> See http://www.pobox.com/~skeet/csharp/complete.html for details of
> what I mean by that.

I read your page and wrote a short program. I'll post again in a while
- I am still investigating. Next time, I will do what you suggest and
write the short test program *before* posting.

> Note that you should really use the return result of Stream.Read to
> check how much data has *actually* been read.

Yes, I know. I took out all the error handling to try and speed the
thing up, based on the fact that since I am writing the files and know
whether the write succeeded, the read should work correctly as it is
the mirror of the write.

Thanks,

Richard

Nov 16 '05 #4

<rn********@hotmail.com> wrote:
> > Just how large are these files? ASCIIEncoding.GetString should be
> > very fast indeed...
>
> The largest of the files I am working with is around 31Mb - the average
> is more like 8Mb.

That shouldn't take long, certainly. A quick test converting 31 bytes a
million times takes a fraction of a second on my laptop.
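
For concreteness, a micro-benchmark along these lines (a reconstruction
of that kind of test, not the exact code, and timings will obviously
vary by machine):

using System;
using System.Text;

class GetStringTiming
{
    static void Main()
    {
        Encoding ascii = Encoding.ASCII;
        // 31 bytes of ASCII data, matching the size mentioned above.
        byte[] buffer = ascii.GetBytes("abcdefghijklmnopqrstuvwxyz12345");

        DateTime start = DateTime.UtcNow;
        for (int i = 0; i < 1000000; i++)
        {
            ascii.GetString(buffer, 0, buffer.Length);
        }
        Console.WriteLine("Elapsed: " + (DateTime.UtcNow - start));
    }
}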
> > Could you post a short but complete program which demonstrates the
> > problem?
> >
> > See http://www.pobox.com/~skeet/csharp/complete.html for details of
> > what I mean by that.
>
> I read your page and wrote a short program. I'll post again in a while
> - I am still investigating. Next time, I will do what you suggest and
> write the short test program *before* posting.

Goodo :)

> > Note that you should really use the return result of Stream.Read to
> > check how much data has *actually* been read.
>
> Yes, I know. I took out all the error handling to try and speed the
> thing up, based on the fact that since I am writing the files and know
> whether the write succeeded, the read should work correctly as it is
> the mirror of the write.

Not necessarily - there's nothing which guarantees that a Stream.Read
will return all the requested bytes, even if those bytes are actually
present in the stream. While FileStream.Read *probably* always does
this, it might not if it's (say) reading over a network.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 16 '05 #5

> > I read your page and wrote a short program. I'll post again in a while
> > - I am still investigating. Next time, I will do what you suggest and
> > write the short test program *before* posting.
>
> Goodo :)


My current best guess is that when I read in the data from the binary
file, the working variables that the data is loaded into are taking so
much memory that some behind-the-scenes memory management is going on
that isn't related to any particular line of code, and the profiler is
adding that time to the execution time of whatever line the application
is on at the time. So now I am on the trail of the memory-gobbling
objects...
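
One rough way to check that guess without trusting the profiler's
attribution would be to time a batch of reads directly and watch how
much the managed heap grows - something along these lines (a sketch
only; it would sit inside the reader class, and MeasureReads is a
made-up name):

using System;

// Sketch: assumes this lives in the reader class alongside ReadString().
void MeasureReads(int iterations)
{
    long heapBefore = GC.GetTotalMemory(true);   // collect first for a clean baseline
    DateTime start = DateTime.UtcNow;

    for (int i = 0; i < iterations; i++)
    {
        ReadString();                             // the method under suspicion
    }

    TimeSpan elapsed = DateTime.UtcNow - start;
    long heapGrowth = GC.GetTotalMemory(false) - heapBefore;
    Console.WriteLine("Elapsed: " + elapsed + ", heap growth: " + heapGrowth + " bytes");
}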
> Not necessarily - there's nothing which guarantees that a Stream.Read
> will return all the requested bytes, even if those bytes are actually
> present in the stream. While FileStream.Read *probably* always does
> this, it might not if it's (say) reading over a network.

Thanks, I will bear that in mind.

Richard

Nov 16 '05 #6
