BinaryReader.ReadBytes issue

Hi,

I am trying to optimize the reading of a huge binary file into a byte[]...

I am doing the following..

byte[] ba = new byte[br.BaseStream.Length];

ba = br.ReadBytes((int)br.BaseStream.Length);

The problem is, BinaryReader.ReadBytes(...) only takes an int, whereas
BinaryReader.BaseStream.Length is a long. Why isn't there a ReadBytes that
takes a long?

Chances are I won't reach this problem, but it will be there nonetheless.


Nov 17 '05 #1
<<.>> wrote:
I am trying to optimize the reading of a huge binary file into a byte[]...

I am doing the following..

byte[] ba = new byte[br.BaseStream.Length];

ba = br.ReadBytes((int)br.BaseStream.Length);
Why are you allocating an array and then immediately turning it into
garbage?
The problem is, BinaryReader.ReadBytes(...) only takes an int, whereas
BinaryReader.BaseStream.Length is a long. Why isn't there a ReadBytes that
takes a long?

Chances are I won't reach this problem, but it will be there nonetheless.


It's certainly not ideal, but I would expect that if you actually had a
file larger than 2GB, you wouldn't want to be reading it all in with a
single call anyway.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #2

"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP***********************@msnews.microsoft.com...
<<.>> wrote:
I am trying to optimize the reading of a huge binary file into a byte[]...
I am doing the following..

byte[] ba = new byte[br.BaseStream.Length];

ba = br.ReadBytes((int)br.BaseStream.Length);
Why are you allocating an array and then immediately turning it into
garbage?


Why? Does ReadBytes reallocate it or copy into it?
The problem is, BinaryReader.ReadBytes(...) only takes an int, whereas
BinaryReader.BaseStream.Length is a long. Why isn't there a ReadBytes that takes a long?

Chances are I won't reach this problem, but it will be there nonetheless.
It's certainly not ideal, but I would expect that if you actually had a
file larger than 2GB, you wouldn't want to be reading it all in with a
single call anyway.


That's the extreme case; I won't be anywhere near that range.


--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #3
Hi,
I am trying to optimize the reading of a huge binary file into a byte[]...
Writing huge files into a byte[] is not an optimisation.

byte[] ba = new byte[br.BaseStream.Length];
You don't need to initialize "ba" when you are using
br.ReadBytes(...).
ba = br.ReadBytes((int)br.BaseStream.Length);
It is bad practice to read a stream of unknown size in a
single call.
The problem is, BinaryReader.ReadBytes(...) only takes an int, whereas
BinaryReader.BaseStream.Length is a long. Why isn't there a ReadBytes that
takes a long?


Because mostly there's no need to fill memory with huge files
(files greater than 2 GB).
Think over your design and your needs. If you really want to
read huge files then you can easily run out of memory!

Marcin
Nov 17 '05 #4
I now do the following, just to be safe on the casting limit.

byte[] ba = new byte[br.BaseStream.Length]; // are you saying this should be null before and let .ReadBytes allocate it (if it does that?)

if (br.BaseStream.Length <= int.MaxValue)
{
    // we are within the casting limits so we can use the optimized method of reading
    ba = br.ReadBytes((int)br.BaseStream.Length);
}
else
{
    // we are outside the limits (rare) so we can use the normal way of reading (slower)
    ArrayList b = new ArrayList();
    byte readByte = 0x00;
    while (br.BaseStream.Position < br.BaseStream.Length)
    {
        readByte = br.ReadByte();
        b.Add(readByte);
    }

    ba = (byte[])b.ToArray(typeof(byte));
}


"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP***********************@msnews.microsoft.com...
<<.>> wrote:
I am trying to optimize the reading of a huge binary file into a byte[]...
I am doing the following..

byte[] ba = new byte[br.BaseStream.Length];

ba = br.ReadBytes((int)br.BaseStream.Length);


Why are you allocating an array and then immediately turning it into
garbage?
The problem is, BinaryReader.ReadBytes(...) only takes an int, whereas
BinaryReader.BaseStream.Length is a long. Why isn't there a ReadBytes that takes a long?

Chances are I won't reach this problem, but it will be there nonetheless.


It's certainly not ideal, but I would expect that if you actually had a
file larger than 2GB, you wouldn't want to be reading it all in with a
single call anyway.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #5
I need it as a byte[] internally; it's being read.
"Marcin Grzębski" <mg*******@taxussi.no.com.spam.pl> wrote in message
news:c1**********@atlantis.news.tpi.pl...
Hi,
I am trying to optimize the reading of a huge binary file into a byte[]...

Writing huge files into a byte[] is not an optimisation.

byte[] ba = new byte[br.BaseStream.Length];


You don't need to initialize "ba" when you are using
br.ReadBytes(...).
ba = br.ReadBytes((int)br.BaseStream.Length);


It is bad practice to read a stream of unknown size in a
single call.
The problem is, BinaryReader.ReadBytes(...) only takes an int, whereas
BinaryReader.BaseStream.Length is a long. Why isn't there a ReadBytes
that takes a long?


Because mostly there's no need to fill memory with huge files
(files greater than 2 GB).
Think over your design and your needs. If you really want to
read huge files then you can easily run out of memory!

Marcin

Strange that you say it's not an optimization; it sure runs faster. I guess
you know better than the runtime.
Nov 17 '05 #6

"Marcin Grzębski" <mg*******@taxussi.no.com.spam.pl> wrote in message
news:c1**********@atlantis.news.tpi.pl...
Hi,
I am trying to optimize the reading of a huge binary file into a byte[]...

Writing huge files into a byte[] is not an optimisation.

byte[] ba = new byte[br.BaseStream.Length];
You don't need to initialize "ba" when you are using
br.ReadBytes(...).


Fine, it's byte[] ba = null then.
ba = br.ReadBytes((int)br.BaseStream.Length);
It is bad practice to read a stream of unknown size in a
single call.


Hello, EARTH. BaseStream.Length IS THE SIZE and is therefore KNOWN. What do you
propose then, genius boy? I need the entire file in memory in a byte[], so how
else would you do it, brainiac Mr. Mensa?
The problem is, BinaryReader.ReadBytes(...) only takes an int, whereas
BinaryReader.BaseStream.Length is a long. Why isn't there a ReadBytes that takes a long?


Because mostly there's no need to fill memory with huge files
(files greater than 2 GB).
Think over your design and your needs. If you really want to
read huge files then you can easily run out of memory!


All I need to do is get the file into memory for another part; that other
part I don't give a rat's arse about, not my problem.

I am talking about files averaging 300 KB.

Marcin

Nov 17 '05 #7
. wrote:
I need it as a byte[] internally; it's being read.


I see.
But can't you keep it as a collection of byte[] buffers?
e.g. as an *ArrayList* of byte[] elements with length = 4096

Then you can access those buffers as *ArrayList* items.
Of course the buffer length can be set to another value.

Marcin
Nov 17 '05 #8
It has to be a contiguous block. So unless you can put up a better solution
without yacking yer gob off, you can go crap it right up.
"Marcin Grzębski" <mg*******@taxussi.no.com.spam.pl> wrote in message
news:c1**********@atlantis.news.tpi.pl...
. wrote:
I need it as a byte[] internally; it's being read.


I see.
But can't you keep it as a collection of byte[] buffers?
e.g. as an *ArrayList* of byte[] elements with length = 4096

Then you can access those buffers as *ArrayList* items.
Of course the buffer length can be set to another value.

Marcin

Nov 17 '05 #9
<<.>> wrote:
byte[] ba = new byte[br.BaseStream.Length];

ba = br.ReadBytes((int)br.BaseStream.Length);


Why are you allocating an array and then immediately turning it into
garbage?


Why? Does ReadBytes reallocate it or copy into it?


Well, look at the call - you're not telling it where to read to; it's
returning a reference to a new array.
It's certainly not ideal, but I would expect that if you actually had a
file larger than 2GB, you wouldn't want to be reading it all in with a
single call anyway.


That's the extreme case; I won't be anywhere near that range.


In which case, it's fine :)

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #10
<<.>> wrote:
I now do the following, just to be safe on the casting limit.

byte[] ba = new byte[br.BaseStream.Length]; // are you saying this should be null before and let .ReadBytes allocate it (if it does that?)
I'm saying you don't need to assign a value to it at all, as you assign
the value when you've done the read.
if (br.BaseStream.Length <= int.MaxValue)
{
    // we are within the casting limits so we can use the optimized method of reading
    ba = br.ReadBytes((int)br.BaseStream.Length);
}
else
{
    // we are outside the limits (rare) so we can use the normal way of reading (slower)
    ArrayList b = new ArrayList();
    byte readByte = 0x00;
    while (br.BaseStream.Position < br.BaseStream.Length)
    {
        readByte = br.ReadByte();
        b.Add(readByte);
    }

    ba = (byte[])b.ToArray(typeof(byte));
}


No, that second bit isn't a good idea. If you've got a file of over
2GB, you most certainly *don't* want to create an ArrayList where each
element is a byte read from the file. It would take at least 12 times
the file size - so you'd end up with a memory usage of *at least* 24GB.
Not pretty.

Do you really want to create an array that is the size of the whole
file, if it's more than 2GB? I would expect any sane use of such a file
to be either something which can discard the bytes as it reads and
processes them, or something which seeks around within the file. A
safer bet is probably to throw an exception.
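
A minimal sketch of such a guard (the helper name and path parameter are
illustrative, and it leans on ReadBytes only returning short at
end-of-stream):

using System.IO;

static byte[] ReadWholeFile(string path)
{
    using (BinaryReader br = new BinaryReader(File.OpenRead(path)))
    {
        long length = br.BaseStream.Length;
        // Refuse anything that cannot fit in a single byte[] rather than
        // silently truncating the length via the (int) cast.
        if (length > int.MaxValue)
        {
            throw new IOException("File too large for a single byte[]: "
                + length + " bytes");
        }
        return br.ReadBytes((int)length);
    }
}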

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #11
Yeah, well, it's a pretty common mistake, I'm sure. *looks around and shuffles
it under the rug*
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
> byte[] ba = new byte[br.BaseStream.Length];
>
> ba = br.ReadBytes((int)br.BaseStream.Length);

Why are you allocating an array and then immediately turning it into
garbage?


Why? Does ReadBytes reallocate it or copy into it?


Well, look at the call - you're not telling it where to read to; it's
returning a reference to a new array.
It's certainly not ideal, but I would expect that if you actually had a file larger than 2GB, you wouldn't want to be reading it all in with a
single call anyway.


That's the extreme case; I won't be anywhere near that range.


In which case, it's fine :)

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #12
Marcin Grzębski <mg*******@taxussi.no.com.spam.pl> wrote:
I need it as a byte[] internally; it's being read.


I see.
But can't you keep it as a collection of byte[] buffers?
e.g. as an *ArrayList* of byte[] elements with length = 4096

Then you can access those buffers as *ArrayList* items.
Of course the buffer length can be set to another value.


That would quite possibly make the client code much harder to write.
It's not unreasonable to read the whole of a file as a byte array.
However, I wouldn't do it in the way suggested. I wouldn't use a
BinaryReader at all, in fact. I'd open up a normal stream, and read
blocks into a MemoryStream, then turn the MemoryStream into a byte
array. That way there isn't a problem if the file changes size between
you asking for the size and you reading the file - you just keep reading
blocks until you've finished.
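
A minimal sketch of that approach (the helper name, parameter and the
4096-byte buffer size are illustrative):

using System.IO;

static byte[] ReadAllBytes(string path)
{
    using (FileStream fs = File.OpenRead(path))
    using (MemoryStream ms = new MemoryStream())
    {
        byte[] buffer = new byte[4096];
        int read;
        // Loop until Read reports end-of-stream (returns 0); this never
        // trusts Length, so a size change mid-read can't corrupt the result.
        while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, read);
        }
        return ms.ToArray();
    }
}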

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #13
int.MaxValue isn't 2GB; it's there just in case, that's all, and
99.9999999999999999999% of the time it won't be hit.
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
I now do the following, just to be safe on the casting limit.

byte[] ba = new byte[br.BaseStream.Length]; // are you saying this should be null before and let .ReadBytes allocate it (if it does that?)


I'm saying you don't need to assign a value to it at all, as you assign
the value when you've done the read.
if (br.BaseStream.Length <= int.MaxValue)
{
    // we are within the casting limits so we can use the optimized method of reading
    ba = br.ReadBytes((int)br.BaseStream.Length);
}
else
{
    // we are outside the limits (rare) so we can use the normal way of reading (slower)
    ArrayList b = new ArrayList();
    byte readByte = 0x00;
    while (br.BaseStream.Position < br.BaseStream.Length)
    {
        readByte = br.ReadByte();
        b.Add(readByte);
    }

    ba = (byte[])b.ToArray(typeof(byte));
}


No, that second bit isn't a good idea. If you've got a file of over
2GB, you most certainly *don't* want to create an ArrayList where each
element is a byte read from the file. It would take at least 12 times
the file size - so you'd end up with a memory usage of *at least* 24GB.
Not pretty.

Do you really want to create an array that is the size of the whole
file, if it's more than 2GB? I would expect any sane use of such a file
to be either something which can discard the bytes as it reads and
processes them, or something which seeks around within the file. A
safer bet is probably to throw an exception.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #14
This is going into a serializer, so it needs to be a byte[], and the files
average 300 to 400 KB in size; it's actually packing the data for exporting
and importing across systems.

"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
I now do the following, just to be safe on the casting limit.

byte[] ba = new byte[br.BaseStream.Length]; // are you saying this should be null before and let .ReadBytes allocate it (if it does that?)


I'm saying you don't need to assign a value to it at all, as you assign
the value when you've done the read.
if (br.BaseStream.Length <= int.MaxValue)
{
    // we are within the casting limits so we can use the optimized method of reading
    ba = br.ReadBytes((int)br.BaseStream.Length);
}
else
{
    // we are outside the limits (rare) so we can use the normal way of reading (slower)
    ArrayList b = new ArrayList();
    byte readByte = 0x00;
    while (br.BaseStream.Position < br.BaseStream.Length)
    {
        readByte = br.ReadByte();
        b.Add(readByte);
    }

    ba = (byte[])b.ToArray(typeof(byte));
}


No, that second bit isn't a good idea. If you've got a file of over
2GB, you most certainly *don't* want to create an ArrayList where each
element is a byte read from the file. It would take at least 12 times
the file size - so you'd end up with a memory usage of *at least* 24GB.
Not pretty.

Do you really want to create an array that is the size of the whole
file, if it's more than 2GB? I would expect any sane use of such a file
to be either something which can discard the bytes as it reads and
processes them, or something which seeks around within the file. A
safer bet is probably to throw an exception.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #15
The file contents won't change at this point, as far as I know.

A stream is a stream, memory stream or BinaryReader stream; it will take the
same footprint.

MemoryStream.Read still takes an int for the count of bytes to read, so
again we have this casting possibility (very rare though).

"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
Marcin Grzębski <mg*******@taxussi.no.com.spam.pl> wrote:
I need it as a byte[] internally; it's being read.


I see.
But can't you keep it as a collection of byte[] buffers?
e.g. as an *ArrayList* of byte[] elements with length = 4096

Then you can access those buffers as *ArrayList* items.
Of course the buffer length can be set to another value.


That would quite possibly make the client code much harder to write.
It's not unreasonable to read the whole of a file as a byte array.
However, I wouldn't do it in the way suggested. I wouldn't use a
BinaryReader at all, in fact. I'd open up a normal stream, and read
blocks into a MemoryStream, then turn the MemoryStream into a byte
array. That way there isn't a problem if the file changes size between
you asking for the size and you reading the file - you just keep reading
blocks until you've finished.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #16
<<.>> wrote:
int.MaxValue isn't 2GB
Yes it is. To be precise, it's 2,147,483,647, as per the documentation.
it's there just in case, that's all, and
99.9999999999999999999% of the time it won't be hit.


So why make it cause grief when you do hit it, instead of cleanly
throwing an exception to say that you can't really cope adequately?

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #17
<<.>> wrote:
The file contents wont change at this point that I know.
In that case, you're fine.
A stream is a stream memory stream or binaryreader stream, it will take the
same footprint.

MemoryStream.Read still takes an int for the count of the blocks to read
again we have this casting possibility (very rare though).


You wouldn't be reading from the MemoryStream though - you'd be calling
ToArray on it to get the bytes back.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #18
Well, the debugger won't show int.MaxValue in the watch; didn't think it was
that high.

Yeah I could throw an exception.


"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
int.MaxValue isn't 2GB


Yes it is. To be precise, it's 2,147,483,647, as per the documentation.
it's there just in case, that's all, and
99.9999999999999999999% of the time it won't be hit.


So why make it cause grief when you do hit it, instead of cleanly
throwing an exception to say that you can't really cope adequately?

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #19
Is 2GB the largest file size on NTFS? Just curious why the read methods are
limited to int.MaxValue and not long.
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
The file contents won't change at this point, as far as I know.


In that case, you're fine.
A stream is a stream, memory stream or BinaryReader stream; it will take the same footprint.

MemoryStream.Read still takes an int for the count of bytes to read, so
again we have this casting possibility (very rare though).


You wouldn't be reading from the MemoryStream though - you'd be calling
ToArray on it to get the bytes back.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #20

<.> wrote in message news:ej****************@TK2MSFTNGP10.phx.gbl...
Is 2GB the largest file size on NTFS? Just curious why the read methods are limited to int.MaxValue and not long.
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
The file contents won't change at this point, as far as I know.
In that case, you're fine.
A stream is a stream, memory stream or BinaryReader stream; it will take the
same footprint.

MemoryStream.Read still takes an int for the count of bytes to
read, so again we have this casting possibility (very rare though).


You wouldn't be reading from the MemoryStream though - you'd be calling
ToArray on it to get the bytes back.


Just use the FileStream then, no need for memory or binary streams:
fs = new FileStream(filename, FileMode.Open);

byte[] ba = null;
if (fs.Length <= int.MaxValue)
{
    fs.Read(ba, 0, (int)fs.Length);
}
else
{
    throw new InvalidOperationException("File size too large, above " + int.MaxValue + " bytes");
}

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too


Nov 17 '05 #21
<<.>> wrote:
Is 2GB the largest file size on NTFS?
I don't think so. I'm pretty sure it's not, actually.
Just curious why the read methods are limited to int.MaxValue and not
long.


Well, as I said, it's very *very* rarely a good idea to read in that
much of a file at a time. If you've got a file that big, you're much
more likely to be scanning through it or seeking into it.

The usage model may well change in the next ten years, of course, as 64
bit systems become more prevalent, and memory and disks become faster
and cheaper.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #22
<<.>> wrote:
Just use the FileStream then, no need for memory or binary streams:
fs = new FileStream(filename, FileMode.Open);

byte[] ba = null;
if (fs.Length <= int.MaxValue)
{
    fs.Read(ba, 0, (int)fs.Length);
}
else
{
    throw new InvalidOperationException("File size too large, above " + int.MaxValue + " bytes");
}


This time you *do* need to initialise the array first, because you're
telling Stream.Read where to read into.

However, you're then assuming that FileStream.Read will read everything
you ask it to, which it isn't guaranteed to. Assuming that Stream.Read
reads as much as you've asked it to as a maximum has been the cause of
many a developer getting corrupted data.
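
A minimal sketch of the loop that warning implies (the helper name is
illustrative): keep calling Read until the requested count has actually
arrived.

using System.IO;

static void ReadExactly(Stream stream, byte[] buffer, int count)
{
    int offset = 0;
    while (offset < count)
    {
        // Read may legitimately return fewer bytes than requested.
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
        {
            throw new EndOfStreamException("Stream ended with "
                + (count - offset) + " bytes still expected");
        }
        offset += read;
    }
}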

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #23
Yeah, I know, I do that.
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
Just use the FileStream then, no need for memory or binary streams:
fs = new FileStream(filename, FileMode.Open);

byte[] ba = null;
if (fs.Length <= int.MaxValue)
{
    fs.Read(ba, 0, (int)fs.Length);
}
else
{
    throw new InvalidOperationException("File size too large, above " + int.MaxValue + " bytes");
}


This time you *do* need to initialise the array first, because you're
telling Stream.Read where to read into.

However, you're then assuming that FileStream.Read will read everything
you ask it to, which it isn't guaranteed to. Assuming that Stream.Read
reads as much as you've asked it to as a maximum has been the cause of
many a developer getting corrupted data.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #24
And BinaryReader guarantees it's all read at once?
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
Just use the FileStream then, no need for memory or binary streams:
fs = new FileStream(filename, FileMode.Open);

byte[] ba = null;
if (fs.Length <= int.MaxValue)
{
    fs.Read(ba, 0, (int)fs.Length);
}
else
{
    throw new InvalidOperationException("File size too large, above " + int.MaxValue + " bytes");
}


This time you *do* need to initialise the array first, because you're
telling Stream.Read where to read into.

However, you're then assuming that FileStream.Read will read everything
you ask it to, which it isn't guaranteed to. Assuming that Stream.Read
reads as much as you've asked it to as a maximum has been the cause of
many a developer getting corrupted data.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #25
None of the streams guarantee reading all of the maximum count requested, so
you are not guaranteed either way you choose, except by testing the return
value of the .Read method to be sure.

<.> wrote in message news:%2*****************@TK2MSFTNGP11.phx.gbl...
And BinaryReader guarantees it's all read at once?
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
Just use the FileStream then, no need for memory or binary streams:
fs = new FileStream(filename, FileMode.Open);

byte[] ba = null;
if (fs.Length <= int.MaxValue)
{
    fs.Read(ba, 0, (int)fs.Length);
}
else
{
    throw new InvalidOperationException("File size too large, above " + int.MaxValue + " bytes");
}


This time you *do* need to initialise the array first, because you're
telling Stream.Read where to read into.

However, you're then assuming that FileStream.Read will read everything
you ask it to, which it isn't guaranteed to. Assuming that Stream.Read
reads as much as you've asked it to as a maximum has been the cause of
many a developer getting corrupted data.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too


Nov 17 '05 #26
<<.>> wrote:
And BinaryReader guarantees it's all read at once?


Yes, although it's not as clear from the documentation as it might be
what the difference between Stream.Read and BinaryReader.ReadBytes is.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #27
MSDN states that BinaryReader does not guarantee it, just the same as
FileStream doesn't.
"Jon Skeet [C# MVP]" <sk***@pobox.com> wrote in message
news:MP************************@msnews.microsoft.com...
<<.>> wrote:
And BinaryReader guarantees it's all read at once?


Yes, although it's not as clear from the documentation as it might be
what the difference between Stream.Read and BinaryReader.ReadBytes is.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too

Nov 17 '05 #28
<<.>> wrote:
MSDN states that BinaryReader does not guarantee it, just the same as
FileStream doesn't.


No, not "just the same" - it only says it will return fewer bytes if it
reaches the end of the stream. You don't have to reach the end of the
stream to have fewer bytes returned to you with Stream.Read. In
practice, I suspect that FileStream.Read *will* always act like
BinaryReader.ReadBytes, but other streams won't (particularly those
over networks), and it's good to avoid getting into the habit of
relying on it.

--
Jon Skeet - <sk***@pobox.com>
http://www.pobox.com/~skeet
If replying to the group, please do not mail me too
Nov 17 '05 #29
Hi,

I find this conversation kinda sad, Jon.
You gave him lots of good information and tips, while he did not want to
listen.
Keep up the good work.

Cheers,

Bram
<.> wrote in message news:u3****************@TK2MSFTNGP09.phx.gbl...
Hi,

I am trying to optimize the reading of a huge binary file into a byte[]...
I am doing the following..

byte[] ba = new byte[br.BaseStream.Length];

ba = br.ReadBytes((int)br.BaseStream.Length);

The problem is, BinaryReader.ReadBytes(...) only takes an int, whereas
BinaryReader.BaseStream.Length is a long. Why isn't there a ReadBytes that
takes a long?

Chances are I won't reach this problem, but it will be there nonetheless.

Nov 17 '05 #30
