Why Would ResponseStream().Read return zero bytes when not at end of file?

Greetings,

I have an app that uses HttpWebRequest to download a rather large file
(100 MB). I have a while loop set up so that it reads 4096 bytes at a
time from the ResponseStream() until it reads zero bytes. The odd
thing is that for some of the people using my app, the Read() function
from the ResponseStream() will return zero bytes even though it has
not read the entire ResponseStream(). I know it hasn't read the entire
stream because I do a ResponseStream().ContentLength and get the size
of the stream (and record it) before I start reading data, then I also
record how many bytes I've read.

Any idea why this would be happening? And any suggestions on how to
get around it?
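
In outline, the loop looks something like this (a simplified sketch rather than the exact code; class and variable names here are made up):

using System;
using System.IO;
using System.Net;

class DownloadSketch
{
    static void Download(string url, string destinationPath)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        using (FileStream output = File.Create(destinationPath))
        {
            long contentLength = response.ContentLength;  // advertised size; -1 if unknown
            long totalRead = 0;
            byte[] buffer = new byte[4096];
            int bytesRead;

            // Read until the stream reports end of stream (Read returns 0).
            while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, bytesRead);
                totalRead += bytesRead;
            }

            Console.WriteLine("Advertised {0} bytes, actually read {1} bytes.",
                              contentLength, totalRead);
        }
    }
}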
Jan 11 '08 #1
Rymfax wrote:
> I have an app that uses HttpWebRequest to download a rather large file
> (100 MB). I have a while loop set up so that it reads 4096 bytes at a
> time from the ResponseStream() until it reads zero bytes.
> [...]
> Any idea why this would be happening? And any suggestions on how to
> get around it?
It sounds weird.

The docs are rather clear:

#Return Value
#The total number of bytes read into the buffer. This can be less than
#the number of bytes requested if that many bytes are not currently
#available, or zero (0) if the end of the stream has been reached.

Is the file in question publicly available so we can try?

Arne
Jan 11 '08 #2
I don't know why it is dropping, but if the web server supports it,
you could attempt to resume by adding a Range header to the request
and re-sending? (i.e. from byte {large number} onwards...)
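
Something along these lines, perhaps (an untested sketch with made-up names; it assumes the server answers with 206 Partial Content when it honours the range):

using System.IO;
using System.Net;

class ResumeSketch
{
    static void ResumeDownload(string url, string partialFilePath)
    {
        long alreadyDownloaded = new FileInfo(partialFilePath).Length;

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        // Ask the server for everything from this offset onwards.
        request.AddRange((int)alreadyDownloaded);  // fine while the offset fits in an int

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        {
            if (response.StatusCode != HttpStatusCode.PartialContent)
            {
                // Server ignored the Range header; appending would corrupt the
                // file, so the download has to start over from scratch instead.
                return;
            }

            using (FileStream output = new FileStream(partialFilePath, FileMode.Append, FileAccess.Write))
            {
                byte[] buffer = new byte[4096];
                int bytesRead;
                while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    output.Write(buffer, 0, bytesRead);
                }
            }
        }
    }
}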

Marc
Jan 11 '08 #3
Would the amount that you actually can download correspond to about four
megabytes?

--
Bob Powell [MVP]
Visual C#, System.Drawing

Ramuseco Limited .NET consulting
http://www.ramuseco.com

Find great Windows Forms articles in Windows Forms Tips and Tricks
http://www.bobpowell.net/tipstricks.htm

Answer those GDI+ questions with the GDI+ FAQ
http://www.bobpowell.net/faqmain.htm

All new articles provide code in C# and VB.NET.
Subscribe to the RSS feeds provided and never miss a new article.

"Rymfax" <cw*****@bigbangllc.com> wrote in message
news:c6**********************************@k39g2000hsf.googlegroups.com...
> [...]
Jan 11 '08 #4
On Fri, 11 Jan 2008 13:06:26 -0800, Rymfax <cw*****@bigbangllc.com> wrote:
> [...]
> It is possible that it is reading zero bytes because the connection is
> closing prematurely, but this is supposed to generate an exception
That's not true. You'll get an exception if the connection is _reset_,
but if it's simply closed then you get a 0 byte receive.
> which I check for (I actually check for any exception).
> So I don't
> know what could be causing the ResponseStream to return zero bytes if
> it hasn't reached the end of the content (according to the
> ContentLength anyway).
As I said, there is no network requirement that the number of bytes
delivered by the server before closing the connection is identical to the
number of bytes the server advertised in the HTTP response. The
advertised length is just that: an advertisement. The network side of
things doesn't enforce it.

One would expect a correctly-behaving HTTP server to provide the bytes as
advertised, but servers don't always behave, and on top of that some ISPs
are known to mess with TCP connections, including closing them for no
good reason.
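
So the only meaningful completeness check is the one your own code makes after the read loop exits, e.g. something like this (an illustrative helper, not taken from any real code):

static class DownloadChecks
{
    // HttpWebResponse.ContentLength is -1 when the server sent no usable length
    // (e.g. a chunked response), in which case there is nothing to compare against.
    static bool LooksComplete(long advertisedContentLength, long totalBytesRead)
    {
        if (advertisedContentLength < 0)
            return true;

        // Read() returning 0 only tells you the connection was closed; whether the
        // whole body actually arrived is decided by this comparison.
        return totalBytesRead >= advertisedContentLength;
    }
}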
> I've already coded in a re-check for when this
> happens, but I was hoping to figure out WHY it is happening.
That's an admirable goal. But the only way you can do that is by doing
some debugging. You need to monitor the network traffic at both ends and
figure out who is closing the connection. Once you've identified that,
then you can look at why. Unfortunately, a vague problem description is
not enough information for anyone else in this newsgroup to answer the
question. Only first-hand knowledge will do.

Pete
Jan 12 '08 #5
