Download problem

Hello,
I'm using:
Apache/2.0.54 (Win32) mod_ssl/2.0.54 OpenSSL/0.9.7g PHP/5.0.4
MySQL 4.1.12
DOCman 1.3RC1 (PHP-based document manager)

I'm having the following problem: files upload OK, but a problem occurs when I try to download or view a file.
After some investigation, I found out that only the first 2MB of data is actually downloaded (exactly 2,000,000
bytes), resulting in corrupted files. (The uploaded files are OK!)
I couldn't find that limit anywhere in my system. If the original file size is under 2MB, files are downloaded
and/or viewed correctly.
Firefox, Opera, IE6 - they all behave the same way.
Win98, 2000, XP - the same.

If I manually fetch the files (bypassing DOCman), the download is OK!

What am I missing here? I'm pretty sure it's PHP-related, but I'm running out of ideas about where to look.
Please help,
//Marin

Jul 17 '05 #1
On Wed, 29 Jun 2005 09:04:35 +0200, Marin wrote:
> I found out that only the first 2MB of data is actually downloaded
> (exactly 2,000,000 bytes), resulting in corrupted files. The uploaded
> files are OK, and fetching them manually (bypassing DOCman) works. [snip]

Could this be a limit imposed by your ISP rather than your code? With it
being a solid 2,000,000 it seems possible. I haven't come across a PHP limit
on data transferred between two servers in my workplace, and those streams
are much larger than 2MB. I did that as an experiment to see if I could come
up with a better way to update remote servers, so I was working with
gigabyte-sized gzipped tarballs (it turned out to be far too slow for that purpose).

If it is that, I was going to suggest a way of breaking the file into
blocks with the file functions, but I am not sure how an ISP counts these
limits; it could be a limit on a single stream or a limit within a time
period.
Jul 17 '05 #2

"BearItAll" <sp**@rassler.co.uk> wrote in message news:pa****************************@rassler.co.uk. ..
Could this be a limit imposed by your ISP rather than your code?
Actually, it's not a live system; I'm running it on a local LAN. That's what frustrates me even more.
It would be so much easier to blame it on the ISP :) but since I'm the responsible person in the company....
> With it being a solid 2,000,000 it seems possible.

My thought exactly!

Thanx anyway
//Marin
Jul 17 '05 #3
Marin wrote:
> Files upload OK, but only the first 2MB of data (exactly 2,000,000 bytes)
> is actually downloaded, resulting in corrupted files. [snip]

How do you download the file? Do you just go to the file directly, or do
you use something like "getfile.php?id=yourfilehere"? If you use the
second method, make sure you don't try to read the file contents into a
variable before displaying it:

$content = join( "\n", file( $fileLocation ) );
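// note: this reads the entire file into memory in one go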

because this will run into problems with PHP's memory usage. Use fopen and
fread instead to read small pieces, then print them while reusing the same
variable.
Or if you use an output buffer, make sure it doesn't grow too big...
Something along these lines should work (an untested sketch; the getfile.php
name, the file lookup and the content type below are just placeholders for
whatever DOCman actually does):
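
<?php
// getfile.php - stream a stored file to the client in small chunks
// instead of building the whole thing in memory first.
// Hypothetical lookup: replace with however your app maps ids to paths.
$fileLocation = '/path/to/storage/' . basename($_GET['id']);

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($fileLocation));
header('Content-Disposition: attachment; filename="' . basename($fileLocation) . '"');

// drop any active output buffers so nothing accumulates (or truncates) the stream
while (ob_get_level() > 0) {
    ob_end_clean();
}

$fp = fopen($fileLocation, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192); // reuse the same small buffer for every chunk
    flush();               // push each chunk straight out to the client
}
fclose($fp);
?>

That way the memory used stays constant no matter how large the file is.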

If you use the first method... I have no idea.

Rutger
--
Rutger Claes rg*@rgc.tld
Replace tld with top level domain of belgium to contact me pgp:0x3B7D6BD6
Do not reply to the from address. It's read by /dev/null and sa-learn only

Jul 17 '05 #4
