Using Python/CGI to stream large tar files on the fly??

Hi,

I'm running a CGI script written in Python that tars up the photos in user-selected directories (chosen via a form). It's running on Apache 1.3.31.1 on Solaris 5.8.

It works well for a small number of files, but when I try to use it for large archives (typically over 10MB, though this seems to vary) my download stops short. Occasionally I get an error message, usually a Broken Pipe (IOError 32), but sometimes I see nothing at all (perhaps that's a flushing issue with stderr on my webspace, or perhaps the Broken Pipe is a red herring).

My understanding is that a broken pipe means either the client closed its download request prematurely (definitely not the case), or Apache/Python closed stdout on its side. I get no timeout error in the browser - it just stops dead, thinking the download is complete - and the tar is always corrupted and well short of the size I'd expect.

I would have thought that the stdout pipe to the user would stay open as long as data was being streamed to it? Is it possible that my webspace provider has set some time limit? Could some sort of buffer under/over-run be occurring on the stdout stream?

I've tried various other methods of solving the problem. I've ruled out using zips, as I cannot create the archive in memory (using cStringIO or similar) before streaming it as one complete string, due to runtime memory limitations on my webspace (only 30MB as far as I can see!). The only other thing I can think of is to write a temporary file to disk and let the user download that - again, not practical as disk space is limited, and it's a bit ugly too.

The function causing the problem is below; it's pretty self-explanatory. If anyone has any ideas what might be causing this, I'd be very grateful - it's driving me round the bend!

Thanks,

Phil.


import glob
import os
import string
import sys
import tarfile


def download( folders ):

    # HTTP headers; the extra "\n" gives the blank line that ends the header block.
    print "Content-type: application/x-tar"
    print "Content-Disposition: attachment; filename=\"Download.tar\"\n"

    # Write the tar archive straight to stdout as it is built.
    parentZipFile = tarfile.open( '', "w", sys.stdout )

    #signal.signal( signal.SIGPIPE, signal.SIG_DFL )

    for folder in folders:

        photoDir = os.path.join( folder, "photos" )
        if os.path.isdir( photoDir ):

            # We have photos!
            photos = glob.glob( photoDir + "/*.jpg" )
            photos += glob.glob( photoDir + "/*.JPG" )
            for photo in photos:

                # Store each photo under the last three components of its path.
                parentZipFile.add( photo, string.join( photo.split( "/" )[-3:], "/" ) )

    parentZipFile.close()
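
A variant worth sketching (untested; download_streamed is just a placeholder name): open the tarfile in its streaming "w|" mode, which never tries to seek on the output object, and flush stdout after each file so data keeps moving through the buffers.

import glob
import os
import string
import sys
import tarfile


def download_streamed( folders ):

    print "Content-type: application/x-tar"
    print "Content-Disposition: attachment; filename=\"Download.tar\"\n"
    sys.stdout.flush()    # push the headers out before the tar data starts

    # "w|" writes a non-seekable stream, so tarfile never tries to seek on stdout.
    tar = tarfile.open( mode = "w|", fileobj = sys.stdout )

    for folder in folders:
        photoDir = os.path.join( folder, "photos" )
        if not os.path.isdir( photoDir ):
            continue
        photos = glob.glob( photoDir + "/*.jpg" )
        photos += glob.glob( photoDir + "/*.JPG" )
        for photo in photos:
            tar.add( photo, string.join( photo.split( "/" )[-3:], "/" ) )
            sys.stdout.flush()    # keep bytes flowing after every file

    tar.close()
    sys.stdout.flush()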
May 29 '07 #1
To clarify, this is definitely not a *byte* limit on streaming.

From my nice fast connection at work I have no problem downloading 54MB, for example, but if I crank it up to a larger 150-odd MB tar, it falls over in exactly the same way.

With larger archives I can see, as I try to extract them, that the initial decompression goes fine - and then the end of the file is hit unexpectedly.

This does look like a timeout on the Apache server.

I was wondering if anyone could clarify, and perhaps suggest a workaround?
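
If it does turn out to be a fixed time limit, one crude workaround (only a sketch, with a made-up 50MB cap and a hypothetical batch_folders helper) would be to split the selected folders into several smaller downloads, so each request finishes inside the limit. The form could then offer one download link per batch instead of a single huge tar.

import glob
import os


def batch_folders( folders, maxBytes = 50 * 1024 * 1024 ):
    # Group folders so each batch's photos stay under a rough size cap. The 50MB
    # default is an arbitrary guess; tune it to what the server finishes in time.
    batches = []
    current = []
    size = 0
    for folder in folders:
        photoDir = os.path.join( folder, "photos" )
        photos = glob.glob( os.path.join( photoDir, "*.jpg" ) )
        photos += glob.glob( os.path.join( photoDir, "*.JPG" ) )
        folderSize = sum( [ os.path.getsize( p ) for p in photos ] )
        if current and size + folderSize > maxBytes:
            batches.append( current )
            current = []
            size = 0
        current.append( folder )
        size += folderSize
    if current:
        batches.append( current )
    return batches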

Thanks again,

Phil.
May 30 '07 #2
bartonc
Can't help with CGI scripts. Can help with posting code:
We use [code] tags that will maintain the indentation of your code.
Great work-around on your part, using nested [indent] tags, though.
It's all right there on the right-hand side of the page when posting or replying: 4 little things to keep in mind in * GUIDELINES...
May 30 '07 #3
