
Using Python/CGI to stream large tar files on the fly with Apache??

Hi,

First of all, sorry for the double post - this is on the Python board too, but as far as I can see it's an Apache issue now. Mods - feel free to delete my similarly titled post on the Python board if that's the case (I can't seem to do it myself!).

Anyway,

I'm running a CGI script, written in Python, that tars up the photos in user-selected directories (chosen via a form). It's running on Apache 1.3.31.1 on Solaris 5.8.

It works well for a small number of files, but when I try it on large archives (typically over 10MB, though this seems to vary) my download stops short. Occasionally I get an error message - usually a Broken Pipe (IOError 32) - but sometimes I see nothing at all (though that may just be a flushing issue with stderr on my webspace, in which case the Broken Pipe may be a red herring).

My understanding is that to get a broken pipe, either the client closed its download request prematurely (definitely not the case here), or Apache/Python is closing stdout on its side. I get no timeout error in the browser - it just stops dead, thinking the download is complete - but the tar is always corrupted and well short of the size I'd expect.

I would have thought that the stdout pipe to the user would stay open for as long as data is being streamed to it? Is it possible that my webspace provider has set some time limit? Could some sort of buffer under- or over-run be occurring on the stdout stream?
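If it helps, this is the kind of harness I could wrap around the writes to pin down exactly where the stream dies (a sketch only - produce_tar_chunks is a made-up stand-in for the real tar output):

import errno
import sys

def produce_tar_chunks():
    # Made-up stand-in for the real tar output; yields dummy data here.
    yield "x" * 10240

# Count bytes written so an EPIPE can be logged with a position.
bytes_sent = 0
try:
    for chunk in produce_tar_chunks():
        sys.stdout.write( chunk )
        sys.stdout.flush()
        bytes_sent += len( chunk )
except IOError, e:
    if e.errno == errno.EPIPE:
        sys.stderr.write( "pipe closed after %d bytes\n" % bytes_sent )
        sys.stderr.flush()
    else:
        raise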

I've tried various other approaches to solve this. I've ruled out using zips, as I cannot create the objects in memory (using cStringIO or similar) and stream them as one complete string, due to runtime memory limitations on my webspace (only 30MB as far as I can see!). The only other thing I can think of is to write a temporary file to disk and let the user download that - again not practical, as disk space is limited, and it's a bit ugly too.

The function causing the problem is below; it's pretty self-explanatory. If anyone has any ideas what might be causing this I'd be very grateful - it's driving me round the bend!

Thanks,

Phil.

import glob
import os
import string
import sys
import tarfile

def download( folders ):

    # HTTP headers; the "\n" gives the blank line that ends the header block.
    print "Content-type: application/x-tar"
    print "Content-Disposition: attachment; filename=\"Download.tar\"\n"

    # Write the archive straight to stdout as it is built.
    parentTarFile = tarfile.open( '', "w", sys.stdout )

    # (Needs "import signal" if uncommented. SIG_DFL would make the process
    # die silently on a broken pipe instead of raising IOError 32.)
    #signal.signal( signal.SIGPIPE, signal.SIG_DFL )

    for folder in folders:

        photoDir = os.path.join( folder, "photos" )
        if os.path.isdir( photoDir ):

            # We have photos! Collect both lower- and upper-case extensions.
            photos = glob.glob( photoDir + "/*.jpg" )
            photos += glob.glob( photoDir + "/*.JPG" )
            for photo in photos:

                # Archive each photo under its last three path components.
                parentTarFile.add( photo, string.join( photo.split( "/" )[-3:], "/" ) )

    parentTarFile.close()
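One aside from the tarfile docs: there is a dedicated streaming mode, "w|", which writes sequentially and never seeks - the documented choice for a non-seekable pipe like stdout. I haven't ruled out my plain "w" mode as a factor. A sketch with a made-up path:

import sys
import tarfile

# "w|" streams the archive sequentially and never seeks, which suits a
# non-seekable pipe like stdout better than plain "w" does.
tar = tarfile.open( mode="w|", fileobj=sys.stdout )
tar.add( "photos/example.jpg", arcname="example.jpg" )  # made-up path
tar.close()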
__________
Second message from original post:


To clarify: this is definitely not a *byte* limit on the streaming.

From my nice fast connection at work, I have no problem downloading 54MB for example, but if I crank it up to a larger 150-odd MB tar it falls over in exactly the same way.

With larger archives I can see, as the client tries to extract them, that the initial decompression goes fine - and from what I can tell the end of the file is hit unexpectedly.

This does look like a timeout on the Apache server. Is there one specific to CGI scripts, or would it be the universal Timeout directive in httpd.conf?

I was wondering if anyone could clarify, and perhaps suggest a workaround - as the page runs on a managed web server, I'm unlikely to be able to get them to change defaults that affect all users (I'm probably not that persuasive, and I appreciate that some of the limits are set for good reason).


Thanks again,

Phil.
May 30 '07 #1
Motoma
I hope you don't mind, I am going to ask some questions which you may have already answered (just to clarify in my mind).

My understanding of the situation (please correct me if I am wrong):
Your program is creating the headers, tar'ing the files, then printing the tar'ed data to stdout.
The size of the file that causes an error is dependent on the speed of the connection.

My questions:
How are you receiving the error messages?
You alluded to the fact that the broken pipe error was not the only one you receive; what are the others?
Does the Apache log give any clues as to why things might be shutting down?

Typically, web hosting companies will put restrictions on how much memory a process is allowed to take up; I believe the Apache directive for this is RLimitMEM. If what I stated above is true, this may be the cause of your problem: a slow connection might be causing the output buffer to exceed the upper memory limit. What you may want to try is writing the tar to a file and then performing a header redirect. Using the command line for tar may work better in this situation, as it will likely avoid the memory limit issue you seem to be having.
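Incidentally, if you can run code on the host, you can ask the process directly what limits it is under. A rough sketch (which RLIMIT_* names exist varies by platform, hence the getattr guard):

import resource
import sys

# Log whatever process limits the host has applied. Apache's RLimitMEM /
# RLimitCPU directives surface as rlimits like these; -1 means no limit.
for name in ( "RLIMIT_AS", "RLIMIT_DATA", "RLIMIT_CPU" ):
    rlimit = getattr( resource, name, None )
    if rlimit is not None:
        soft, hard = resource.getrlimit( rlimit )
        sys.stderr.write( "%s: soft=%s hard=%s\n" % ( name, soft, hard ) )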
May 30 '07 #2
> I hope you don't mind, I am going to ask some questions which you may have already answered (just to clarify in my mind).
No, not at all - thanks for the reply; see my answers below:

> My understanding of the situation (please correct me if I am wrong):
> Your program is creating the headers, tar'ing the files, then printing the tar'ed data to stdout.
> The size of the file that causes an error is dependent on the speed of the connection.
Spot on. I'm just printing a standard tar HTTP response header to stdout, followed by a tar file streamed to stdout on the fly, adding one JPEG file at a time.

The amount of data I ultimately receive at the client end is dependent on the speed of the connection, rather than on the amount of data I transfer. For any given connection speed there is a threshold over which I get an incomplete, and thus invalid, tar.

> My questions:
> How are you receiving the error messages?
> You alluded to the fact that the broken pipe error was not the only one you receive; what are the others?
> Does the Apache log give any clues as to why things might be shutting down?
This is a bit ugly, but I'm restricted to using the web hosting to reproduce the problem - the script works fine from an Apache setup at home over localhost (though of course the speeds are huge then, because there is no real network transfer to speak of).
The downside of this is that I cannot get my hands on the access_log and error_log files to see exactly what is going on. I have been piping stderr to a file in my webspace at the start of my CGI script, which I can then read in a browser.
Looking at this post-failure I have seen Python's IOError 32 raised - which is a broken pipe. However, sometimes I capture nothing at all: I've just tried again now and I see an incomplete list of filenames (I pipe these to stderr for logging too) and then... nothing. One possible explanation is Apache raising SIGPIPE, SIGTERM and SIGKILL in quick succession when I hit some preallocated time limit, before my stderr is flushed to the file. But I am speculating now.
I've made the file I'm piping stderr to have a zero-length buffer, but of course this doesn't change the size of the stderr buffer itself. If I catch an exception I may be able to write some generic exception information out, flush it, and then re-raise (granted I haven't tried this), but what I really need to see is Python's last barf as Python intended me to see it.
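For what it's worth, that catch-flush-re-raise idea, wrapped around the download() function from my first post, would look something like this (untried, as I say, and the log path is made up):

import sys
import traceback

# Untried sketch: point stderr at an unbuffered log file so partial lines
# reach disk immediately, then flush a traceback before re-raising.
sys.stderr = open( "/path/to/webspace/cgi_error.log", "a", 0 )

try:
    download( folders )
except Exception:
    traceback.print_exc( file=sys.stderr )
    sys.stderr.flush()
    raise

Though if Apache really does finish the job with a SIGKILL, nothing in Python can catch that, so the log could still end up empty.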

I do know from trying other methods that if I were hitting an out-of-memory condition I would expect Python to raise that specifically, rather than a broken pipe - at least, I saw that frequently when hitting the 30MB memory limit while originally trying to create a zip object in memory and stream it to the client as one big string.


> Typically, web hosting companies will put restrictions on how much memory a process is allowed to take up; I believe the Apache directive for this is RLimitMEM. If what I stated above is true, this may be the cause of your problem: a slow connection might be causing the output buffer to exceed the upper memory limit. What you may want to try is writing the tar to a file and then performing a header redirect. Using the command line for tar may work better in this situation, as it will likely avoid the memory limit issue you seem to be having.
This sounds feasible. Writing the tar file to my local filespace is no problem - are you suggesting I then shell out to a Unix command and pipe the result to stdout? I don't think I've ever tried a header redirect.
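By shelling out I had in mind something like this (a sketch with made-up paths) - though I realise it still streams through stdout, so it would presumably hit the same timeout if that's the culprit:

import os
import shutil
import sys

# Sketch with made-up paths: build the archive on disk first, then copy
# it to stdout in fixed-size chunks so memory use stays flat.
os.system( "tar -cf /tmp/download.tar -C /path/to/albums photos" )

print "Content-type: application/x-tar"
print "Content-Disposition: attachment; filename=\"Download.tar\"\n"

f = open( "/tmp/download.tar", "rb" )
shutil.copyfileobj( f, sys.stdout, 64 * 1024 )  # 64KB at a time
f.close()
os.remove( "/tmp/download.tar" )  # tidy up the temporary file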

Thanks again for your help,

Phil
May 30 '07 #3
Motoma
Now that is one thing I hadn't thought of: page execution timeout. I don't know off the top of my head what setting you could check for this, but you could probably Google it as easily as I can. You may want to try a few timing tests with different network connection speeds, paying attention to how long it takes before the transfer crashes. If you find it to be a consistent number (300 seconds is a common one), you may have found your problem.
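A cheap way to run that timing test is to stamp your stderr log with elapsed time - a sketch:

import sys
import time

# Sketch: stamp each log line with elapsed wall-clock time so a consistent
# cutoff (e.g. always dying around 300s) stands out in the stderr log.
start = time.time()

def log_progress( label ):
    sys.stderr.write( "%8.1fs  %s\n" % ( time.time() - start, label ) )
    sys.stderr.flush()

# e.g. call log_progress( photo ) inside the loop that adds each file.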

As for the header redirect, I believe all you would need to do is this (pseudo code of course):
import os

# Build the archive on disk; os.system goes through the shell,
# so the *.jpg glob is expanded there.
os.system("tar -czf outfile.tar.gz /path/to/my/jpegs/*.jpg")

# A CGI "Location:" header makes Apache send a redirect; outfile.tar.gz
# must end up somewhere web-accessible for the browser to fetch it.
print "Location: outfile.tar.gz\n"
May 31 '07 #4


