Bytes IT Community

Running an exec in the background, letting the page go on

I've a PHP script that does some stuff, zips up some files, and starts
an FTP process.

I need it to do the Zip and FTP in the background and let the page
finish, otherwise the user could have that page sitting there loading
for 20 minutes while the FTP completes.

Here's what I have:

$zipfile = $order_po . ".zip";
exec("/var/www/html/services/ftp.sh $zipfile 2>&1 &", $output2, $status);
echo "<div align='center' class='style1_hd'>Your auto-created ZIP file is currently being sent through FTP!</div></body></html>";
exit();
?>

(The ftp.sh is the FTP logon and transfer script.)
As you can see, I have the & to make it a background process, and I'm
redirecting stdout elsewhere.

Any ideas?

Thanks!
Liam

Apr 27 '06 #1
11 Replies


Ksu
run it in new window

Apr 27 '06 #2

Ksu wrote:
run it in new window

I'm sorry? Run it in a new window?
As in a new browser window?
That doesn't really solve the problem, just moves the problem to
another browser window which must likewise remain open for 10 to 40
minutes while the FTP completes.

Maybe there is no answer. =/
-Liam

Apr 27 '06 #3

ne**@celticbear.com wrote:
exec("/var/www/html/services/ftp.sh $zipfile 2>&1 &", $output2, $status);

Why doesn't this work?
As you can see I have the & to make it a background process, and
sending the stdoutput elsewhere.

Try pcntl_fork().
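
For reference, a minimal sketch of what a pcntl_fork() approach could look like — assuming the pcntl extension is available, which is generally only the case for CLI/CGI PHP, not the Apache module; the zip filename here is a stand-in:

```php
<?php
// Hypothetical sketch: fork, let the parent finish the page at once,
// and run the slow zip/FTP work in the child.
$pid = pcntl_fork();
if ($pid == -1) {
    die("could not fork");
} elseif ($pid == 0) {
    // Child: the long-running transfer goes here.
    exec("/var/www/html/services/ftp.sh order.zip");
    exit(0);
}
// Parent: returns to the user immediately.
echo "Transfer started.\n";
?>
```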

Apr 27 '06 #4


Sjoerd wrote:
ne**@celticbear.com wrote:
exec("/var/www/html/services/ftp.sh $zipfile 2>&1 &", $output2, $status);

Why doesn't this work?


I have no idea, which is why I'm asking here. =)
As you can see I have the & to make it a background process, and
sending the stdoutput elsewhere.

Try pcntl_fork().


Evidently that doesn't work when PHP is an Apache module, which is the
case for me. =/

Thanks for the reply! =)
-Liam

Apr 27 '06 #5


ne**@celticbear.com wrote:
Maybe there is no answer. =/


Maybe you can try something that isn't pure PHP. Have you considered
AJAX? When the page loads, you can send an asynchronous call to the
server, which would run your process in a separate PHP file. That PHP
file could post progress back to the original page at intervals, so
you could display progress to the user without blocking the data
transfer.
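
A minimal sketch of the client side of that idea, with hypothetical names; the check function stands in for the XMLHttpRequest that would ask the separate PHP file whether the zip/FTP job has finished:

```javascript
// Hypothetical sketch: poll for progress until the transfer finishes.
// "check" stands in for an async request to the status PHP file;
// "onDone" would update the page with the "transfer complete" message.
function pollStatus(check, onDone, intervalMs) {
  const timer = setInterval(() => {
    if (check()) {
      clearInterval(timer); // stop polling once the job reports done
      onDone();
    }
  }, intervalMs);
}
```

In the real page, check would fire the asynchronous request and inspect the response, so the user keeps a live page while the server works.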

Joseph

Apr 27 '06 #6

ne**@celticbear.com wrote:
I've a PHP script that does some stuff, zips up some files, and starts
an FTP process.

I need it to do the Zip and FTP in the background and let the page
finish, otherwise the user could have that page sitting there loading
for 20 minutes while the FTP completes.

Here's what I have:

$zipfile = $order_po . ".zip";
exec("/var/www/html/services/ftp.sh $zipfile 2>&1 &", $output2, $status);
echo "<div align='center' class='style1_hd'>Your auto-created ZIP file is currently being sent through FTP!</div></body></html>";
exit();
?>

(The ftp.sh is the FTP logon and transfer script.)
As you can see I have the & to make it a background process, and
sending the stdoutput elsewhere.

Any ideas?

Thanks!
Liam


I'm not quite sure it's the neatest solution, but for these kinds of
operations I create a todo file/table/etc. and have a crontab entry
check it every x minutes, then execute a script based on the parameters
given in the todo file. You can catch the output and store it wherever
you like.
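
A sketch of that todo-file approach in shell; mktemp stands in for a fixed shared path like /tmp/ftp_todo, and the filename is hypothetical:

```shell
# Hypothetical sketch of a todo-file queue shared between the web page
# and a cron-run worker.
QUEUE=$(mktemp)

# What the PHP page would do: append the job and return immediately.
printf '%s\n' "order123.zip" >> "$QUEUE"

# What the worker (run from crontab every x minutes) would do:
# take the first queued job, remove it from the file, process it.
if [ -s "$QUEUE" ]; then
  job=$(head -n 1 "$QUEUE")
  tail -n +2 "$QUEUE" > "$QUEUE.tmp" && mv "$QUEUE.tmp" "$QUEUE"
  echo "would ftp: $job"   # the real worker would run ftp.sh "$job" here
fi
```

The nice property is that the web request only ever appends a line, so it returns instantly regardless of how long the transfer takes.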

Arjen

Apr 28 '06 #7

On Thu, 27 Apr 2006 09:33:08 -0700, Sjoerd wrote:
exec("/var/www/html/services/ftp.sh $zipfile 2>&1 &", $output2, $status);

Why doesn't this work?


Because the child's stdout handle will still point to the pipe that the
Apache/PHP module is listening/blocking for input on.
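
The effect can be demonstrated from the shell. Command substitution, like PHP's exec(), reads the child's stdout until it closes; a backgrounded sleep stands in for the long FTP transfer here:

```shell
# Without redirection the background child inherits stdout and keeps
# the pipe open, so the caller would wait the full 2 seconds:
#   out=$(sh -c 'sleep 2 &')
# With stdout and stderr redirected, the pipe closes as soon as the
# shell exits, and the caller gets control back at once:
start=$(date +%s)
out=$(sh -c 'sleep 2 >/dev/null 2>&1 &')
end=$(date +%s)
echo "exec-style capture returned after $((end - start))s"
```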

I posted more about this in my recent thread:

Executing PHP Tasks While Letting A User Continue To Browse?
http://nextgen.url123.com/backgroundphp
(Google groups link for those of you that can't find it in your newsreader)

Cheers,
Andy

--
Andy Jeffries MBCS CITP ZCE | gPHPEdit Lead Developer
http://www.gphpedit.org | PHP editor for Gnome 2
http://www.andyjeffries.co.uk | Personal site and photos

Apr 28 '06 #8

On Fri, 28 Apr 2006 08:07:13 +0000, Andy Jeffries wrote:
On Thu, 27 Apr 2006 09:33:08 -0700, Sjoerd wrote:
exec("/var/www/html/services/ftp.sh $zipfile 2>&1 &", $output2, $status);

Why doesn't this work?


Because the stdout handle will still point to the handle the Apache/PHP
module is listening/blocking for input on.

I posted more about this in my recent thread:


Sorry, wrong thread in the earlier message, I meant this one:

Daemonising to continue asynchronously
http://nextgen.url123.com/asyncphp
(Google groups, blah, blah).

Cheers,
Andy
--
Andy Jeffries MBCS CITP ZCE | gPHPEdit Lead Developer
http://www.gphpedit.org | PHP editor for Gnome 2
http://www.andyjeffries.co.uk | Personal site and photos

Apr 28 '06 #9

On Thu, 27 Apr 2006 09:23:35 -0700, ne**@celticbear.com wrote:
run it in new window

I'm sorry? Run it in a new window?
As in a new browser window?
That doesn't really solve the problem, just moves the problem to another
browser window which must likewise remain open for 10 to 40 minutes
while the FTP completes.

Maybe there is no answer. =/


Don't lose hope, you definitely can do it. As per my earlier pointer, you
could do it by execing another PHP script that detaches from the parent
process and closes stdout (I posted code in that thread to do it) then
take your long action from there.

Another advantage is that you don't need to worry about using & to make
your .sh exec go into the background from that PHP script, so the .sh
can print errors and the PHP script can capture them and mail them to
you, or log them, if you like.

Cheers,
Andy

--
Andy Jeffries MBCS CITP ZCE | gPHPEdit Lead Developer
http://www.gphpedit.org | PHP editor for Gnome 2
http://www.andyjeffries.co.uk | Personal site and photos

Apr 28 '06 #10

Have a wrapper script which you call via your PHP script.

In there, have something like this:

***
nohup /var/www/html/services/ftp.sh $zipfile &
***
This way the process will detach itself from the script, let the
script complete, and all is dandy.
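
A runnable sketch of that wrapper idea; 'sleep 2' stands in for the real ftp.sh transfer, and the redirection matters as much as the nohup — it closes the caller's streams so PHP's exec() gets control back immediately:

```shell
# Hypothetical wrapper.sh sketch: nohup keeps the transfer alive after
# the wrapper exits, and redirecting all output means no pipe stays
# open back to the caller.
start=$(date +%s)
nohup sleep 2 >/dev/null 2>&1 &   # real script: ftp.sh "$1" in place of sleep
echo "wrapper finished after $(( $(date +%s) - start ))s; transfer pid $!"
```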

Apr 28 '06 #11

In <11**********************@g10g2000cwb.googlegroups.com>,
"ne**@celticbear.com" <ne**@celticbear.com> mentions:
I need it to do the Zip and FTP in the background and let the page
finish, otherwise the user could have that page sitting there loading
for 20 minutes while the FTP completes.

Here's what I have:

$zipfile = $order_po . ".zip";
exec("/var/www/html/services/ftp.sh $zipfile 2>&1 &", $output2, $status);
echo "<div align='center' class='style1_hd'>Your auto-created ZIP file is currently being sent through FTP!</div></body></html>";
exit();
?>


If it's the kind of thing that DOESN'T need to be distributed and
it's a UNIX-based host..

I would take a page from the cron suggestion, but, attempt to run it
through the 'batch' command. (On some hosts, batch is run from cron, the
two are kind of related)

Batch will handle all the nitty gritty about emailing you when it's done, etc..
Plus, it's nice to the system, waiting for a time when the machine isn't doing
a whole lot to run the commands. I use batch a LOT for this kind of thing.
(forking processes that need to "go away" when done) Easy on the system, no
worries about zombie processes or signal handlers, etc...

Problem: Batch might not like to be run as user 'nobody', so you
may have to pull some tricks to get around that. This is doable
with set-ID, but it's less than ideal.

Unfortunately, batch is one of those commands that is not well utilized.

If you can get it to work, batch is by far the best, since it won't be
connected to the web server in any way.

Let's hear it for batch! :-)

If it does need to be distributed, or batch won't work, you can do
it in Perl and call fork() (after closing stdio).

Or, just redirect stdio to and from /dev/null. Forking in Apache
is tricky business, especially when MySQL handles are open and
such.

Jamie
--
http://www.geniegate.com Custom web programming
gu******@lnubb.pbz (rot13) User Management Solutions
May 1 '06 #12
