
REQ: Backup of website

Hello

I am looking for a backup script to back up the webspace on my server
(www.servage.net). I need to take a backup twice per day, via a cron
job on www.cronjob.de, but I have been searching for two days without
result.

Where can I find some webspace / server backup scripts?

Kind regards
Henrik

May 13 '07 #1
3 Replies


On May 13, 2:48 pm, mrwheel <henrik.larse...@gmail.com> wrote:
> I am looking for a backup script to back up the webspace on my server
> (www.servage.net). I need to take a backup twice per day, via a cron
> job on www.cronjob.de, but I have been searching for two days without
> result.
>
> Where can I find some webspace / server backup scripts?
What control have you got over the server? Any large backup will take
CPU time and memory. If you have cron access I would suggest using
zip.
You might be able to find out what functions are available on your
system, and use one of them to recursively zip up a website.

For instance, try:
<?php
// tobezipped.txt holds the list of files to archive
$files = file_get_contents('tobezipped.txt');
// -o overwrites any existing archive, -9 is maximum compression
system('minizip -o -9 pdfs.zip ' . $files);
exit();
?>
where tobezipped.txt is a space-separated list of double-quoted filenames.
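
If the host has PHP's SPL iterators enabled, you could generate that
list with PHP as well. A minimal sketch; the web-root path below is a
placeholder for your actual document root:

<?php
// Sketch: build tobezipped.txt by walking the web root recursively.
// /home/user/public_html is a placeholder path.
$root  = '/home/user/public_html';
$files = array();
$iter  = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));
foreach ($iter as $entry) {
    if ($entry->isFile()) {
        // Double-quote each path so names containing spaces survive the shell
        $files[] = '"' . $entry->getPathname() . '"';
    }
}
file_put_contents('tobezipped.txt', implode(' ', $files));
?>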

May 13 '07 #2

In article <11**********************@p77g2000hsh.googlegroups.com>,
shimmyshack <ma********@gmail.com> wrote:
> What control have you got over the server? Any large backup will take
> CPU time and memory. If you have cron access I would suggest using
> zip.
> [snip]
> For instance, try:
> <?php
> $files = file_get_contents('tobezipped.txt');
> system('minizip -o -9 pdfs.zip ' . $files);
> exit();
> ?>
> where tobezipped.txt is a space-separated list of double-quoted filenames.
Good idea. Even better, in that system() call, would be:

system('nice -10 minizip -o -9 pdfs.zip ' . $files);

The 'nice' command runs the task at a lower priority so that it
doesn't interfere with more important work being done by the server.

Appending ' &' to the command may also let it run in the
background, so your PHP script doesn't have to wait for it to
complete:

system('nice -10 minizip -o -9 pdfs.zip ' . $files . ' &');

...but I am unsure if that will work.
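
For what it's worth, the PHP manual notes that for a command started
this way to keep running in the background, its output must be
redirected to a file or another stream; '&' alone won't do it. A
minimal sketch, reusing the same placeholder filenames as above:

<?php
$files = file_get_contents('tobezipped.txt');
// Redirecting stdout and stderr lets the shell return immediately;
// '&' then leaves the job running in the background.
system('nice -10 minizip -o -9 pdfs.zip ' . $files . ' > /dev/null 2>&1 &');
?>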

-A
May 13 '07 #3

On May 13, 11:27 pm, a...@spamcop.net (axlq) wrote:
> [snip]
I suppose it might be worth mentioning the old-fashioned way: good
old FTP.
FileZilla running at your house, with a scheduled job to go and fetch
everything.
Or PHP running at your house, with a cron job to start your custom
script, which uses FTP to grab everything.
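
If you take the PHP route, the built-in ftp extension covers the
fetching. A minimal sketch of a recursive mirror, assuming the ftp
extension is compiled in; the hostname, credentials, and both paths
are placeholders:

<?php
// Sketch: mirror a remote directory tree to a local folder over FTP.
// ftp.example.com, the credentials, and the paths are placeholders.
function ftp_mirror($conn, $remoteDir, $localDir)
{
    if (!is_dir($localDir)) {
        mkdir($localDir, 0755, true);
    }
    $items = ftp_nlist($conn, $remoteDir);
    if ($items === false) {
        return; // unreadable directory; skip it
    }
    foreach ($items as $item) {
        $name = basename($item);
        if ($name === '.' || $name === '..') {
            continue;
        }
        $remote = $remoteDir . '/' . $name;
        $local  = $localDir . '/' . $name;
        // ftp_size() returns -1 for directories on most servers
        if (ftp_size($conn, $remote) === -1) {
            ftp_mirror($conn, $remote, $local); // recurse into subdirectory
        } else {
            ftp_get($conn, $local, $remote, FTP_BINARY);
        }
    }
}

$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true); // passive mode usually works better through firewalls
ftp_mirror($conn, '/public_html', '/home/me/backups/' . date('Ymd-Hi'));
ftp_close($conn);
?>

Run that twice a day from cron on the home machine and you have your
two daily backups.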

May 14 '07 #4
