"Shabam" <ch*****@yomama-nospam.com> writes:
I have a command script that backs up a user account. This involves moving
files from different directories into an archive.
Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.
My directory structure is like this:
/Users/0/
/Users/1/
/Users/2/
/Users/3/
... so on...
User account names reside in those folders, so user jason would be in
"Users/j/jason".
Please don't tell me to just tar/gzip the /Users/ directory. That won't
work here because the archive would be larger than 4 GB, and it wouldn't
let me restore accounts individually.
Firstly, if you're not a Perl programmer, why do you plan to use Perl
for this task? It could easily be done with a plain bash script.
Secondly, your statement that you can't extract individual account
data from a single tar file is incorrect. You can extract individual
files, or groups of files, from a tar archive by naming them.
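For example, here is a small self-contained demo of archiving a tree and
then restoring only one account from it (the paths and file names are
made up for illustration):

```shell
# Build a throwaway directory tree to archive (illustrative paths):
mkdir -p Users/j/jason
echo "data" > Users/j/jason/profile.txt

# Archive the whole tree once:
tar -czf users-backup.tar.gz Users

# Later, restore ONLY jason's account by naming it on the command line:
rm -rf Users
tar -xzf users-backup.tar.gz Users/j/jason
ls Users/j/jason
```

The last command shows profile.txt back in place, even though nothing
else from the archive was extracted.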
The basic building blocks for your script are two loops. The outer
loop goes through the outer list of directories and for each of those,
the inner loop goes through the user accounts in each directory and
processes them in whatever way you want.
The Perl functions you probably want are opendir and readdir. Try
perldoc -f readdir. But to be honest, if you're not a Perl programmer,
save yourself time and just use bash (unless you want to learn Perl).
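The two-loop structure described above might look like this in bash. It's
only a sketch: it assumes your existing per-account script is called
backup-account and takes the account directory as its first argument;
adjust the name and arguments to match whatever your script actually is.

```shell
#!/bin/bash
# Outer loop: each top-level directory under /Users (0, 1, 2, ...).
for group_dir in /Users/*/; do
    # Inner loop: each user account directory inside it.
    for account_dir in "$group_dir"*/; do
        # Skip anything that isn't a directory (e.g. no matches).
        [ -d "$account_dir" ] || continue
        # Hypothetical per-account backup script; substitute your own.
        backup-account "$account_dir"
    done
done
```

Note the quoting around "$group_dir" and "$account_dir" so that account
names containing spaces don't break the loop.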
Tim
--
Tim Cross
The e-mail address on this message is FALSE (obviously!). My real e-mail is
to a company in Australia called rapttech and my login is tcross - if you
really need to send mail, you should be able to work it out!