
Feed a directory listing to a script

I have a command script that backs up a user account. This involves moving
files from different directories into an archive.

Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.

My directory structure is like this:

/Users/0/
/Users/1/
/Users/2/
/Users/3/
.... so on...

User account names reside in those folders, so user jason would be in
"Users/j/jason".

Please don't tell me to just tar/gz the /Users/ directory. That will not
work for this because it will be greater than 4GBs, and it won't allow me to
restore accounts individually.

Thanks for any help.
Aug 8 '05 #1
Shabam <ch*****@yomama-nospam.com> wrote in comp.lang.perl.misc:
I have a command script that backs up a user account. This involves moving
files from different directories into an archive.

Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.

My directory structure is like this:

/Users/0/
/Users/1/
/Users/2/
/Users/3/
... so on...

User account names reside in those folders, so user jason would be in
"Users/j/jason".
So what have you tried so far, and how does it fail?
Please don't tell me to just tar/gz the /Users/ directory. That will not
work for this because it will be greater than 4GBs,
So?
and it won't allow me to
restore accounts individually.


Ah, but it does. The problem is, you'd have to read through the entire
tar file, but you can restore any selection of files you want.
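
For instance, assuming the whole tree went into a single archive called users.tar (the name is just an example), one account can be pulled back out by naming its path, without unpacking anything else:

# Sketch only: archive name and member path are assumed, not from your setup.
system "tar", "xvf", "users.tar", "Users/j/jason";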

Anno
--
If you want to post a followup via groups.google.com, don't use
the broken "Reply" link at the bottom of the article. Click on
"show options" at the top of the article, then click on the
"Reply" at the bottom of the article headers.
Aug 8 '05 #2
Shabam wrote:
I have a command script that backs up a user account. This involves moving
files from different directories into an archive.

Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.
Here is a hint:

perl -le 'while(($user,$pw,$uid,$gid,$q,$c,$name,$home)=getpwent){print "~$user = $home for $name" if $uid > 100}'
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.


OK, here's another hint. Replace the print() part with this:

system "tar cvf $user.tar $home >$user.dir 2>>error.log";

-Joe

P.S. Next time, do not include comp.lang.perl; it has been replaced
by the comp.lang.perl.misc newsgroup.
Aug 8 '05 #3
Please don't tell me to just tar/gz the /Users/ directory. That will not
work for this because it will be greater than 4GBs,


So?


You don't get it, do you?
Aug 9 '05 #4
"Shabam" <ch*****@yomama-nospam.com> writes:
My directory structure is like this:

/Users/0/
/Users/1/
/Users/2/
/Users/3/
... so on...

User account names reside in those folders, so user jason would be in
"Users/j/jason".


You don't even need to use perl, you can do this directly in bash:

for k in /Users/*/*/; do run_backup_script "$k"; done

The perl equivalent would look similar but IIRC be a bit more involved.
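
Something like this, for instance (untested; run_backup_script stands in for whatever your existing backup command is actually called):

#!/usr/bin/perl
use strict;
use warnings;

# Same wildcard as the bash loop above, expanded by Perl's glob().
for my $dir (glob '/Users/*/*') {
    next unless -d $dir;                  # only account directories
    system 'run_backup_script', $dir;     # call the existing backup script
}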

Matthew

--
I must take issue with the term "a mere child," for it has been my
invariable experience that the company of a mere child is infinitely
preferable to that of a mere adult.
-- Fran Lebowitz
Aug 9 '05 #5
Shabam wrote:
Please don't tell me to just tar/gz the /Users/ directory.
That will not work for this because it will be greater than 4GBs,


So?


You don't get it do you?


Get what? Modern versions of tar can create archive files of
greater than 2 or 4 gigabytes.

linux% ls -l 5gigabyte.zip
-rw-r--r-- 1 jms jms 5751592946 May 3 19:37 5gigabyte.zip
linux% tar cf 5gb.tar 5gigabyte.zip
linux% ls -l 5gb.tar
-rw-r--r-- 1 jms jms 5751603200 Aug 9 22:30 5gb.tar

So why do you say 4GB won't work?

-Joe
Aug 10 '05 #6
On 2005-08-10, Joe Smith <jo*@inwap.com> wrote:
Shabam wrote:
Please don't tell me to just tar/gz the /Users/ directory.
That will not work for this because it will be greater than 4GBs,

So?


You don't get it do you?


Get what? Modern versions of tar can create archive files of
greater than 2 or 4 gigabytes.


Maybe the OP has a DAT drive that doesn't support tapes bigger than
2/4GB?

Justin.

--
Justin C, by the sea.
Aug 10 '05 #7
good
how do i know that ?
and this is a test..

Aug 23 '05 #8
"Shabam" <ch*****@yomama-nospam.com> writes:
I have a command script that backs up a user account. This involves moving
files from different directories into an archive.

Now, I need that script to back up all user accounts on the system, by going
through the directory structure and running the backup script on each one.
Can someone show me how this can be done? I'm not a perl programmer and
have only dabbled a bit in it.

My directory structure is like this:

/Users/0/
/Users/1/
/Users/2/
/Users/3/
... so on...

User account names reside in those folders, so user jason would be in
"Users/j/jason".

Please don't tell me to just tar/gz the /Users/ directory. That will not
work for this because it will be greater than 4GBs, and it won't allow me to
restore accounts individually.


Firstly, if you're not a perl programmer, why do you plan to use perl
for this task? This could easily be done with just a bash script.

Secondly, your statement about not being able to extract individual
account data from a single tar file is incorrect. You can extract
individual files or groups of files from a tar archive.

The basic building blocks for your script are two loops. The outer
loop goes through the outer list of directories and for each of those,
the inner loop goes through the user accounts in each directory and
processes them in whatever way you want.

The perl functions you probably want are opendir and readdir. Try
perldoc -f readdir, but to be honest, if you're not a perl programmer,
save yourself time and just use bash (unless you want to learn perl).
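
For what it's worth, a bare-bones opendir/readdir version of those two loops might look something like this (untested; run_backup_script again stands in for your existing command):

#!/usr/bin/perl
use strict;
use warnings;

my $top = '/Users';

# Outer loop: the single-character bucket directories (0, 1, ... j, ...).
opendir my $tdh, $top or die "Cannot open $top: $!";
for my $bucket (grep { !/^\.\.?$/ && -d "$top/$_" } readdir $tdh) {

    # Inner loop: the individual account directories inside each bucket.
    opendir my $bdh, "$top/$bucket" or die "Cannot open $top/$bucket: $!";
    for my $account (grep { !/^\.\.?$/ && -d "$top/$bucket/$_" } readdir $bdh) {
        system 'run_backup_script', "$top/$bucket/$account";
    }
    closedir $bdh;
}
closedir $tdh;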

Tim

--
Tim Cross
The e-mail address on this message is FALSE (obviously!). My real e-mail is
to a company in Australia called rapttech and my login is tcross - if you
really need to send mail, you should be able to work it out!
Sep 25 '05 #9
