
Closing file handlers when a script stops abruptly

P: 6
Hi

I have a Perl script which deals with files like this. My aim is to close the handlers safely even though the Perl script execution is stopped abruptly.

How can I do this?

Please give me a code sample. Thanks.

  1. open(STATUSLOG, ">>$StatusFile");
  2. print(STATUSLOG "COMPLETED");
  3. close(STATUSLOG);
Jan 17 '08 #1
5 Replies


numberwhun
Expert Mod 2.5K+
P: 3,503
Can you better define "stopped abruptly"? Are you referring to a routine in your script that exits on a condition? If so, put the close before the exit in that routine. Or are you referring to your script running and someone killing its process, abruptly ending its run? I don't know if that case can be handled, but I believe Perl should close the file handles for you if you don't.

Regards,

Jeff
Jan 17 '08 #2

prn
Expert 100+
P: 254
Hi finny,

A more or less standard idiom for handling cases where you think an error might occur is something like:
opendir(DIR, $the_dir) or die "Could not open $the_dir for read: $!\n";
In the case of simply closing file handles, you probably don't need to worry. Open file handles are closed automatically. But, sometimes you want to do some other cleanup before your program dies.
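For instance, here is a minimal sketch (the status file name is just an example, not from your script) of letting an END block do the closing. END blocks run both on a normal exit and when the script dies, though not when the process is killed outright:

#!/usr/bin/perl
use strict;
use warnings;

my $StatusFile = "status.log";   # example path only

open(STATUSLOG, ">>$StatusFile") or die "Cannot open $StatusFile: $!\n";

# The END block runs at normal exit and after an uncaught die(),
# so the handle gets flushed and closed either way. It will NOT run
# if the process is killed with an uncatchable signal (kill -9).
END {
    if (defined fileno(STATUSLOG)) {
        close(STATUSLOG) or warn "Could not close $StatusFile: $!\n";
    }
}

# ... the rest of the script's work goes here ...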

Here's a sample of the sort of thing I have been known to do.
#! /bin/perl -w
use File::Basename;
use FileHandle;
use IO::Socket;
use strict;

my $mydir  = dirname($0);
my $myname = basename($0);
my $mylog  = "$mydir/logs/$myname.log";
my $errlog = "$mydir/logs/$myname.err";
...
opendir (DIR, $the_dir) or ERREXIT("Could not open $the_dir for read: $!\n");
...
sub ERREXIT {
  my $message = shift;
  # Shut down the helper processes whose PIDs are recorded in their .pid files
  kill TERM => `cat $Checker.pid`;
  kill TERM => `cat $Zuni.pid`;
  print MYLOG "$myname error exit at ", scalar localtime, "\n";
  if ($DEBUG) {
    # Dry run: log what would have happened, but send no mail
    print "Error exit here, in debug mode, with message \"$message\" \n";
    &logit ("TEST: $myname exits with message \"$message\"");
    close STDERR;
    close SAVEERR;
    print MYLOG "#====#====#====#====#====#====#====#====#\n\n";
    close MYLOG;
  } else {
    print STDERR "$message\n";
    print MYLOG "$message\n";
    print MYLOG "Sending mail and quitting \n";
    &logit ("NOTICE: $myname exits with message \"$message\"");
    close STDERR;
    close SAVEERR;
    print MYLOG "#====#====#====#====#====#====#====#====#\n\n";
    close MYLOG;
    # Mail the accumulated logs to everyone listed in people.pm
    system "cat $mydir/logs/$myname.* | mail -s \"$message\" $people";
    print "Mail sent\n";
  }
  exit 1;
}
This example is from a nightly (about 2 a.m.) cleanup program that needs to shut down a web server, do a number of tasks that cannot be done with the server running, and then start it back up again. I log EVERYTHING. (I'm really paranoid about this: it's a very important process and I need to know that it is working properly, and especially if it fails.) I redirect STDERR to $mylog (with the normal STDERR saved in SAVEERR; see the Camel book for an example). I also have a variable $DEBUG that I can set so that I can do a dry run, logging everything I would do without actually doing it.

While this process is running, one of the first things it does is start up "zuni", a substitute "web server". (Zuni is not Apache; it is just a very simple Perl script that I wrote that does not actually serve anything except a message telling anyone who hits the site that it is temporarily down and should be back up soon, giving an estimated time.) When the main cleanup process exits, it kills zuni so that Apache can have the port back. If an error occurs, I have to do the same thing or I will not be able to start Apache back up. So this ERREXIT sub does that, kills another spawned process as well, and logs the error message both in its own logs AND in syslog, so that yet another process that monitors the system (including the logs) will note it and notify the computer-room operator in a standard way. It also sends the logs by email to me and whoever else is listed in people.pm (where several different notification routines look for the list). Then it finally closes a couple of file handles and exits with an error condition (1).

Your error-handling requirements may turn out to be less complicated, but this should give you some clues about the kind of thing that you might do in an error-exit cleanup routine.

HTH,
Paul
Jan 17 '08 #3

prn
Expert 100+
P: 254
I forgot to mention, in my discussion of the error-exit routine, that you can catch process kills as well. A "nice" kill ought to show up as $SIG{TERM} and can be caught with:
$SIG{QUIT} = $SIG{TERM} = \&Scram;
where, in this case, sub Scram() logs the signal it received and can then pass a message on to ERREXIT for an overall graceful close. Of course, "kill -9" is less "nice" and does not give the process the opportunity to exit gracefully, but then neither does pulling the plug or various other ways of killing the whole thing. For those, there's just no graceful way of handling anything.
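As a rough sketch (assuming a cleanup routine like the ERREXIT sub from my earlier post; the log wording is only illustrative), the handler might look like this:

# Install the handlers near the top of the script
$SIG{QUIT} = $SIG{TERM} = \&Scram;

sub Scram {
    my $signal = shift;    # Perl passes the signal name ("TERM", "QUIT", ...)
    warn "Caught SIG$signal, cleaning up\n";
    # Hand off to the same cleanup path used for ordinary errors
    ERREXIT("terminated by SIG$signal");
}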

Paul
Jan 17 '08 #4

P: 6
I define "abruptly" like this: "The code has opened the handler and is processing the second line of code. The console window is closed at that moment. So what will happen to the third line, which closes the file handler? Will it close the handler or ...?"

I believe I have explained it clearly. If you still need more explanation, please get back to me.

Thanks. Looking forward to your help.

Jan 18 '08 #5

prn
Expert 100+
P: 254
Hi finny,

What happens to a process when a terminal window is closed unexpectedly (generally speaking, "console" implies a physical device, or at least a KVM, to me) is not predictable in the abstract; a lot depends on too many other factors. For one thing, you have not even told us what kind of OS you are dealing with. I tend to figure *ix more or less by default, but that is not at all guaranteed. Even there, it all depends on how the process tree is set up.

Now, back to the original question. IF you are dealing with any moderately well-behaved OS, then file handles will be closed and released when the process ends. (This does not guarantee that cached writes will actually make it into the files on disk.) Of course, you did not say "file handles", you said "handlers" and that is not a well-defined term, so who knows? Other types of cleanup may need to be handled differently, and that was the point of my posts.
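If the cached-writes part worries you, one common precaution (not something you asked about; the file name below is only an example) is to turn on autoflush for the status handle, for instance with a lexical filehandle and IO::Handle:

use strict;
use warnings;
use IO::Handle;    # gives filehandles an autoflush() method

my $StatusFile = "status.log";    # example path only

# Three-argument open with a lexical filehandle, just for illustration
open(my $statuslog, ">>", $StatusFile) or die "Cannot open $StatusFile: $!\n";
$statuslog->autoflush(1);         # write through instead of buffering
print $statuslog "COMPLETED\n";
close($statuslog);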

It is utterly impossible to handle all types of abrupt termination in code. Some types of abrupt termination may be physical, for example.

There are numerous ways of insuring against different kinds of events. If you consider it both (adequately) likely and seriously bad for a user to close his/her terminal and walk away, then (again, depending on OS and other facts about your environment) you could do something like making your "main program" just be a shell that starts the real operation as a detached process; a sketch of that idea follows below. We've suggested some possibilities, but at the level of abstraction we are stuck in, we're not going to solve any real-world problems. Without a clue about what kind of program you are talking about, what kind of environment you are talking about, what kind of events you are worried about, etc., we're stuck.
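On a *ix system, for example, a very rough sketch of that wrapper idea (the script and log names are just placeholders) could be:

#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# The wrapper forks; the parent (still tied to the terminal) exits at once,
# and the detached child keeps running even if the terminal window is closed.
defined(my $pid = fork()) or die "fork failed: $!\n";
exit 0 if $pid;                              # parent returns to the shell

setsid() != -1 or die "setsid failed: $!\n"; # new session, no controlling terminal

# Re-point the standard handles so nothing still references the old terminal
open(STDIN,  "<",  "/dev/null")   or die "Cannot reopen STDIN: $!\n";
open(STDOUT, ">>", "wrapper.log") or die "Cannot reopen STDOUT: $!\n";   # placeholder log
open(STDERR, ">&", \*STDOUT)      or die "Cannot dup STDERR: $!\n";

# Now hand control to the real script; the name is only a placeholder
exec("./real_work.pl") or die "exec failed: $!\n";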

Now if you want to expand on your question, we might have a chance of helping you find a meaningful answer.

Regards,
Paul
Jan 18 '08 #6
