I'm not too sure if this will help, but I got it from Oracle's website...
Rotating logs
The log files will quickly grow quite large because of the performance-logging
data. You may wish to set up a rotation system so that the logs are archived
periodically.
Note that on UNIX, the logs are kept open by the running process, so simply
doing a 'mv' will not be effective.
Example:
export date=`date +%Y.%m.%d_%H:%M`
export ORACLE_HOME=$HOME/app/midtier
export LOG_ARCHIVE=$HOME/logging/archive
export LOG_HOME=$HOME/logging
cd $ORACLE_HOME/Apache/Apache/logs
cp error_log $LOG_ARCHIVE/error_log.$date && cat /dev/null > error_log
cd $ORACLE_HOME/j2ee/OC4J_Portal/application-deployments/portal/OC4J_Portal_default_island_1
cp application.log $LOG_ARCHIVE/application.log.$date && cat /dev/null > application.log
cd $ORACLE_HOME/webcache/logs
cp access_log $LOG_ARCHIVE/access_log.$date && cat /dev/null > access_log
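A quick generic demo (not Oracle-specific) of why the copy-then-truncate
approach above works where 'mv' does not: the running process keeps writing
to the same inode, and truncating in place keeps that inode, while 'mv'
would carry it away with the renamed file. The temp directory and file names
here are just placeholders:

```shell
# Set up a throwaway "log" file
tmpdir=$(mktemp -d)
echo "some log data" > "$tmpdir/error_log"

inode_before=$(ls -i "$tmpdir/error_log" | awk '{print $1}')

# Archive the contents, then empty the live file IN PLACE -
# the inode (which the writing process holds open) is unchanged.
cp "$tmpdir/error_log" "$tmpdir/error_log.archived"
cat /dev/null > "$tmpdir/error_log"

inode_after=$(ls -i "$tmpdir/error_log" | awk '{print $1}')

echo "inode unchanged: $([ "$inode_before" = "$inode_after" ] && echo yes || echo no)"
echo "live file size: $(wc -c < "$tmpdir/error_log" | tr -d ' ')"
rm -rf "$tmpdir"
```

Running it prints "inode unchanged: yes" and a live file size of 0, which is
exactly what lets the still-running process continue logging uninterrupted.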
It is suggested that you load the archived log files rather than the current
ones, to eliminate the need to delete old data first. If you collect and load
the current log files, be sure to specify -delete_old_logs; otherwise,
reloading these files will cause duplicate data to be loaded.
Automation
You can install the previous script as a cron job. See the UNIX man page for
cron for more information.
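For example, if the rotation commands above are saved in a script (the path
$HOME/bin/rotate_ias_logs.sh below is just a placeholder), a crontab entry
like this would run it nightly at 2:00 AM:

```shell
# Install with: crontab -e
# min hour day-of-month month day-of-week  command
0 2 * * * /bin/sh $HOME/bin/rotate_ias_logs.sh >> $HOME/logging/rotate.log 2>&1
```

One caveat: cron jobs run with a minimal environment, so the script itself
should set ORACLE_HOME, LOG_ARCHIVE, and LOG_HOME as in the example above
rather than relying on variables from your login shell.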
Purging/deleting old data
The amount of data in the OWA_LOGGER table can grow quite large. You should
purge or delete old data periodically.
If you have only a few physical hosts, this can be combined with loading new
data:
perl loadlogs.pl -logical_host <host> -physical_hosts "<host1 host2 host3...>" -delete_old_logs
or
perl loadlogs.pl -logical_host <host> -physical_hosts "<host1 host2 host3...>" -purge_old_logs <n>
However, it is much more efficient to purge data for all hosts at once
before loading new data:
perl loadlogs.pl -purge_old_logs <n>
perl loadlogs.pl -logical_host <host> -physical_hosts "<host1 host2 host3...>"
Automation
You can install the previous commands as a cron job. See the UNIX man page
for cron for more information.
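As a sketch, the purge-then-load sequence could be wrapped in a single script
for cron. The host names, retention value, and script location below are
placeholders, and loadlogs.pl is assumed to be invoked from its own directory
as in the examples above:

```shell
#!/bin/sh
# Hypothetical wrapper: purge once for all hosts, then load per logical host.
KEEP_DAYS=30   # value passed as -purge_old_logs <n>; adjust to taste

# One purge pass covering every host (the more efficient order):
perl loadlogs.pl -purge_old_logs $KEEP_DAYS || exit 1

# Then load fresh data for each logical host (names are placeholders):
perl loadlogs.pl -logical_host myportal -physical_hosts "web1 web2 web3"
```

Scheduled from cron the same way as the rotation script, this keeps the
OWA_LOGGER table bounded without a separate manual purge step.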
I do hope it helps,
Scott
<sh*****@igmax.com> wrote in message
news:11**********************@f14g2000cwb.googlegroups.com...
Chris,
Thanks for your response. The file is an Oracle 9iAS log file that
gets written to every few seconds (apache/logs/access_log). It grows
by about 20 meg each day. I have some third-party software (FogLight)
that monitors Oracle for me, but that file grew beyond what FogLight
can open. (It is currently at 6 gig.)
I tried to work with Oracle to come up with a solution, but their
"fixes" didn't work, and the work-around they have is to archive and
truncate that file on a regular basis. That's not a problem, except
that I can't get it to truncate! It is being used by some process.
Oracle told me which "service" it is that keeps that file open, and I
tried shutting that service down (in fact I shut down every oracle
service on the machine!) but I get the same error.
I tried booting in Safe Mode, but Server 2003 doesn't give me that
option in its startup sequence (either that or I'm just a programmer
instead of a network administrator and I'm missing something obvious.)
I see the previous post that mentions a piece of software that tells you
what process is holding a file open. Perhaps that will do it for me!
Thanks for your post!
Shandra