On Nov 13, 4:44 pm, Roger <wondering...@gmail.com> wrote:
I have a Siebel CRM application that's cutting and archiving logs every
minute. Here is the db cfg:
Log buffer size (4KB) (LOGBUFSZ) = 512
Log file size (4KB) (LOGFILSIZ) = 15000
Number of primary log files (LOGPRIMARY) = 30
Number of secondary log files (LOGSECOND) = 40
Group commit count (MINCOMMIT) = 1
How can I reduce the number of logs being cut? I don't understand
what's driving DB2 to cut logs every minute.
Roger
Hi Roger,
The workload generally drives the amount of data that is logged and in
this case it's about 60MB (15000 4K pages) per minute. Does that seem
unreasonable given what's happening on the system? Is this actually a
problem for you, or are you just wondering what's happening? You
could consider increasing LOGFILSIZ if that would help (though depending
on how you're using the archived logs -- are they for DR purposes? --
you'll need to take the longer interval between archives into
account).
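For example, here's a quick sanity check of the log volume, plus the cfg update to double the log file size. These commands are a sketch: SAMPLE stands in for your actual database name, and the LOGFILSIZ change only takes effect after the database is deactivated and reactivated.

```shell
# 15000 4K pages per log file, in MB -- roughly 58 MB per archived log
echo $(( 15000 * 4 / 1024 ))

# Inspect the current logging-related settings (SAMPLE is a placeholder)
db2 get db cfg for SAMPLE | grep -i log

# Double LOGFILSIZ to 30000 4K pages (~117 MB per log file), which
# halves the archive frequency for the same logging rate
db2 update db cfg for SAMPLE using LOGFILSIZ 30000
```

Keep in mind that larger log files also mean more data at risk between archives, which is why the DR question matters.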
Another possibility is that the application is connecting, doing some
work, disconnecting (leaving nobody else connected), and then
repeating the process. In this case the current log will be truncated
and it will be archived when the database starts up next. If this is
in fact the problem then you could consider keeping the database
activated (using ACTIVATE DATABASE). I don't know Siebel's behavior
but I would guess that this probably isn't the problem. Are the
archived logs all about 60MB in size, or do they generally look like
they've been truncated?
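If it does turn out to be the connect/disconnect pattern, keeping the database explicitly activated is a one-line fix (again, SAMPLE is a placeholder for your database name):

```shell
# Keep the database activated so the active log isn't truncated
# every time the last application disconnects
db2 activate database SAMPLE

# To let it deactivate again later:
db2 deactivate database SAMPLE
```

An explicitly activated database stays up until you deactivate it, so the log keeps filling to its full size regardless of connection churn.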
Regards,
Kelly Schlamb