Hi,
we have an application which uses online backups, so archive logging
is a requirement. We have a small-to-moderate-size database (a few
hundred megabytes isn't much by today's standards) with relatively low
insert/update activity (a few thousand per day).
How should I set up archive logging (number of primary and secondary
log files, size of each log file)?
Also, we set log archiving to manual. What log files are safe to
archive? Currently we create an online backup every day at 2:00 AM
(the backup includes log files), keep the last 10 such backups, and
delete all log files older than the oldest backup.
The backup command is
BACKUP DB mydata ONLINE TO d:\db2_backup COMPRESS UTIL_IMPACT_PRIORITY 1
INCLUDE LOGS
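Our nightly cleanup is essentially the following (a simplified Python sketch, not our actual script; the file names and timestamps are made up for illustration):

```python
from datetime import datetime

def logs_older_than_oldest_backup(log_files, oldest_backup_time):
    """Return the log files whose timestamp predates the oldest
    retained backup; these are the ones our nightly job deletes.
    log_files is a list of (name, datetime) pairs."""
    return [name for name, mtime in log_files if mtime < oldest_backup_time]

# Illustrative data: three archived logs and the oldest retained
# backup, taken at 2:00 AM.
logs = [
    ("S0000100.LOG", datetime(2024, 1, 1, 23, 0)),
    ("S0000101.LOG", datetime(2024, 1, 5, 23, 0)),
    ("S0000102.LOG", datetime(2024, 1, 9, 23, 0)),
]
oldest_backup = datetime(2024, 1, 4, 2, 0)
print(logs_older_than_oldest_backup(logs, oldest_backup))
# prints ['S0000100.LOG']
```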
Logging is set up as
UPDATE DB CFG FOR mydata USING logarchmeth1 LOGRETAIN logprimary 16
logsecond 240 logfilsiz 4096 newlogpath d:\db2_logs
It is my understanding that this sets up 16 files, 16 MB each, for a
total of 256 MB, with the possibility to go up to a total of 256
files, or 4 GB, but only if needed (the application is installed at
multiple remote servers which we cannot access every day, so I thought
this would be on the safe side). In 10 days, the log files use up
about 20 GB. What do you recommend to keep the size lower? Is it safe
to remove all log files older than the youngest backup? Should I
change logfilsiz or logprimary/logsecond?
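For scale, the numbers above work out to roughly this per day (plain arithmetic from the figures in this post):

```python
# Back-of-the-envelope from the figures above.
total_log_gb = 20   # log space consumed over the retention window
days = 10           # retention window (10 daily backups)
file_size_mb = 16   # logfilsiz 4096 pages * 4 KB

gb_per_day = total_log_gb / days                   # 2 GB of logs per day
files_per_day = gb_per_day * 1024 / file_size_mb   # ~128 log files per day

print(gb_per_day, files_per_day)
# prints 2.0 128.0
```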
TIA,
Kofa