In article <1141912411.265479.258040@p10g2000cwp.googlegroups.com>, mm****@gmail.com says...
> Hello Everyone,
> I am breaking my head over this problem. Please help me out. Let me
> first explain it:
> I am working in a realtime environment where my system creates log
> files internally while it is running.
> I have a logfile.txt in which I have to store a maximum of 150
> lines. Suppose that every 60 seconds, 50 lines are appended to my
> log file. Once it exceeds 150 lines, my program should remove the
> first 50 lines and append the incoming 50 lines at the end.
As stated, this simply isn't possible (at least not
portably): standard C gives you no way to remove data
from the beginning of a file in place.
Your choices are 1) create something more or less along
the lines of a database, or 2) create the file anew each
time it's needed.
Given the size of the file, the latter is probably the
better option in this case, unless your lines are _really_ long.
With typical log file entries, you're looking at 150x40
(average) => around 6K. That just doesn't justify the
overhead of a database style of format (which wouldn't be
readable as normal text anyway).
If you have some memory available, I'd consider reading
the data in at startup, holding it in a circular buffer
in memory, and writing it out as needed. This way you're
not reading the old file back into memory on a regular
basis, cutting your I/O roughly in half.
If the log file is really crucial, instead of overwriting
the old log file, you're probably better off writing the
data to a new file, then deleting the old file and
renaming the new file to the old name.
--
Later,
Jerry.
The universe is a figment of its own imagination.