I have written a Perl script to parse some large flat-text logs. The
logs are bzipped and come in through a pipe to STDIN. The script runs
some regular expressions against the incoming data and then prints
matches to STDOUT.
The script works great, but the problem is that it uses as much memory
as the amount of data fed into it. If I pipe an 800MB file into it,
memory usage grows and grows until it reaches approximately 800MB, and
none of that memory appears to be released until the script is
completely finished. Is there a way to have Perl not use so much of the
system memory? Here is a sample of what I am doing:
#!/usr/bin/perl -w
use strict;

# If a file name is given, replace STDIN with a pipe from bzcat.
if ($ARGV[0]) {
    open STDIN, "bzcat $ARGV[0] |"
        or die "Can't uncompress file as a pipe\n$!\n";
}

foreach (<STDIN>) {
    chomp;
    if ($_ =~ /(somedata)/) {
        print "$1\n";
    }
}
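
I suspect the foreach (<STDIN>) line is the culprit, since as far as I
understand it reads the whole stream into a list before the loop even
starts. Would switching to a line-by-line while loop, like the untested
sketch below (same placeholder regex), keep the memory flat?

#!/usr/bin/perl -w
use strict;

if ($ARGV[0]) {
    open STDIN, "bzcat $ARGV[0] |"
        or die "Can't uncompress file as a pipe\n$!\n";
}

# while (<STDIN>) reads one line per iteration into $_ instead of
# slurping the entire input into a list up front.
while (<STDIN>) {
    chomp;
    if ($_ =~ /(somedata)/) {
        print "$1\n";
    }
}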