Perl - Memory Issues

I have written a Perl script to parse some large flat-text logs. The
logs are bzipped and come in through a pipe to STDIN. The script then
applies some regular expressions to the incoming data and prints the
results to STDOUT.

The script works great, but the issue I have is that the script uses
as much memory as the data coming into it. Therefore if I pipe an 800MB
file into it, the memory usage grows and grows until it reaches
approximately 800MB, and it doesn't appear to release any of the memory
until it is completely finished. Is there a way to have Perl not use
so much of the system memory? Here is a sample of what I am doing.

#!/usr/bin/perl -w

use strict;

if ($ARGV[0]) {
    open STDIN, "bzcat $ARGV[0] |"
        or die "Can't uncompress file as a pipe\n$!\n";
}

foreach (<STDIN>) {
    chomp;
    if ($_ =~ /(somedata)/) {
        print "$1\n";
    }
}
Jul 19 '05 #1
James B. wrote:
The script works great, but the issue I have is that the script uses
as much memory as the data coming into it. Therefore if I pipe an 800MB
file into it, the memory usage grows and grows until it reaches
approximately 800MB, and it doesn't appear to release any of the memory
until it is completely finished. Is there a way to have Perl not use
so much of the system memory? Here is a sample of what I am doing.
[...] foreach (<STDIN>) {


Well, here you are building the complete list of all lines from STDIN,
and only then does foreach loop through them.
Why don't you use the more typical

while (<STDIN>)

which processes the input line by line?
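
For example, a minimal sketch of the line-by-line version (using the same
placeholder pattern as in the original post):

# Reads one line at a time instead of slurping all of STDIN into a list.
while (<STDIN>) {
    chomp;
    print "$1\n" if /(somedata)/;
}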

jue
Jul 19 '05 #2
"James B." <ja************@jsc.nasa.gov> wrote in message
news:75**************************@posting.google.com...
I have written a Perl script to parse some large flat-text logs. [...]
Is there a way to have Perl not use so much of the system memory?
Here is a sample of what I am doing.

[...]
foreach (<STDIN>) {
    chomp;
    if ($_ =~ /(somedata)/) {
        print "$1\n";
    }
}

Change

foreach (<STDIN>) {

to

while (<STDIN>) {

The foreach construct reads every line into memory first and then loops
through them one at a time.
The while construct reads one line at a time and executes the loop body for each.
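
Putting it together, a rough sketch of the whole script with that one change
applied (the /(somedata)/ pattern is still just a placeholder, and the bzcat
pipe open is kept as in the original post):

#!/usr/bin/perl -w

use strict;

# Open a bzcat pipe on STDIN if a filename was given on the command line.
if ($ARGV[0]) {
    open STDIN, "bzcat $ARGV[0] |"
        or die "Can't uncompress file as a pipe\n$!\n";
}

# while reads one line at a time, so memory use stays flat
# no matter how large the incoming log is.
while (<STDIN>) {
    chomp;
    print "$1\n" if /(somedata)/;
}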

--
Shawn

Jul 19 '05 #3
Wow, what a simple solution. Works great, thanks.

James
Jul 19 '05 #4
