This is really a reply to
http://www.thescripts.com/forum/thread50187.html
I have a script that reads in a bunch of data from txt files, then does a bunch of calculations, and outputs other txt files. I've got about 7 input txt files, each of which is 14-17 MB. If I open Windows Task Manager's process list I can watch perl using increasing amounts of memory as it reads in the txt files until it chokes, at about 1.7 GB (I have 2 GB on my machine), which is about the 6th input txt file.
Below is a typical code block showing how I'm reading in data. Can anybody suggest a way to do the same thing without using as much memory? If not, what's the easiest way for me to tell what in my script is taking all the memory? Any help is greatly appreciated.
    open TRAIL_FILE, '<', 'perl-inputs/fire_trails.asc.txt'
        or die "Can't open trail file: $!";
    my $counter = 0;
    while (<TRAIL_FILE>) {
        chomp;
        @{$trail_lines{$counter}} = split /\s+/, $_;
        $counter++;
    }
    close TRAIL_FILE;

    for (my $row = 6; $row < $nrows + 6; $row++) {
        for (my $col = 0; $col < $ncols; $col++) {
            $trails{$row}{$col} = $trail_lines{$row}[$col];
        } # close for on $col
    } # close for on $row
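One thing I've been experimenting with (a sketch only, assuming the first 6 lines of each file are headers I can skip, as the `$row = 6` start above suggests) is keeping a single array-of-arrays instead of building `%trail_lines` and then copying it into a second hash-of-hashes, which holds two full copies of the grid in memory at once:

    # Sketch: read once, skip the header lines, keep one array-of-arrays.
    # Assumes whitespace-separated values and 6 header lines, per the
    # snippet above; adjust the skip count for your actual files.
    use strict;
    use warnings;

    my @trails;   # $trails[$row][$col], rows counted from the first data line
    open my $fh, '<', 'perl-inputs/fire_trails.asc.txt'
        or die "Can't open trail file: $!";
    my $line_no = 0;
    while (my $line = <$fh>) {
        next if $line_no++ < 6;   # skip headers instead of storing them
        chomp $line;
        push @trails, [ split /\s+/, $line ];
    }
    close $fh;

Arrays indexed by integer also have less per-element overhead than hashes keyed by stringified numbers, so this should cut the footprint on both counts. For finding out what's actually eating the memory, the Devel::Size module from CPAN has a `total_size` function that reports the bytes used by a structure like `\%trail_lines`.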