Bytes | Developer Community
PHP Read Text File

Hey, I read some tips and pointers on PHP.net but can't seem to solve this
problem. I have a PHP page that reads the contents of a file and then
displays the last XX lines of it. The problem is this: whenever the file
gets larger than ~5MB, the page just displays nothing, as though a timeout
has occurred, but I get no error. At 4.8MB (the last confirmed size) the
function still works. Any ideas what the code below is lacking?

<?php
// Number of trailing lines to display (this was previously undefined)
$int_lines = 50;
$arrLog = array();
$handle = fopen("/var/log/myfile", "r");
if ($handle) {
    while (!feof($handle)) {
        $arrLog[] = fgets($handle, 4096);
    }
    fclose($handle);
}

$int_number_of_lines = count($arrLog);
if ($int_number_of_lines == 0)
{
    echo '<p><strong>No lines read.</strong></p>';
}
if ($int_number_of_lines < $int_lines)
{
    $int_lines = $int_number_of_lines;
}
$int_firstline = $int_number_of_lines - $int_lines;
echo 'Showing the last '.$int_lines.' lines out of '.
$int_number_of_lines.'<BR />';
echo "<TABLE WIDTH=100% CLASS=\"mail\">\n";
for ($i=$int_firstline; $i<$int_number_of_lines; $i++)
{
    echo "<TR><TD>".$arrLog[$i]."</TD></TR>\n";
}
echo "</TABLE>\n";
?>
Jun 2 '08 #1
tl****@gmail.com wrote:
Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. I have a php page that reads the contents of a file and then
displays the last XX lines of the file.
<?php
passthru("tail -n $lines $file");
?>

Cheers,
--
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Proudly running Debian Linux with 2.6.24-1-amd64 kernel, KDE 3.5.9, and PHP
5.2.5-3 generating this signature.
Uptime: 00:28:14 up 10 days, 15:27, 2 users, load average: 0.89, 0.35,
0.18

Jun 2 '08 #2
tl****@gmail.com wrote:
> I have a php page that reads the contents of a file and then displays
> the last XX lines of the file. Whenever the file gets larger than
> ~5MB, the page just displays nothing.
$build="/var/log/myfile";
$contents=file_get_contents($build);
$demo=explode("\n",$contents);

Something like the above might be worth a try .
For the benefit of any newbies .....

The above reads the file in one go, and the explode() call in effect
turns $demo into $demo[0], $demo[1], $demo[2] etc etc - the advantage
being that this is much faster than reading the text file one line at a
time .
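In the same spirit, a hedged sketch (the helper name and the idea of slicing are mine, not from the thread) that uses the built-in file() and array_slice() to keep only the last N lines:

```php
<?php
// Sketch only: last_lines() is a hypothetical helper, not a PHP built-in.
// file() reads the whole file into an array of lines in one call;
// array_slice() with a negative offset keeps just the tail.
function last_lines($path, $n)
{
    $all = file($path, FILE_IGNORE_NEW_LINES);
    if ($all === false) {
        return array(); // unreadable file: caller reports "no lines read"
    }
    return array_slice($all, -$n);
}
```

Like the explode() approach, this still loads the entire file into memory first, so it hits the same memory_limit wall on big logs.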
--
www.krustov.co.uk
Jun 2 '08 #3
On Apr 19, 5:09 pm, tlp...@gmail.com wrote:
> I have a php page that reads the contents of a file and then displays
> the last XX lines of the file. Whenever the file gets larger than
> ~5MB, the page just displays nothing, as though a timeout has occurred
> but I get no error. [code snipped]
The problem is probably the php.ini configuration of
max_execution_time. I forget the default but it's about 30 seconds.
Try raising the value and see if it keeps executing. Though any
script which takes more than 30 seconds is probably not the best
solution either. Consider dumping the lines into a MySQL table and
searching that way.
Jun 2 '08 #4
tl****@gmail.com wrote:
> I have a php page that reads the contents of a file and then displays
> the last XX lines of the file. Whenever the file gets larger than
> ~5MB, the page just displays nothing, as though a timeout has occurred
> but I get no error. [code snipped]
Probably running out of memory in PHP...

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #5
On Apr 19, 5:28 pm, Iván Sánchez Ortega wrote:
> tlp...@gmail.com wrote:
> > I have a php page that reads the contents of a file and then
> > displays the last XX lines of the file.
>
> <?php
> passthru("tail -n $lines $file");
> ?>
First, thank you for offering suggestions. I'll research using tail,
but my initial problem with it is that it doesn't create an array
where I can display each line in a table row - though I may be doing
something wrong...

$myfile = "/var/log/maillog";
$arrLog[]=passthru("tail -n $int_lines $myfile");

$int_number_of_lines = count($myfile);
if ($int_number_of_lines == 0)
{
echo '<p><strong>No lines read.</strong></p>';
}
if ($int_number_of_lines < $int_lines)
{
$int_lines = $int_number_of_lines;
}
$int_firstline = $int_number_of_lines - $int_lines;
echo 'Showing the last '.$int_lines.' lines out of '.
$int_number_of_lines.'<BR />';
echo "<TABLE WIDTH=100% CLASS=\"mail\">\n";
for ($i=$int_firstline; $i<$int_number_of_lines; $i++)
{
echo "<TR><TD>".$arrLog[$i]."</TR></TD>\n";
}
echo "</TABLE>\n";
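For what it's worth, a hedged sketch of the missing piece (the helper name is mine, and it assumes exec() is permitted and the tail binary exists on the host): passthru() writes straight to the output buffer and returns nothing, which is why $arrLog stays empty above - exec() is the variant that fills an array.

```php
<?php
// Sketch only: tail_lines() is a hypothetical helper. Unlike passthru(),
// exec() appends each line of the command's output to an array.
function tail_lines($file, $n)
{
    $out = array();
    exec("tail -n " . (int)$n . " " . escapeshellarg($file), $out);
    return $out; // one array element per line, trailing newlines stripped
}

// e.g. $arrLog = tail_lines("/var/log/maillog", $int_lines);
```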
Jun 2 '08 #6
On Apr 19, 6:56 pm, venti <timgreg...@shieldinvestmentgroup.com>
wrote:
> The problem is probably due to the php.ini configuration of
> max_execution_time. I forget the default but it's about 30 seconds.
> Try jacking the value up and see if it keeps executing.
Thanks, but the page returns blank instantly...no time out.
Jun 2 '08 #7
tl****@gmail.com wrote:
> On Apr 19, 6:56 pm, venti wrote:
>> The problem is probably due to the php.ini configuration of
>> max_execution_time. Try jacking the value up and see if it keeps
>> executing.

Thanks, but the page returns blank instantly...no time out.
Enable errors and display them. You'll see your problem.

In your php.ini file, put:

error_reporting = E_ALL
display_errors = on
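If editing php.ini isn't convenient, the same two switches can usually be flipped at the top of the script itself (a sketch; note that display_errors set here can't rescue a fatal error that occurs before these lines run):

```php
<?php
// Sketch: enable full error reporting for this script only,
// equivalent in spirit to the php.ini lines above.
error_reporting(E_ALL);
ini_set('display_errors', '1');
```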

--
Jerry Stuckle

Jun 2 '08 #9
On Apr 20, 3:43 pm, Jerry Stuckle <jstuck...@attglobal.net> wrote:
> Enable errors and display them. You'll see your problem.
>
> In your php.ini file, put:
>
> error_reporting = E_ALL
> display_errors = on
Thanks...I see it's an error due to allocated memory, but can anyone
explain why I could read the file just fine, it grew by 45K, and then
I couldn't read it anymore? And what is the error telling me - 4097
bytes exhausted 8388608? Huh?

Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
allocate 4097 bytes) in /var/www/html/maillog2.php on line 39
Jun 2 '08 #10
tl****@gmail.com wrote:
> Thanks...I see it's an error due to allocated memory but can anyone
> explain why I could read the file just fine...it incremented by 45K
> and then I couldn't read it anymore? And from what I'm reading in the
> error, 4097 bytes exhausted 8388608? Huh?
>
> Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
> allocate 4097 bytes) in /var/www/html/maillog2.php on line 39
Well, 8,388,608 is 8M - which just happens to be the default memory
limit for PHP. You needed more memory than is available.
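To make the arithmetic concrete, a small sketch (the helper name is mine, not a PHP built-in) that converts php.ini shorthand like "8M" into bytes - which is exactly where the 8388608 in the error comes from:

```php
<?php
// Sketch: convert php.ini shorthand ("8M", "256K", "1G") to bytes.
// ini_bytes() is a hypothetical helper name, not part of PHP.
function ini_bytes($val)
{
    $val = trim($val);
    $unit = strtoupper(substr($val, -1));
    $num = (int)$val;
    switch ($unit) {
        case 'G': return $num * 1024 * 1024 * 1024;
        case 'M': return $num * 1024 * 1024;
        case 'K': return $num * 1024;
        default:  return $num;
    }
}

// ini_bytes('8M') gives 8388608 - the limit quoted in the fatal error.
```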

--
Jerry Stuckle

Jun 2 '08 #11
On Apr 20, 6:22 pm, tlp...@gmail.com wrote:
> Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
> allocate 4097 bytes) in /var/www/html/maillog2.php on line 39
Solved! I upped the max memory in php.ini from 8M to 16M.

Thanks for all the help.
Jun 2 '08 #12
tl****@gmail.com wrote:
Solved! I upped the max memory in php.ini from 8M to 16M.
That doesn't solve it - it just postpones it.

Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.
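A hedged sketch of that outline (the function name and default chunk size are mine; it fseek()s backwards reading fixed-size chunks until enough line breaks have been seen, so memory use is bounded by the size of the lines returned rather than the whole file):

```php
<?php
// Sketch only: tail_seek() is a hypothetical helper following the
// outline above - seek from the end, read chunks, count line breaks.
function tail_seek($path, $n, $chunk = 4096)
{
    $fp = fopen($path, 'rb');
    if (!$fp) {
        return array();
    }
    fseek($fp, 0, SEEK_END);
    $pos = ftell($fp);
    $data = '';
    // Walk backwards until we have more than $n newlines (or hit start)
    while ($pos > 0 && substr_count($data, "\n") <= $n) {
        $read = min($chunk, $pos);
        $pos -= $read;
        fseek($fp, $pos);
        $data = fread($fp, $read) . $data;
    }
    fclose($fp);
    $lines = explode("\n", rtrim($data, "\n"));
    return array_slice($lines, -$n);
}
```

This also answers the fopen() worry below: fopen() never loads the file, it only opens a stream, so the file's size is irrelevant until you read from it.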

--
Iván Sánchez Ortega

Jun 2 '08 #13
On Apr 20, 6:56 pm, Iván Sánchez Ortega wrote:
> That doesn't solve it - it just postpones it.
>
> Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to
> find the line breaks. No wasted memory.
Okay, the log is rotated nightly so it *may* not be a problem, but I'll
start reading up on those functions. Thanks for the assistance. But I
have a question: if I couldn't even read the 5MB file with fopen()
before I increased the PHP memory limit, how am I going to open it now
and then fseek() to find the end?
Jun 2 '08 #14
Iván Sánchez Ortega wrote:
tl****@gmail.com wrote:
>Solved! I upped the max memory in php.ini from 8M to 16M.

That doesn't solve it - it just postpones it.

Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.
Maybe, maybe not. It depends on how large the file gets.

Reading 4K at a time and trying to tack things together across chunks
also requires a much larger amount of CPU.

I often give PHP 32-128MB, depending on the system and what's required
for the site. Memory is cheaper than CPU cycles.

--
Jerry Stuckle

Jun 2 '08 #15
On 21 Apr, 02:09, Jerry Stuckle <jstuck...@attglobal.net> wrote:
> Reading 4K at a time and trying to tack things together across chunks
> also requires a much larger amount of CPU.
>
> I often give PHP 32-128MB, depending on the system and what's required
> for the site. Memory is cheaper than CPU cycles.
Certainly, the right way to do this would be to use tail. The next most
correct way would be to read backwards from the end of the file - but
this is non-trivial, since you would need to do a lot of fseeking and
would probably end up using too much CPU. Another way to solve the
problem would be to use a rotating buffer:

$fp = fopen($file, 'r');
$lines = 0;
$keep = 200;
$buffer = array();
while (($line = fgets($fp)) !== false) {
    $buffer[$lines % $keep] = $line;
    $lines++;
}
fclose($fp);
// Oldest surviving line sits at index ($lines % $keep) once we wrap
$start = ($lines > $keep) ? ($lines % $keep) : 0;
$count = min($lines, $keep);
for ($x = 0; $x < $count; $x++) {
    print $buffer[($start + $x) % $keep];
}

C.
Jun 2 '08 #16
C. (http://symcbean.blogspot.com/) wrote:
> Certainly, the right way to do this would be to use tail. The next
> most correct way would be to read backwards from the end of the file.
> Another way to solve the problem would be to use a rotating buffer.
Yes, tail is one way to do it. But it also means you must have the
privileges to exec tail. Many (most?) shared hosts don't allow this.

And multiple seeks, etc. are again a CPU hog. They can work if you have
an idea what the size of your lines is, but if, like many log files,
the lines vary considerably in length, it's much harder. And the code
is far more complicated.

I just find it much easier to get a decent amount of RAM allocated to
PHP and read the entire file in. It's not like you're talking 100MB or
anything.

But if you are, then you need to take further steps to break it up.

--
Jerry Stuckle

Jun 2 '08 #17
Jerry Stuckle wrote:
>> As compared to loading and looking through the entire file? I have to
>> remind you that the worst speed hog here is disk access. You *do* want
>> to avoid unnecessary disk access. And the only way to do that is by
>> searching for line breaks from the end of the file, using fseek().
>
> Nowadays, disk transfer is done by hardware - especially in servers.
> While the disk is seeking, etc., the CPU can be handling other requests.
> Only during the relatively short reads is the bus tied up for I/O. The
> joys of multiprocessing.

Last time I checked, disk I/O wasn't "relatively short" - it was two
orders of magnitude longer than memory I/O, as in milliseconds compared
to nanoseconds.

And yeah, OK, we've got multiprocessing and DMA, but that doesn't mean
other processes don't need that I/O time as well.

Again, I don't think it's a good idea to load the entire file into memory.

Cheers,
--
Iván Sánchez Ortega
Jun 2 '08 #18
Iván Sánchez Ortega wrote:
> Last time I checked, disk I/O wasn't "relatively short" - it was two
> orders of magnitude longer than memory I/O, as in milliseconds
> compared to nanoseconds.
Sure. But the CPU is doing other things at the time.
> And yeah, OK, we've got multiprocessing and DMA, but that doesn't mean
> other processes don't need that I/O time as well.
Nope. But there's a lot going on which doesn't require disk I/O. And in
most active (and properly configured) web servers, a good portion of
what is served comes from cache.

With your code it takes many more CPU cycles to accomplish the same
thing, during which time nothing else requiring CPU cycles can be processed.
> Again, I don't think it's a good idea to load the entire file in memory.
It's fine for you to disagree. I don't see a problem when you have a
file which will be known not to grow to 100 MB.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #19
On Tue, 22 Apr 2008 07:06:24 -0400, Jerry Stuckle wrote:
Iván Sánchez Ortega wrote:
>Again, I don't think it's a good idea to load the entire file in memory.

Cheers,

It's fine for you to disagree. I don't see a problem when you have a
file which will be known not to grow to 100 MB.
.... Or some other arbitrary size that won't cause your system to go into
swap death. This is hardly a disagreement, btw.

--
"HTML's a cheap whore. Treating her with respect is possible, and even
preferable, because once upon a time she was a beautiful and virginal
format, but you shouldn't expect too much of her at this point." --M"K"H
Jun 2 '08 #20
Peter H. Coffin wrote:
On Tue, 22 Apr 2008 07:06:24 -0400, Jerry Stuckle wrote:
>Iván Sánchez Ortega wrote:
>>Again, I don't think it's a good idea to load the entire file in memory.

Cheers,
It's fine for you to disagree. I don't see a problem when you have a
file which will be known not to grow to 100 MB.

... Or some other arbitrary size that won't cause your system to go into
swap death. This is hardly a disagreement, btw.
Sure, it's disagreement. I say it's only a 5MB file, and not likely to
grow to a huge size - go ahead and load it into memory. Iván says read
the file in chunks. Sounds like a disagreement to me :-)

If we were talking hundreds of megabytes, I'd have a different answer.

--
Jerry Stuckle

Jun 2 '08 #21
Jerry Stuckle wrote:
With your code it takes many more CPU cycles to accomplish the same
thing, during which time nothing else requiring CPU cycles can be
processed.
Pardon me?

Complexity of my algorithm (the same as GNU "tail") is O(n), where n is
the number of bytes that make up the desired lines at the end of the
file. Efficiency here depends on strrpos(), which has a complexity of O(n).

Complexity of your algorithm is O(m^2 * log(m)), where m is the total size
of the file. file() must check *every* character read to see if it's a line
break - that takes O(m). Then, you're doing count() - as PHP arrays are
hash tables (well, ordered maps), it is well known that traversing one
takes O(m*log(m)).

What is your basis for saying that parsing the entire file is more CPU
efficient than parsing the last lines starting from the end? Because the
way I see it, O(n) << O(m^2*log(m)).
Cheers,
--
Iván Sánchez Ortega
Jun 2 '08 #22
Iván Sánchez Ortega wrote:
> What is your basis for saying that parsing the entire file is more CPU
> efficient than parsing the last lines starting from the end? Because
> the way I see it, O(n) << O(m^2*log(m)).
tail is a compiled program. It is much more efficient than an
interpreted one.

And the only searching the program has to do is for the newline
character. Even in an interpreted language, that can be optimized to be
quite a fast operation.

As opposed to multiple calls to seek and read the file, doing your own
searching... Much more code to go through, and much more CPU intensive.

--
Jerry Stuckle

Jun 2 '08 #23
Jerry Stuckle wrote:
As opposed to multiple calls to seek and read the file, doing your own
searching... Much more code to go through and much more cpu intensive.
Your argument doesn't hold here, Jerry. The longer "seek and read" algorithm
has a complexity of O(n), whereas the "file() - count() - for()" has
O(n^2*log(n)).

It just doesn't hold.

--
Iván Sánchez Ortega
Jun 2 '08 #24
Iván Sánchez Ortega wrote:
Jerry Stuckle wrote:
>As opposed to multiple calls to seek and read the file, doing your own
searching... Much more code to go through and much more cpu intensive.

Your argument doesn't hold here, Jerry. The longer "seek and read" algorithm
has a complexity of O(n), whereas the "file() - count() - for()" has
O(n^2*log(n)).

It just doesn't hold.
You're assuming the path through the code is the same - or at least the
same length in cpu cycles. It isn't - not by a long shot.

Your argument is highly fallacious.

--
Jerry Stuckle

Jun 2 '08 #25
Jerry Stuckle wrote:
Iván Sánchez Ortega wrote:
>The longer "seek and read" algorithm has a complexity of O(n), whereas
the "file() - count() - for()" has O(n^2*log(n)).

Your argument is highly fallacious.
Would you please elaborate?

--
Iván Sánchez Ortega
Jun 2 '08 #26
Iván Sánchez Ortega wrote:
Jerry Stuckle wrote:
>Iván Sánchez Ortega wrote:
>>The longer "seek and read" algorithm has a complexity of O(n), whereas
the "file() - count() - for()" has O(n^2*log(n)).
Your argument is highly fallacious.

Would you please elaborate?
You assume either way takes the same number of CPU cycles. file() is a
single call to fetch the entire file. Searching for the newline
characters is also very fast in CPU cycles (it can be highly optimized
in machine code). fopen(), then multiple fseek(), fread() and searching
yourself for the newline characters, followed by fclose(), is much more
CPU intensive. This is true in a compiled language also, but in an
interpreted language the difference is even greater.

--
Jerry Stuckle

Jun 2 '08 #27
On Tue, 22 Apr 2008 21:40:01 -0400, Jerry Stuckle wrote:

[snip]
Don't even try to compare performance in a compiled language vs. an
interpreted one. It's comparing apples and oranges.
Ever heard of perl?

--
Tayo'y Mga Pinoy
Jun 2 '08 #28
Baho Utot wrote:
Ever heard of perl?
Isn't she a singer?
--
www.krustov.co.uk
Jun 2 '08 #29

This discussion thread is closed

Replies have been disabled for this discussion.
