PHP Read Text File

Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. I have a PHP page that reads the contents of a file and then
displays the last XX lines of the file. Problem is this... whenever
the file gets larger than ~5MB, the page just displays nothing, as
though a timeout has occurred, but I get no error. At 4.8MB (last
confirmed size) the function still works. Any ideas what the code below
is lacking?

<?php
// $int_lines (how many trailing lines to show) is assumed to be set
// before this point, e.g. from a form value.
$arrLog = array();
$handle = fopen("/var/log/myfile", "r");
if ($handle) {
    while (!feof($handle)) {
        $arrLog[] = fgets($handle, 4096);
    }
    fclose($handle);
}

$int_number_of_lines = count($arrLog);
if ($int_number_of_lines == 0)
{
    echo '<p><strong>No lines read.</strong></p>';
}
if ($int_number_of_lines < $int_lines)
{
    $int_lines = $int_number_of_lines;
}
$int_firstline = $int_number_of_lines - $int_lines;
echo 'Showing the last '.$int_lines.' lines out of '.
    $int_number_of_lines.'<BR />';
echo "<TABLE WIDTH=100% CLASS=\"mail\">\n";
for ($i = $int_firstline; $i < $int_number_of_lines; $i++)
{
    echo "<TR><TD>".$arrLog[$i]."</TD></TR>\n";
}
echo "</TABLE>\n";
?>
Jun 2 '08 #1
tl****@gmail.com wrote:
Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. I have a php page that reads the contents of a file and then
displays the last XX lines of the file.
<?php
passthru("tail -n $lines $file");
?>
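If $lines or $file can come from user input, quoting them before they
reach the shell is worth the extra characters; a variant of the same
one-liner (escapeshellarg() is standard PHP, the variable names are the
ones above):

<?php
// cast the line count to int and shell-quote the filename
passthru('tail -n ' . (int)$lines . ' ' . escapeshellarg($file));
?>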

Cheers,
--
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Proudly running Debian Linux with 2.6.24-1-amd64 kernel, KDE 3.5.9, and PHP
5.2.5-3 generating this signature.
Uptime: 00:28:14 up 10 days, 15:27, 2 users, load average: 0.89, 0.35,
0.18

Jun 2 '08 #2
<comp.lang.php>
<>
<Sat, 19 Apr 2008 14:09:39 -0700 (PDT)>
<ce**********************************@a70g2000hsh.googlegroups.com>
Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. [...]
$build="/var/log/myfile";
$contents=file_get_contents($build);
$demo=explode("\n",$contents);

Something like the above might be worth a try.
For the benefit of any newbies.....

The above reads in the file in one go, and via the explode command $demo
in effect becomes $demo[0], $demo[1], $demo[2], etc. - the advantage
being that the above code is much faster than reading in the text file
one line at a time.
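To show only the last XX lines from such an array, array_slice() with a
negative offset picks them off the end; a minimal sketch ($int_lines,
the number of lines wanted, is borrowed from the original post, and
htmlspecialchars() is an addition). Note this still pulls the whole file
into memory, so it hits the same limit as reading line by line:

<?php
$contents = file_get_contents("/var/log/myfile");
$demo = explode("\n", $contents);
// a negative offset makes array_slice() count back from the end
$last = array_slice($demo, -$int_lines);
foreach ($last as $line) {
    echo "<TR><TD>" . htmlspecialchars($line) . "</TD></TR>\n";
}
?>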
--
www.krustov.co.uk
Jun 2 '08 #3
On Apr 19, 5:09 pm, tlp...@gmail.com wrote:
Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. [...]
[quoted code snipped]
The problem is probably due to the php.ini configuration of
max_execution_time. I forget the default but it's about 30 seconds.
Try jacking the value up and see if it keeps executing. Though any
script which takes more than 30 seconds is probably not the best
solution either. Consider dumping the lines into a mysql table and
searching that way.
Jun 2 '08 #4
tl****@gmail.com wrote:
Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. [...]
[quoted code snipped]
Probably running out of memory in PHP...

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #5
On Apr 19, 5:28 pm, Iván Sánchez Ortega <ivansanchez-...@rroba-
escomposlinux.-.punto.-.org> wrote:
tlp...@gmail.com wrote:
Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. I have a php page that reads the contents of a file and then
displays the last XX lines of the file.

<?php
passthru("tail -n $lines $file");
?>

First, thank you for offering suggestions. I'll research using tail,
but my initial problem with it is that it doesn't create an array
where I can display each line on a table row - but I may be doing
something wrong...

$myfile = "/var/log/maillog";
$arrLog[]=passthru("tail -n $int_lines $myfile");

$int_number_of_lines = count($myfile);
if ($int_number_of_lines == 0)
{
echo '<p><strong>No lines read.</strong></p>';
}
if ($int_number_of_lines < $int_lines)
{
$int_lines = $int_number_of_lines;
}
$int_firstline = $int_number_of_lines - $int_lines;
echo 'Showing the last '.$int_lines.' lines out of '.
$int_number_of_lines.'<BR />';
echo "<TABLE WIDTH=100% CLASS=\"mail\">\n";
for ($i=$int_firstline; $i<$int_number_of_lines; $i++)
{
echo "<TR><TD>".$arrLog[$i]."</TR></TD>\n";
}
echo "</TABLE>\n";
Jun 2 '08 #6
On Apr 19, 6:56 pm, venti <timgreg...@shieldinvestmentgroup.com>
wrote:
On Apr 19, 5:09 pm, tlp...@gmail.com wrote:
Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. [...]
[quoted code snipped]

The problem is probably due to the php.ini configuration of
max_execution_time. I forget the default but it's about 30 seconds.
Try jacking the value up and see if it keeps executing. Though any
script which takes more than 30 seconds is probably not the best
solution either. Consider dumping the lines into a mysql table and
searching that way.
Thanks, but the page returns blank instantly...no time out.
Jun 2 '08 #7
tl****@gmail.com wrote:
On Apr 19, 6:56 pm, venti <timgreg...@shieldinvestmentgroup.com>
wrote:
>On Apr 19, 5:09 pm, tlp...@gmail.com wrote:
>>Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. [...]
[quoted code snipped]
The problem is probably due to the php.ini configuration of
max_execution_time. I forget the default but it's about 30 seconds.
Try jacking the value up and see if it keeps executing. Though any
script which takes more than 30 seconds is probably not the best
solution either. Consider dumping the lines into a mysql table and
searching that way.

Thanks, but the page returns blank instantly...no time out.
Enable errors and display them. You'll see your problem.

In your php.ini file, put:

error_reporting = E_ALL
display_errors = on

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #9
On Apr 20, 3:43 pm, Jerry Stuckle <jstuck...@attglobal.net> wrote:
[earlier quotes snipped]
Thanks, but the page returns blank instantly...no time out.

Enable errors and display them. You'll see your problem.

In your php.ini file, put:

error_reporting = E_ALL
display_errors = on

Thanks...I see it's an error due to allocated memory but can anyone
explain why I could read the file just fine...it incremented by 45K
and then I couldn't read it anymore? And from what I'm reading in the
error, 4097 bytes exhausted 8388608? Huh?

Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
allocate 4097 bytes) in /var/www/html/maillog2.php on line 39
Jun 2 '08 #10
tl****@gmail.com wrote:
[earlier quotes snipped]

Thanks...I see it's an error due to allocated memory but can anyone
explain why I could read the file just fine...it incremented by 45K
and then I couldn't read it anymore? And from what I'm reading in the
error, 4097 bytes exhausted 8388608? Huh?

Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
allocate 4097 bytes) in /var/www/html/maillog2.php on line 39
Well, 8,388,608 bytes is 8M - which just happens to be the default memory
limit for PHP. You needed more memory than was available.
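(On top of the raw file bytes, each array element carries PHP's own
per-string bookkeeping, so a ~5MB file split into tens of thousands of
lines can easily need more than 8M while 4.8MB still squeaks through.)
For reference, the limit can be inspected at runtime and, where the host
permits, raised per script instead of globally in php.ini; a minimal
sketch:

<?php
echo ini_get('memory_limit');   // "8M" here, per the fatal error above
ini_set('memory_limit', '16M'); // per-script override, if the host allows it
?>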

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #11
On Apr 20, 6:22 pm, tlp...@gmail.com wrote:
[earlier quotes snipped]

Thanks...I see it's an error due to allocated memory but can anyone
explain why I could read the file just fine...it incremented by 45K
and then I couldn't read it anymore? And from what I'm reading in the
error, 4097 bytes exhausted 8388608? Huh?

Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
allocate 4097 bytes) in /var/www/html/maillog2.php on line 39
Solved! I upped the max memory in php.ini from 8M to 16M.

Thanks for all the help.
Jun 2 '08 #12
tl****@gmail.com wrote:
Solved! I upped the max memory in php.ini from 8M to 16M.
That doesn't solve it - it just postpones it.

Open the file, fseek() to the end, read 4KB chunks, then strrchr() to find
the line breaks. No wasted memory.
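For anyone who wants to try that, here is a rough sketch of the idea;
it counts newlines per chunk with substr_count() instead of scanning
with strrchr(), but the approach is the same. tail_lines(), $path,
$want and $chunk are illustrative names, not anything from the thread:

<?php
function tail_lines($path, $want, $chunk = 4096)
{
    $fp = fopen($path, 'r');
    if (!$fp) {
        return array();
    }
    fseek($fp, 0, SEEK_END);
    $pos  = ftell($fp);
    $data = '';
    // prepend chunks until the buffer holds more newlines than needed
    while ($pos > 0 && substr_count($data, "\n") <= $want) {
        $read = min($chunk, $pos);
        $pos -= $read;
        fseek($fp, $pos);
        $data = fread($fp, $read) . $data;
    }
    fclose($fp);
    // keep only the last $want complete lines
    $arr = explode("\n", rtrim($data, "\n"));
    return array_slice($arr, -$want);
}
?>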

--
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Proudly running Debian Linux with 2.6.24-1-amd64 kernel, KDE 3.5.9, and PHP
5.2.5-3 generating this signature.
Uptime: 01:53:55 up 11 days, 16:52, 2 users, load average: 0.34, 0.37,
0.36

Jun 2 '08 #13
On Apr 20, 6:56 pm, Iván Sánchez Ortega <ivansanchez-...@rroba-
escomposlinux.-.punto.-.org> wrote:
tlp...@gmail.com wrote:
Solved! I upped the max memory in php.ini from 8M to 16M.

That doesn't solve it - it just postpones it.

Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.

Okay, the log is rotated nightly so it *may* not be a problem, but I'll
start reading up on those functions. Thanks for the assistance. But I
have a question: if I couldn't even fopen() the 5MB file before I
increased the PHP max size, how am I going to open it now and then do
an fseek() to find the end?
Jun 2 '08 #14
Iván Sánchez Ortega wrote:
tl****@gmail.com wrote:
>Solved! I upped the max memory in php.ini from 8M to 16M.

That doesn't solve it - it just postpones it.

Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.
Maybe, maybe not. It depends on how large the file gets.

Reading 4K at a time and trying to tack things together across chunks
also requires a much larger amount of CPU.

I often give PHP 32-128MB, depending on the system and what's required
for the site. Memory is cheaper than CPU cycles.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #15
On 21 Apr, 02:09, Jerry Stuckle <jstuck...@attglobal.net> wrote:
Iván Sánchez Ortega wrote:
tlp...@gmail.com wrote:
Solved! I upped the max memory in php.ini from 8M to 16M.
That doesn't solve it - it just postpones it.
Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.

Maybe, maybe not. It depends on how large the file gets.

Reading 4K at a time and trying to tack things together across chunks is
also requires a much larger amount of CPU.

I often give PHP 32-128MB, depending on the system and what's required
for the site. Memory is cheaper than CPU cycles.
Certainly, the right way to do this would be to use tail. The next most
correct way would be to read backwards from the end of the file - but
this is non-trivial, since you would need to do a lot of fseeking and
would probably end up using too much CPU. Another way to solve the
problem would be to use a rotating buffer:

$fp = fopen('/var/log/myfile', 'r');
$lines = 0;
$keep = 200;
$buffer = array();
while (($line = fgets($fp)) !== false) {
    // the modulo index overwrites the oldest slot, so memory stays
    // bounded at $keep lines however big the file grows
    $buffer[$lines % $keep] = $line;
    $lines++;
}
fclose($fp);
// replay the retained lines in their original order
$first = max(0, $lines - $keep);
for ($x = $first; $x < $lines; $x++) {
    print $buffer[$x % $keep];
}

C.
Jun 2 '08 #16
C. (http://symcbean.blogspot.com/) wrote:
On 21 Apr, 02:09, Jerry Stuckle <jstuck...@attglobal.net> wrote:
>Iván Sánchez Ortega wrote:
>>tlp...@gmail.com wrote:
Solved! I upped the max memory in php.ini from 8M to 16M.
That doesn't solve it - it just postpones it.
Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.
Maybe, maybe not. It depends on how large the file gets.

Reading 4K at a time and trying to tack things together across chunks is
also requires a much larger amount of CPU.

I often give PHP 32-128MB, depending on the system and what's required
for the site. Memory is cheaper than CPU cycles.

Certainly, the right way to this would be to use tail. The next more
correct way would be to read backwards from the end of the file - but
this is non-trivial since you would need to a lot of fseeking and
would probably end up using too much CPU. Another way to solve the
problem would be to use a rotating buffer:

[quoted code snipped]

C.
Yes, tail is one way to do it. But it also means you must have the
privileges to exec tail. Many (most?) shared hosts don't allow this.

And multiple seeks, etc. are again a CPU hog. They can work if you have
an idea of what size your lines are, but if, like many log files,
the lines vary considerably in length, it's much harder. And the code
is far more complicated.

I just find it much easier to get a decent amount of RAM allocated to
PHP and read the entire file in. It's not like you're talking 100MB or
anything.

But if you are, then you need to take further steps to break it up.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #17
Jerry Stuckle wrote:
>>As compared to loading and looking through the entire file? I have to
>>remind you that the worst speed hog here is disk access. You *do* want to
>>avoid unnecessary disk access. And the only way to do that is by
>>searching line breaks from the end of the file, using fseek().
>
>Nowadays, disk transfer is done by hardware - especially in servers.
>While the disk is seeking, etc., the CPU can be handling other requests.
>Only during the relatively short reads is the bus tied up for I/O. The
>joys of multiprocessing.
Last time I checked, disk I/O wasn't "relatively short" - it was two orders
of magnitude longer than memory I/O, as in milliseconds compared to
nanoseconds.

And yeah, OK, we got multiprocessing and DMA, but it doesn't mean that any
other processes don't need that I/O time as well.

Again, I don't think it's a good idea to load the entire file in memory.

Cheers,
--
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Photographers do it with the lights off.
Jun 2 '08 #18
Iván Sánchez Ortega wrote:
Jerry Stuckle wrote:
>>As compared to loading and looking through the entire file? I have to
remind you that the worst speed hog here is disk access. You *do* want to
avoid unnecessary disk access. And the only way to do that is by
searching line breaks from the end of the file, using fseek().
Nowadays, disk transfer is done by hardware - especially in servers.
While the disk is seeking, etc., the CPU can be handling other requests.
Only during the relatively short reads is the bus tied up for I/O. The
joys of multiprocessing.

Last time I checked, disk I/O wasn't "relatively short" - it was two orders
of magnitude longer than memory I/O, as in miliseconds compared to
nanoseconds.
Sure. But the CPU is doing other things at the time.
And yeah, OK, we got multiprocessing and DMA, but it doesn't mean that any
other processes don't need that I/O time as well.
Nope. But there's a lot going on which doesn't require disk I/O. And in
most active (and properly configured) web servers, a good portion of
what is served comes from cache.

With your code it takes many more CPU cycles to accomplish the same
thing, during which time nothing else requiring CPU cycles can be processed.
Again, I don't think it's a good idea to load the entire file in memory.

Cheers,
It's fine for you to disagree. I don't see a problem when you have a
file which will be known not to grow to 100 MB.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #19
On Tue, 22 Apr 2008 07:06:24 -0400, Jerry Stuckle wrote:
Iván Sánchez Ortega wrote:
>Again, I don't think it's a good idea to load the entire file in memory.

Cheers,

It's fine for you to disagree. I don't see a problem when you have a
file which will be known not to grow to 100 MB.
.... Or some other arbitrary size that won't cause your system to go into
swap death. This is hardly a disagreement, btw.

--
"HTML's a cheap whore. Treating her with respect is possible, and even
preferable, because once upon a time she was a beautiful and virginal
format, but you shouldn't expect too much of her at this point." --M"K"H
Jun 2 '08 #20
Peter H. Coffin wrote:
On Tue, 22 Apr 2008 07:06:24 -0400, Jerry Stuckle wrote:
>Iván Sánchez Ortega wrote:
>>Again, I don't think it's a good idea to load the entire file in memory.

Cheers,
It's fine for you to disagree. I don't see a problem when you have a
file which will be known not to grow to 100 MB.

... Or some other arbitrary size that won't cause your system to go into
swap death. This is hardly a disagreement, btw.
Sure, it's disagreement. I say it's only a 5MB file, and not likely to
grow to a huge size - go ahead and load it into memory. Iván says read
the file in chunks. Sounds like a disagreement to me :-)

If we were talking hundreds of megabytes, I'd have a different answer.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #21
Jerry Stuckle wrote:
With your code it takes many more CPU cycles to accomplish the same
thing, during which time nothing else requiring CPU cycles can be
processed.
Pardon me?

Complexity of my algorithm (the same as GNU "tail") is O(n), where n is the
number of bytes that make up the desired lines at the end of the file.
Efficiency here depends on strrpos(), which has a complexity of O(n).

Complexity of your algorithm is O(m^2 * log(m)), where m is the total size
of the file. file() must check *every* character read to see if it's a line
break - that takes O(m). Then, you're doing count() - as PHP arrays are
hash tables (well, ordered maps), it is well known that traversing one
takes O(m*log(m)).

What is your basis to say that parsing the entire file is more CPU efficient
than parsing the last lines starting from the end? Because the way I see
it, O(n) << O(m^2*log(m)).
Cheers,
--
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

A computer is not a television or a microwave; it is a complex tool.
Jun 2 '08 #22
Iván Sánchez Ortega wrote:
Jerry Stuckle wrote:
>With your code it takes many more CPU cycles to accomplish the same
thing, during which time nothing else requiring CPU cycles can be
processed.

Pardon me?

Complexity of my algorithm (the same as GNU "tail") is O(n), where n is the
number of bytes that make up the desired lines at the end of the file.
Efficiency here depends on strrpos(), which has an complexity of O(n).

Complexity of your algorithm is O(m^2 * log(m)), where m is the total size
of the file. file() must check *every* character read to see if it's a line
break - that takes O(m). Then, you're doing count() - as PHP arrays are
hash tables (well, ordered maps), it is well known that transversing it
takes O(m*log(m)).

What is your basis to say that parsing the entire file is more CPU efficient
than parsing the last lines starting from the end? Because the way I see
it, O(n) << O(m^2*log(m)).
Cheers,
tail is a compiled program. It is much more efficient than an
interpreted one.

And the only searching the program has to do is for the newline
character. Even in an interpreted language, that can be optimized to be
quite a fast operation.

As opposed to multiple calls to seek and read the file, doing your own
searching... Much more code to go through and much more cpu intensive.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #23
Jerry Stuckle wrote:
As opposed to multiple calls to seek and read the file, doing your own
searching... Much more code to go through and much more cpu intensive.
Your argument doesn't hold here, Jerry. The longer "seek and read" algorithm
has a complexity of O(n), whereas the "file() - count() - for()" has
O(n^2*log(n)).

It just doesn't hold.

--
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

MSN:i_*************************@hotmail.com
Jabber:iv*********@jabber.org ; iv*********@kdetalk.net
Jun 2 '08 #24
Iván Sánchez Ortega wrote:
Jerry Stuckle wrote:
>As opposed to multiple calls to seek and read the file, doing your own
searching... Much more code to go through and much more cpu intensive.

Your argument doesn't hold here, Jerry. The longer "seek and read" algorithm
has a complexity of O(n), whereas the "file() - count() - for()" has
O(n^2*log(n)).

It just doesn't hold.
You're assuming the path through the code is the same - or at least the
same length in cpu cycles. It isn't - not by a long shot.

Your argument is highly fallacious.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #25
Jerry Stuckle wrote:
Iván Sánchez Ortega wrote:
>The longer "seek and read" algorithm has a complexity of O(n), whereas
the "file() - count() - for()" has O(n^2*log(n)).

Your argument is highly fallacious.
Would you please elaborate?

--
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Q: How does a hacker fix a function which
doesn't work for all of the elements in its domain?
A: He changes the domain.

Jun 2 '08 #26
Iván Sánchez Ortega wrote:
Jerry Stuckle wrote:
>Iván Sánchez Ortega wrote:
>>The longer "seek and read" algorithm has a complexity of O(n), whereas
the "file() - count() - for()" has O(n^2*log(n)).
Your argument is highly fallacious.

Would you please elaborate?
You assume either way takes the same number of cpu cycles. file() is a
single call to fetch the entire file. Searching for the new line
characters is also very fast, in cpu cycles (it can be highly optimized
in machine code). fopen(), then multiple fseek(), fread() and searching
yourself for the newline characters, followed by fclose() is much more
cpu intensive. This is true in a compiled language also, but in an
interpreted language the difference is even greater.

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
js*******@attglobal.net
==================

Jun 2 '08 #27
On Tue, 22 Apr 2008 21:40:01 -0400, Jerry Stuckle wrote:

[putolin]
Don't even try to compare performance in a compiled language vs. an
interpreted one. It's comparing apples and oranges.
Ever heard of perl?

--
Tayo'y Mga Pinoy
Jun 2 '08 #28
<comp.lang.php>
<Baho Utot>
<Wed, 23 Apr 2008 16:45:11 -0400>
<pa*********************@bildanet.com>
Ever heard of perl?
Isn't she a singer?
--
www.krustov.co.uk
Jun 2 '08 #29
