Hey, I read some tips/pointers on PHP.net but can't seem to solve this
problem. I have a PHP page that reads the contents of a file and then
displays the last XX lines of the file. Problem is this: whenever
the file gets larger than ~5MB, the page just displays nothing, as
though a timeout has occurred, but I get no error. At 4.8MB (last
confirmed size) the function still works. Any ideas what the code below
is lacking?
<?php
// $int_lines = number of lines to display (set elsewhere on the page)
$arrLog = array();
$handle = fopen("/var/log/myfile", "r");
if ($handle) {
    while (!feof($handle)) {
        $arrLog[] = fgets($handle, 4096);
    }
    fclose($handle);
}
$int_number_of_lines = count($arrLog);
if ($int_number_of_lines == 0)
{
    echo '<p><strong>No lines read.</strong></p>';
}
if ($int_number_of_lines < $int_lines)
{
    $int_lines = $int_number_of_lines;
}
$int_firstline = $int_number_of_lines - $int_lines;
echo 'Showing the last '.$int_lines.' lines out of '.
$int_number_of_lines.'<BR />';
echo "<TABLE WIDTH=100% CLASS=\"mail\">\n";
for ($i=$int_firstline; $i<$int_number_of_lines; $i++)
{
    echo "<TR><TD>".$arrLog[$i]."</TD></TR>\n";
}
echo "</TABLE>\n";
?>

tl****@gmail.com wrote:
Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. I have a php page that reads the contents of a file and then
displays the last XX lines of the file.
<?php
passthru("tail -n $lines $file");
?>
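One caveat with the one-liner above: if `$lines` or `$file` can come from request parameters, they should not reach the shell unescaped. A hedged sketch (the function name is mine, not from the thread), using the standard escapeshellarg():

```php
<?php
// Hardened variant of the passthru() one-liner above. passthru()
// streams tail's stdout straight to the page; the cast and
// escapeshellarg() keep user-supplied values out of the shell.
function safe_tail($file, $lines) {
    passthru('tail -n ' . (int) $lines . ' ' . escapeshellarg($file));
}
```

Usage would be e.g. `safe_tail('/var/log/myfile', 50);`, assuming the web server user is allowed to exec tail at all (a point raised later in the thread).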
Cheers,
--
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-
Proudly running Debian Linux with 2.6.24-1-amd64 kernel, KDE 3.5.9, and PHP
5.2.5-3 generating this signature.
Uptime: 00:28:14 up 10 days, 15:27, 2 users, load average: 0.89, 0.35, 0.18
<comp.lang.php>
<Sat, 19 Apr 2008 14:09:39 -0700 (PDT)>
<ce**********************************@a70g2000hsh.googlegroups.com>
>[snip: original question about displaying the last XX lines of a large file]
$build="/var/log/myfile";
$contents=file_get_contents($build);
$demo=explode("\n",$contents);
Something like the above might be worth a try.
For the benefit of any newbies: the above reads the file in one go, and
explode() in effect turns $demo into $demo[0], $demo[1], $demo[2], etc.
The advantage is that this is much faster than reading in the text
file one line at a time.
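Building on that suggestion, a sketch that returns just the last $n lines of the exploded array (the function name is mine, not from the thread). Note the caveat that dominates the rest of this thread: the whole file is still loaded into memory, so the memory_limit problem applies just the same:

```php
<?php
// Sketch: last $n lines via file_get_contents() + explode() + array_slice().
// The entire file is still read into memory, so the 8M memory_limit
// issue discussed later in this thread is not avoided.
function last_lines_in_memory($path, $n) {
    $text  = rtrim(file_get_contents($path), "\n");
    $lines = ($text === '') ? array() : explode("\n", $text);
    return array_slice($lines, -$n);
}
```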
-- www.krustov.co.uk
On Apr 19, 5:09 pm, tlp...@gmail.com wrote:
>[snip: original question and code]
The problem is probably due to the php.ini setting
max_execution_time. I forget the default, but it's about 30 seconds.
Try raising the value and see if it keeps executing. Though any
script which takes more than 30 seconds is probably not the best
solution either. Consider dumping the lines into a MySQL table and
searching that way.

tl****@gmail.com wrote:
>[snip: original question and code]
Probably running out of memory in PHP...
--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp. js*******@attglobal.net
==================
On Apr 19, 5:28 pm, Iván Sánchez Ortega <ivansanchez-...@rroba-escomposlinux.-.punto.-.org> wrote:
>[snip: quoted suggestion to use passthru("tail -n $lines $file")]
First, thank you for offering suggestions. I'll research using tail,
but my initial problem with it is that it doesn't create an array
where I can display each line in a table row. I may be doing
something wrong, though...
$myfile = "/var/log/maillog";
$arrLog[]=passthru("tail -n $int_lines $myfile");
$int_number_of_lines = count($myfile);
if ($int_number_of_lines == 0)
{
echo '<p><strong>No lines read.</strong></p>';
}
if ($int_number_of_lines < $int_lines)
{
$int_lines = $int_number_of_lines;
}
$int_firstline = $int_number_of_lines - $int_lines;
echo 'Showing the last '.$int_lines.' lines out of '.
$int_number_of_lines.'<BR />';
echo "<TABLE WIDTH=100% CLASS=\"mail\">\n";
for ($i=$int_firstline; $i<$int_number_of_lines; $i++)
{
echo "<TR><TD>".$arrLog[$i]."</TR></TD>\n";
}
echo "</TABLE>\n";
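For what it's worth, the attempt above can't work as written: passthru() writes the command's output straight to the page and doesn't return the text, and count() on a string path doesn't count lines. exec(), by contrast, fills its second argument with one array entry per output line, so a working variant of what the poster seems to be after might look like this (the function name is mine, not from the thread):

```php
<?php
// Sketch: exec() fills $arrLog with one entry per line of tail's
// output (each entry comes back without its trailing newline),
// which is what the table-row loop above needs.
function tail_to_array($myfile, $int_lines) {
    $arrLog = array();
    exec('tail -n ' . (int) $int_lines . ' ' . escapeshellarg($myfile), $arrLog);
    return $arrLog;
}
```

Each returned entry can then be echoed into a table row exactly as in the loop above.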
On Apr 19, 6:56 pm, venti <timgreg...@shieldinvestmentgroup.com>
wrote:
>[snip: quoted question, code, and max_execution_time suggestion]
Thanks, but the page returns blank instantly...no time out.
tl****@gmail.com wrote:
>[snip: quoted exchange about max_execution_time]
>Thanks, but the page returns blank instantly...no time out.
Enable errors and display them. You'll see your problem.
In your php.ini file, put:
error_reporting = E_ALL
display_errors = on
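When php.ini can't be edited (e.g. on shared hosting), the same settings can be flipped at runtime for a single script. A sketch of the equivalent, to be placed at the very top of the page, before anything that might fail:

```php
<?php
// Runtime equivalent of the php.ini settings above: report all
// errors and print them to the page instead of failing silently.
error_reporting(E_ALL);
ini_set('display_errors', '1');
```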
--
Jerry Stuckle
On Apr 20, 3:43 pm, Jerry Stuckle <jstuck...@attglobal.net> wrote:
>[snip: quoted advice to enable error_reporting and display_errors]
Thanks...I see it's an error due to allocated memory, but can anyone
explain why I could read the file just fine, it grew by 45K, and then
I couldn't read it anymore? And what does the error mean: 4097 bytes
exhausted 8388608? Huh?

Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
allocate 4097 bytes) in /var/www/html/maillog2.php on line 39

tl****@gmail.com wrote:
>[snip: quoted memory-exhaustion error report]
Well, 8,388,608 is 8M - which just happens to be the default memory
limit for PHP. You needed more memory than is available.
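A quick sanity check of the numbers in the error message (the interpretation of 4097 is my reading, not stated in the thread): 8,388,608 bytes is exactly the default 8M memory_limit, and 4097 is plausibly the 4096-byte fgets() buffer plus a terminator, i.e. the small allocation that finally tipped the script over the limit:

```php
<?php
// The error message's numbers, checked: 8M memory_limit in bytes,
// and the fgets(..., 4096) buffer plus one byte.
assert(8 * 1024 * 1024 === 8388608);
assert(4096 + 1 === 4097);
```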
--
Jerry Stuckle
On Apr 20, 6:22 pm, tlp...@gmail.com wrote:
>[snip: quoted thread and memory-exhaustion error report]
Solved! I upped the max memory in php.ini from 8M to 16M.
Thanks for all the help.

tl****@gmail.com wrote:
Solved! I upped the max memory in php.ini from 8M to 16M.
That doesn't solve it - it just postpones it.
Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.
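A sketch of the approach described here (function name and details are mine, not from the thread): seek to the end, pull fixed-size chunks backwards, and stop as soon as enough line breaks have been seen. Memory use is then bounded by the size of the lines returned, not by the size of the file:

```php
<?php
// Read the last $n lines of a file by walking backwards in
// $chunk-byte pieces from the end, stopping once enough newlines
// have accumulated. Only the tail of the file is ever in memory.
function tail_file($path, $n, $chunk = 4096) {
    $fp = fopen($path, 'r');
    if (!$fp) return array();
    fseek($fp, 0, SEEK_END);
    $pos  = ftell($fp);   // bytes left in front of what we've read
    $data = '';
    while ($pos > 0 && substr_count($data, "\n") <= $n) {
        $read = min($chunk, $pos);
        $pos -= $read;
        fseek($fp, $pos, SEEK_SET);
        $data = fread($fp, $read) . $data;  // prepend the new chunk
    }
    fclose($fp);
    $lines = explode("\n", rtrim($data, "\n"));
    return array_slice($lines, -$n);
}
```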
--
Iván Sánchez Ortega
On Apr 20, 6:56 pm, Iván Sánchez Ortega <ivansanchez-...@rroba-escomposlinux.-.punto.-.org> wrote:
>Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
>the line breaks. No wasted memory.
Okay, the log is rotated nightly so it *may* not be a problem, but I'll
start reading up on those functions. Thanks for the assistance. But I
have a question: if I couldn't even fopen() the 5MB file before I
increased the PHP memory limit, how am I going to open it now and then
do an fseek() to find the end?
Iván Sánchez Ortega wrote:
tl****@gmail.com wrote:
>Solved! I upped the max memory in php.ini from 8M to 16M.
That doesn't solve it - it just postpones it.
Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.
Maybe, maybe not. It depends on how large the file gets.

Reading 4K at a time and tacking things together across chunks also
requires much more CPU.

I often give PHP 32-128MB, depending on the system and what's required
for the site. Memory is cheaper than CPU cycles.
--
Jerry Stuckle
On 21 Apr, 02:09, Jerry Stuckle <jstuck...@attglobal.net> wrote:
>[snip: memory vs. CPU trade-off discussion]
Certainly, the right way to do this would be to use tail. The next most
correct way would be to read backwards from the end of the file, but
this is non-trivial since you would need to do a lot of fseeking and
would probably end up using too much CPU. Another way to solve the
problem would be to use a rotating buffer:
$fp = fopen("/var/log/myfile", "r");
$keep = 200;      // number of lines to retain
$lines = 0;       // total lines seen
$buffer = array();
while (($line = fgets($fp)) !== false) {
    $buffer[$lines % $keep] = $line;
    $lines++;
}
fclose($fp);
// Replay the buffer oldest-first; it holds at most $keep lines.
$total = min($lines, $keep);
for ($i = $lines - $total; $i < $lines; $i++) {
    print $buffer[$i % $keep];
}
C.
C. ( http://symcbean.blogspot.com/) wrote:
>[snip: quoted rotating-buffer suggestion]
Yes, tail is one way to do it. But it also means you must have the
privileges to exec tail, and many (most?) shared hosts don't allow this.

And multiple seeks, etc., are again a CPU hog. They can work if you have
an idea of how long your lines are, but if, like many log files, the
lines vary considerably in length, it's much harder, and the code is
far more complicated.

I just find it much easier to get a decent amount of RAM allocated to
PHP and read the entire file in. It's not like you're talking 100MB or
anything. But if you are, then you need to take further steps to break
it up.
--
Jerry Stuckle
Jerry Stuckle wrote:
>As compared to loading and looking through the entire file? I have to
>remind you that the worst speed hog here is disk access. You *do* want
>to avoid unnecessary disk access. And the only way to do that is by
>searching for line breaks from the end of the file, using fseek().
Nowadays, disk transfer is done by hardware - especially in servers.
While the disk is seeking, etc., the CPU can be handling other requests.
Only during the relatively short reads is the bus tied up for I/O. The
joys of multiprocessing.
Last time I checked, disk I/O wasn't "relatively short" - it was two orders
of magnitude longer than memory I/O, as in milliseconds compared to
nanoseconds.

And yeah, OK, we've got multiprocessing and DMA, but that doesn't mean
other processes don't need that I/O time as well.

Again, I don't think it's a good idea to load the entire file into memory.
Cheers,

--
Iván Sánchez Ortega
Iván Sánchez Ortega wrote:
>Last time I checked, disk I/O wasn't "relatively short" - it was two
>orders of magnitude longer than memory I/O, as in milliseconds compared
>to nanoseconds.
Sure. But the CPU is doing other things at the time.
>And yeah, OK, we got multiprocessing and DMA, but it doesn't mean that
>any other processes don't need that I/O time as well.
Nope. But there's a lot going on which doesn't require disk I/O. And in
most active (and properly configured) web servers, a good portion of
what is served comes from cache.
With your code it takes many more CPU cycles to accomplish the same
thing, during which time nothing else requiring CPU cycles can be processed.
>Again, I don't think it's a good idea to load the entire file in memory.
It's fine for you to disagree. I don't see a problem when you have a
file which is known not to grow to 100 MB.
--
Jerry Stuckle
On Tue, 22 Apr 2008 07:06:24 -0400, Jerry Stuckle wrote:
>It's fine for you to disagree. I don't see a problem when you have a
>file which is known not to grow to 100 MB.
... Or some other arbitrary size that won't cause your system to go into
swap death. This is hardly a disagreement, btw.
--
"HTML's a cheap whore. Treating her with respect is possible, and even
preferable, because once upon a time she was a beautiful and virginal
format, but you shouldn't expect too much of her at this point." --M"K"H
Peter H. Coffin wrote:
>... Or some other arbitrary size that won't cause your system to go
>into swap death. This is hardly a disagreement, btw.
Sure, it's disagreement. I say it's only a 5MB file, and not likely to
grow to a huge size - go ahead and load it into memory. Iván says read
the file in chunks. Sounds like a disagreement to me :-)
If we were talking hundreds of megabytes, I'd have a different answer.
--
Jerry Stuckle
Jerry Stuckle wrote:
>With your code it takes many more CPU cycles to accomplish the same
>thing, during which time nothing else requiring CPU cycles can be
>processed.
Pardon me?

Complexity of my algorithm (the same as GNU "tail") is O(n), where n is the
number of bytes that make up the desired lines at the end of the file.
Efficiency here depends on strrpos(), which has a complexity of O(n).

Complexity of your algorithm is O(m^2 * log(m)), where m is the total size
of the file. file() must check *every* character read to see if it's a line
break - that takes O(m). Then, you're doing count() - as PHP arrays are
hash tables (well, ordered maps), it is well known that traversing one
takes O(m*log(m)).

What is your basis for saying that parsing the entire file is more
CPU-efficient than parsing the last lines starting from the end? Because
the way I see it, O(n) << O(m^2*log(m)).
Cheers,

--
Iván Sánchez Ortega
Iván Sánchez Ortega wrote:
>[snip: complexity comparison - O(n) for reading from the end vs.
>O(m^2 * log(m)) claimed for the file()/count() approach]
tail is a compiled program. It is much more efficient than an
interpreted one.

And the only searching the program has to do is for the newline
character. Even in an interpreted language, that can be optimized to be
quite a fast operation.

As opposed to multiple calls to seek and read the file, doing your own
searching... much more code to go through and much more CPU intensive.
--
Jerry Stuckle
Jerry Stuckle wrote:
As opposed to multiple calls to seek and read the file, doing your own
searching... Much more code to go through and much more cpu intensive.
Your argument doesn't hold here, Jerry. The longer "seek and read" algorithm
has a complexity of O(n), whereas the "file() - count() - for()" has
O(n^2*log(n)).
It just doesn't hold.
--
Iván Sánchez Ortega
Iván Sánchez Ortega wrote:
>[snip: quoted complexity argument]
You're assuming the path through the code is the same - or at least the
same length in cpu cycles. It isn't - not by a long shot.
Your argument is highly fallacious.
Jerry Stuckle wrote:
Iván Sánchez Ortega wrote:
>The longer "seek and read" algorithm has a complexity of O(n), whereas the "file() - count() - for()" has O(n^2*log(n)).
Your argument is highly fallacious.
Would you please elaborate?
Iván Sánchez Ortega wrote:
Jerry Stuckle wrote:
>Iván Sánchez Ortega wrote:
>>The longer "seek and read" algorithm has a complexity of O(n), whereas the "file() - count() - for()" has O(n^2*log(n)).
Your argument is highly fallacious.
Would you please elaborate?
You assume either way takes the same number of cpu cycles. file() is a
single call to fetch the entire file. Searching for the newline
characters is also very fast, in cpu cycles (it can be highly optimized
in machine code). fopen(), then multiple fseek(), fread() and searching
yourself for the newline characters, followed by fclose() is much more
cpu intensive. This is true in a compiled language also, but in an
interpreted language the difference is even greater.
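The file()-based approach being defended here can be written compactly; a sketch with placeholder names ($path, $n), not code from the thread:

```php
<?php
// Whole-file approach: one file() call reads everything into an array,
// then array_slice() keeps the last $n entries. Simple, but memory use
// grows with the file size plus PHP's per-element array overhead, which
// is a plausible explanation for the OP's silent blank page near 5MB
// (memory_limit exhausted without a visible error).
function tail_lines_simple($path, $n)
{
    $all = file($path, FILE_IGNORE_NEW_LINES);
    if ($all === false) {
        return array();
    }
    return array_slice($all, -$n);
}
?>
```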
On Tue, 22 Apr 2008 21:40:01 -0400, Jerry Stuckle wrote:
[putolin]
Don't even try to compare performance in a compiled language vs. an
interpreted one. It's comparing apples and oranges.
Ever heard of perl?
--
Tayo'y Mga Pinoy
In comp.lang.php, Baho Utot wrote on Wed, 23 Apr 2008 16:45:11 -0400:
Ever heard of perl?
Isn't she a singer?
--
www.krustov.co.uk