
can't read large files - help

[PHP]
if (!function_exists('bigfile')) {
    /**
     * Works like file() in PHP except that it will work more efficiently
     * with very large files
     *
     * @access public
     * @param mixed $fullFilePath
     * @return array $lineArray
     * @see actual_path
     */
    function bigfile($fullFilePath) {
        // MULTIPLY YOUR MEMORY TEMPORARILY (10X LARGER SHOULD *DEFINITELY*
        // BE ENOUGH UNTIL END!)
        @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M');

        $lineArray = array(); // initialise so an empty or unreadable file still returns an array
        $fileID = @fopen(actual_path($fullFilePath), 'r');
        if (!$fileID) {
            return $lineArray; // could not open the file
        }
        while (!@feof($fileID)) {
            $buffer = @fgets($fileID, 4096);
            if ($buffer === false) {
                break; // avoid appending a bogus entry at EOF
            }
            $lineArray[] = $buffer;
        }
        @fclose($fileID);
        return $lineArray;
    }
}
[/PHP]

I even temporarily increase the memory limit (I know, bad idea, but it's all I can
think of to do). However, requirements stipulate that files smaller than the
maximum file size (arbitrarily set) be sent as an email attachment (whether it
actually gets sent depends, of course, on the SMTP server).

I can't think of any other trick to make this either work or, failing that,
throw an error/warning instead of timing out.

Help!

Thanx
Phil
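
(For illustration, a minimal sketch of how bigfile() would be called; the path
is hypothetical, and note that the returned array still holds the entire file
in memory at once:)

[PHP]
// Hypothetical usage of bigfile(); the path is made up for illustration.
$lines = bigfile('/tmp/huge_report.log');
echo count($lines) . " lines read\n";

// Every line now sits in $lines at the same time, so peak memory is still
// roughly the size of the whole file plus per-element overhead.
foreach ($lines as $line) {
    // do something with $line
}
[/PHP]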

Jun 14 '06 #1
6 Replies



comp.lang.php wrote:
[quoted text snipped]


What exactly are you trying to achieve? What do you mean by "it
doesn't work"? Some more details will help us suggest a solution...

Jun 15 '06 #2


ZeldorBlat wrote:
[quoted text snipped]


At the moment I am able to break large files up into an array by not using
file() but by using my function above, bigfile(), and by temporarily
increasing memory, so it seems I solved it after all; I can open and parse
larger files this way, so thanx!

Phil

Jun 15 '06 #3


comp.lang.php wrote:
[quoted text snipped]


I ask the question because you're trying to break it up into an array
of lines -- which suggests that you're doing something with the data on
a line-by-line basis. If that's the case, why not read a single line,
do something with it, then read the next line? Then you don't need to
load the whole thing into memory first.
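
(A minimal sketch of that line-by-line approach; the path and the
process_line() callback are placeholders for whatever the real script does:)

[PHP]
// Read and handle one line at a time instead of building a huge array.
// '/path/to/large/file.txt' and process_line() are placeholders.
$fh = fopen('/path/to/large/file.txt', 'r');
if ($fh) {
    while (!feof($fh)) {
        $line = fgets($fh, 4096);
        if ($line === false) {
            break;
        }
        process_line($line); // handle the line, then let it go out of scope
    }
    fclose($fh);
}
[/PHP]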

As I said before, though, it all depends on what you're trying to do.

Jun 15 '06 #4


ZeldorBlat wrote:
[quoted text snipped]


What I am trying to do is to load the file as an attachment to an
auto-generated email.

Phil

Jun 15 '06 #5


comp.lang.php wrote:
[quoted text snipped]


So let me make sure I understand this. You're trying to take a file
that's so large that the normal file handling mechanisms can't deal
with it, then send that massive file as an email attachment?

Jun 16 '06 #6


ZeldorBlat wrote:
[quoted text snipped]


No trying involved; I can do it now. I just don't use file() but my own
function, bigfile(), and temporarily increase memory.

Business requirement, plain and simple.

Phil
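
(For what it's worth, the attachment itself can also be built without ever
holding the whole file in memory, by base64-encoding it in chunks; a minimal
sketch, with the input path and output target as placeholders:)

[PHP]
// Stream a file into a base64-encoded MIME body part chunk by chunk.
// 8208 bytes = 57 * 144, so each chunk encodes to whole 76-character
// base64 lines and the pieces concatenate cleanly.
// Both paths below are placeholders.
$in  = fopen('/path/to/large/file.dat', 'rb');
$out = fopen('/tmp/attachment_part.txt', 'wb'); // or append to the mail body

while (!feof($in)) {
    $chunk = fread($in, 8208);
    if ($chunk === false || $chunk === '') {
        break;
    }
    fwrite($out, chunk_split(base64_encode($chunk), 76, "\r\n"));
}

fclose($in);
fclose($out);
[/PHP]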

Jun 16 '06 #7
