can't read large files - help

[PHP]
if (!function_exists('bigfile')) {
    /**
     * Works like file() in PHP except that it will work more efficiently
     * with very large files
     *
     * @access public
     * @param string $fullFilePath
     * @return array $lineArray
     * @see actual_path
     */
    function bigfile($fullFilePath) {
        // Multiply the memory limit temporarily (10x larger should
        // *definitely* be enough until the end!). Caveat: (int)'128M'
        // yields 128, but this breaks if memory_limit is -1 (unlimited).
        @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M');
        $lineArray = array(); // initialize so a failed open returns an empty array
        $fileID = @fopen(actual_path($fullFilePath), 'r');
        if ($fileID === false) {
            return $lineArray;
        }
        while (!feof($fileID)) {
            $buffer = fgets($fileID, 4096);
            if ($buffer !== false) { // fgets() returns false at EOF
                $lineArray[] = $buffer;
            }
        }
        @fclose($fileID);
        return $lineArray;
    }
}
[/PHP]
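
For reference, it's called the same way as the built-in file(); a quick hypothetical example (the path is made up):

[PHP]
// Hypothetical usage -- the path is illustrative only.
$lines = bigfile('/path/to/very_large_file.log');
echo count($lines) . " lines read\n";
[/PHP]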

I even temporarily increase memory (I know, bad idea, but it's all I can
think of to do). However, requirements stipulate that files smaller than
the max file size (arbitrarily set) be sent as an email attachment
(whether it actually gets sent depends, of course, on the SMTP server).

I can't think of any other trick to make this either work or, failing
that, not time out but throw an error/warning instead.
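
A minimal sketch of the size gate in front of the attachment step
(MAX_ATTACH_BYTES and attach_and_send() are hypothetical names, standing
in for the arbitrary limit and the mailing code):

[PHP]
// Hypothetical size gate: only attach files under the arbitrary cap.
// MAX_ATTACH_BYTES and attach_and_send() are placeholder names.
define('MAX_ATTACH_BYTES', 2 * 1024 * 1024); // e.g. an arbitrary 2 MB cap

$size = @filesize($fullFilePath);
if ($size !== false && $size < MAX_ATTACH_BYTES) {
    attach_and_send($fullFilePath); // small enough: send as attachment
} else {
    // too large (or unreadable): skip the attachment path entirely
}
[/PHP]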

Help!

Thanx
Phil

Jun 14 '06 #1

comp.lang.php wrote:
> [bigfile() code snipped]
>
> I can't think of any other trick to make this either work or, failing
> that, not time out but throw an error/warning instead.


What exactly are you trying to achieve? What do you mean by "it
doesn't work?" Some more details will help us suggest a solution...

Jun 15 '06 #2

ZeldorBlat wrote:
> comp.lang.php wrote:
> > [original question and bigfile() code snipped]
>
> What exactly are you trying to achieve? What do you mean by "it
> doesn't work?" Some more details will help us suggest a solution...


At the moment I am able to break large files up into an array, not with
file() but with my function above, bigfile(), by increasing memory
temporarily, so it seems I solved it after all; I can open and parse
larger files this way, so thanx!

Phil

Jun 15 '06 #3

comp.lang.php wrote:
> ZeldorBlat wrote:
> > What exactly are you trying to achieve? What do you mean by "it
> > doesn't work?" Some more details will help us suggest a solution...
>
> At the moment I am able to break large files up into an array, not with
> file() but with my function above, bigfile(), by increasing memory
> temporarily, so it seems I solved it after all; I can open and parse
> larger files this way, so thanx!
>
> Phil


I ask the question because you're trying to break it up into an array
of lines -- which suggests that you're doing something with the data on
a line-by-line basis. If that's the case, why not read a single line,
do something with it, then read the next line? Then you don't need to
load the whole thing into memory first.

As I said before, though, it all depends on what you're trying to do.
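
Something along these lines, where process_line() is just a hypothetical
stand-in for whatever you do with each line:

[PHP]
// Sketch of line-by-line processing: only one line is ever in memory.
// process_line() is a hypothetical callback, not a real function.
$fileID = fopen($fullFilePath, 'r');
if ($fileID !== false) {
    while (!feof($fileID)) {
        $buffer = fgets($fileID, 4096);
        if ($buffer !== false) { // fgets() returns false at EOF
            process_line($buffer); // handle the line, then let it go
        }
    }
    fclose($fileID);
}
[/PHP]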

Jun 15 '06 #4

ZeldorBlat wrote:
> comp.lang.php wrote:
> > [earlier discussion snipped]
>
> I ask the question because you're trying to break it up into an array
> of lines -- which suggests that you're doing something with the data on
> a line-by-line basis. If that's the case, why not read a single line,
> do something with it, then read the next line? Then you don't need to
> load the whole thing into memory first.
>
> As I said before, though, it all depends on what you're trying to do.


What I am trying to do is to load the file as an attachment to an
auto-generated email.

Phil

Jun 15 '06 #5

comp.lang.php wrote:
> ZeldorBlat wrote:
> > [earlier discussion snipped]
>
> What I am trying to do is to load the file as an attachment to an
> auto-generated email.
>
> Phil


So let me make sure I understand this. You're trying to take a file
that's so large that the normal file-handling mechanisms can't deal
with it, then send that massive file as an email attachment?
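
If that's the goal, note that you don't need the file in an array at
all; you can base64-encode it in chunks as you read it. A rough sketch
(append_attachment() is a made-up name, $out is any open output handle
such as a temp file, and the MIME framing around it is deliberately
omitted):

[PHP]
// Sketch: stream a file into a base64-encoded MIME body part, writing
// encoded chunks to an already-open output handle so memory stays flat.
// append_attachment() is an illustrative name, not an existing API.
function append_attachment($out, $fullFilePath) {
    $in = fopen($fullFilePath, 'rb');
    if ($in === false) {
        return false;
    }
    while (!feof($in)) {
        // 57 input bytes encode to exactly one 76-character base64 line,
        // so reading multiples of 57 keeps the output lines uniform.
        $chunk = fread($in, 57 * 512);
        if ($chunk !== false && $chunk !== '') {
            fwrite($out, chunk_split(base64_encode($chunk)));
        }
    }
    fclose($in);
    return true;
}
[/PHP]

The encoded part would then sit under the usual Content-Type and
Content-Transfer-Encoding: base64 headers before the message is handed
to the mailer.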

Jun 16 '06 #6

ZeldorBlat wrote:
> comp.lang.php wrote:
> > [earlier discussion snipped]
>
> So let me make sure I understand this. You're trying to take a file
> that's so large that the normal file-handling mechanisms can't deal
> with it, then send that massive file as an email attachment?


No trying involved; I can do it now. I just don't use file() but my own
function, bigfile(), and temporarily increase memory.

Business requirement, plain and simple.

Phil

Jun 16 '06 #7


