can't read large files - help

[PHP]
if (!function_exists('bigfile')) {
    /**
     * Works like file() in PHP except that it will work more efficiently
     * with very large files
     *
     * @access public
     * @param mixed $fullFilePath
     * @return array $lineArray
     * @see actual_path
     */
    function bigfile($fullFilePath) {
        // MULTIPLY YOUR MEMORY TEMPORARILY (10X LARGER SHOULD
        // *DEFINITELY* BE ENOUGH UNTIL END!)
        @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M');
        $lineArray = array();
        $fileID = @fopen(actual_path($fullFilePath), 'r');
        if (!$fileID) {
            return $lineArray; // could not open the file
        }
        while (($buffer = fgets($fileID, 4096)) !== false) {
            $lineArray[] = $buffer;
        }
        fclose($fileID);
        return $lineArray;
    }
}
[/PHP]
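
For reference, it gets called like this (the path here is just an example):

[PHP]
$lineArray = bigfile('/data/somedir/verybigfile.log'); // example path
foreach ($lineArray as $line) {
    // ... do something with each line ...
}
[/PHP]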

I even temporarily increase memory (I know, bad idea, but it's all I can
think of to do). However, requirements stipulate that files smaller than
the (arbitrarily set) max file size be sent via email attachment (whether
it actually gets sent depends, of course, on the SMTP server).

I can't think of any other trick to make this either work or, failing
that, not time out but throw an error/warning instead.

Help!

Thanx
Phil

Jun 14 '06 #1

comp.lang.php wrote:
[quoted text snipped]


What exactly are you trying to achieve? What do you mean by "it
doesn't work"? Some more details will help us suggest a solution...

Jun 15 '06 #2

ZeldorBlat wrote:
[quoted text snipped]

What exactly are you trying to achieve? What do you mean by "it
doesn't work"? Some more details will help us suggest a solution...


At the moment I am able to break large files up into an array by not
using file() but my function above, bigfile(), which temporarily
increases memory, so it seems I solved it after all; I can open and
parse larger files this way, so thanx!

Phil

Jun 15 '06 #3

comp.lang.php wrote:
[quoted text snipped]

At the moment I am able to break large files up into an array by not
using file() but my function above, bigfile(), which temporarily
increases memory, so it seems I solved it after all; I can open and
parse larger files this way, so thanx!

Phil


I ask the question because you're trying to break it up into an array
of lines -- which suggests that you're doing something with the data on
a line-by-line basis. If that's the case, why not read a single line,
do something with it, then read the next line? Then you don't need to
load the whole thing into memory first.

As I said before, though, it all depends on what you're trying to do.
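
Something along these lines, for instance (a rough sketch; process_line()
is just a stand-in for whatever you actually do with each line):

[PHP]
$fileID = fopen(actual_path($fullFilePath), 'r');
if ($fileID) {
    while (($buffer = fgets($fileID, 4096)) !== false) {
        process_line($buffer); // handle one line, then forget it
    }
    fclose($fileID);
}
[/PHP]

Only one line is ever held in memory at a time, so the size of the file
stops mattering.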

Jun 15 '06 #4

ZeldorBlat wrote:
[quoted text snipped]

I ask the question because you're trying to break it up into an array
of lines -- which suggests that you're doing something with the data on
a line-by-line basis. If that's the case, why not read a single line,
do something with it, then read the next line? Then you don't need to
load the whole thing into memory first.

As I said before, though, it all depends on what you're trying to do.


What I am trying to do is to load the file as an attachment to an
auto-generated email.

Phil

Jun 15 '06 #5

comp.lang.php wrote:
[quoted text snipped]

What I am trying to do is to load the file as an attachment to an
auto-generated email.

Phil


So let me make sure I understand this. You're trying to take a file
that's so large that the normal file-handling mechanisms can't deal
with it, then send that massive file as an email attachment?

Jun 16 '06 #6

ZeldorBlat wrote:
[quoted text snipped]

So let me make sure I understand this. You're trying to take a file
that's so large that the normal file-handling mechanisms can't deal
with it, then send that massive file as an email attachment?


No trying involved; I can do it now. I just don't use file() but my own
function, bigfile(), and temporarily increase memory.

Business requirement, plain and simple.
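
For what it's worth, the attachment itself can also be built without ever
holding both the raw file and its encoded copy: read the file in chunks
whose size is a multiple of 57 bytes (57 raw bytes encode to exactly one
76-character base64 line) and encode as you go. A rough sketch -- the MIME
headers/boundary and the mail() call around it are assumed, not shown:

[PHP]
// Base64-encode an attachment chunk by chunk instead of slurping it.
$fileID = fopen(actual_path($fullFilePath), 'rb');
$encoded = '';
while (!feof($fileID)) {
    // 8151 = 57 * 143, so every chunk encodes to whole 76-char lines
    // and the pieces concatenate into one clean base64 body.
    $chunk = fread($fileID, 8151);
    $encoded .= chunk_split(base64_encode($chunk), 76, "\r\n");
}
fclose($fileID);
// $encoded becomes the body of the MIME attachment part; for truly
// huge files you would stream it straight to the SMTP socket instead.
[/PHP]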

Phil

Jun 16 '06 #7
